Constrained State-Preserved Extreme Learning Machine

Document Type

Article

Publication Date

11-1-2019

Abstract

Reducing the training time of neural networks is a primary focus of machine learning research. The Levenberg-Marquardt (LM) method is currently one of the fastest backpropagation methods. A popular recent alternative to LM backpropagation is the Extreme Learning Machine (ELM), which computes a closed-form optimization of a Single Layer Feed Forward Network for an initially randomized input weight matrix. In this study, we extend the performance of the ELM by building incrementally on the state of the art, the State-Preserved ELM (SPELM), to produce a Constrained SPELM (CSPELM). To do so, we introduce a constraint that randomly perturbs the input weight matrix after each training cycle, providing a honing mechanism in the search for a better local optimum. We evaluated CSPELM against 13 benchmark datasets, both categorical and continuous. With respect to average accuracy and RMSE, CSPELM was the best-performing model on 8 of the 13 benchmark datasets, outperforming the ELM, SPELM, and LM methods, and its total training time peaked at only 195.10 seconds in one example. Overall, CSPELM achieves more consistent and higher accuracy than the ELM and SPELM, and competitive or better results than LM, while requiring only approximately 10% of traditional LM backpropagation training time.
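The abstract describes two mechanisms: the standard ELM closed-form solve (random input weights, least-squares output weights via the Moore-Penrose pseudoinverse) and a perturb-and-keep-best loop over the input weight matrix. The sketch below illustrates both under stated assumptions; the function names, the tanh activation, the perturbation scale `sigma`, and the keep-if-improved acceptance rule are all hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def elm_fit(X, T, n_hidden, rng):
    """Standard ELM: random input weights, closed-form least-squares output weights."""
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weight matrix
    b = rng.standard_normal(n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                            # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                      # Moore-Penrose least-squares solve
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def cspelm_fit(X, T, n_hidden, cycles=20, sigma=0.1, seed=0):
    """Hypothetical sketch of the CSPELM idea from the abstract: after each
    training cycle, randomly perturb the input weight matrix and keep the
    perturbed state only if it improves the fit (state-preserving search)."""
    rng = np.random.default_rng(seed)
    W, b, beta = elm_fit(X, T, n_hidden, rng)
    best = (W, b, beta)
    best_err = np.mean((elm_predict(X, *best) - T) ** 2)
    for _ in range(cycles):
        Wp = best[0] + sigma * rng.standard_normal(best[0].shape)  # random perturbation
        H = np.tanh(X @ Wp + best[1])
        betap = np.linalg.pinv(H) @ T                 # re-solve output weights in closed form
        err = np.mean((H @ betap - T) ** 2)
        if err < best_err:                            # preserve state only on improvement
            best, best_err = (Wp, best[1], betap), err
    return best
```

Each cycle remains a closed-form solve, which is why this family of methods avoids the iterative gradient updates that dominate LM backpropagation's training time.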

DOI

10.1109/ICTAI.2019.00109
