Learning of a single-hidden layer feedforward neural network using an optimized extreme learning machine
This paper proposes a learning framework for single-hidden layer feedforward neural networks (SLFN) called optimized extreme learning machine (O-ELM). In O-ELM, the structure and the parameters of the SLFN are determined using an optimization method. The output weights, like in the batch ELM, are obtained by a least squares algorithm, but using Tikhonov's regularization in order to improve the SLFN performance in the presence of noisy data.
| Published in | Neurocomputing (Amsterdam) Vol. 129; pp. 428 - 436 |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | Amsterdam: Elsevier B.V., 10.04.2014 |
| ISSN | 0925-2312 1872-8286 |
| DOI | 10.1016/j.neucom.2013.09.016 |
| Summary: | This paper proposes a learning framework for single-hidden layer feedforward neural networks (SLFN) called optimized extreme learning machine (O-ELM). In O-ELM, the structure and the parameters of the SLFN are determined using an optimization method. The output weights, like in the batch ELM, are obtained by a least squares algorithm, but using Tikhonov's regularization in order to improve the SLFN performance in the presence of noisy data. The optimization method is used to select the set of input variables, the hidden-layer configuration and bias, the input weights, and Tikhonov's regularization factor. The proposed framework has been tested with three optimization methods (genetic algorithms, simulated annealing, and differential evolution) over 16 benchmark problems available in public repositories. |
|---|---|
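The core computation the summary describes — random hidden-layer parameters with output weights obtained by Tikhonov-regularized least squares — can be sketched as follows. This is a minimal illustration of the batch ELM step only (not the O-ELM optimization loop); the function names, tanh activation, and default values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, lam=1e-3, seed=0):
    """Batch ELM with Tikhonov regularization (illustrative sketch).

    Input weights W and biases b are drawn at random; only the output
    weights beta are learned, by regularized least squares.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    # Tikhonov-regularized least squares:
    #   beta = (H^T H + lam * I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass of the fitted SLFN."""
    return np.tanh(X @ W + b) @ beta
```

In O-ELM, an outer optimizer (genetic algorithm, simulated annealing, or differential evolution, per the abstract) would search over the input-variable subset, hidden-layer configuration, input weights, and the regularization factor `lam`, while this regularized least-squares step computes the output weights at each candidate.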