Optimizing Weights in Elman Recurrent Neural Networks with Wolf Search Algorithm

Bibliographic Details
Published in: Advances in Intelligent Systems and Computing, Vol. 549, pp. 11–20
Main Authors: Nawi, Nazri Mohd; Rehman, M. Z.; Hamid, Norhamreeza Abdul; Khan, Abdullah; Naseem, Rashid; Uddin, Jamal
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2016
Series: Advances in Intelligent Systems and Computing
ISBN: 9783319512792; 331951279X
ISSN: 2194-5357; 2194-5365
DOI: 10.1007/978-3-319-51281-5_2

More Information
Summary: This paper presents a Metahybrid algorithm that combines Wolf Search (WS) with the Elman Recurrent Neural Network (ERNN). ERNN is one of the most efficient neural network learning algorithms, but because it relies on the gradient descent technique during training, it is prone to local minima and slow convergence. This paper uses a new metaheuristic search algorithm, Wolf Search (WS), based on the wolf's predatory behaviour, to train the weights in ERNN, achieving faster convergence and avoiding local minima. The performance of the proposed Metahybrid Wolf Search Elman Recurrent Neural Network (WRNN) is compared with the Bat with Back Propagation (Bat-BP) algorithm and other hybrid variants on benchmark classification datasets. The simulation results show that the proposed Metahybrid WRNN algorithm outperforms the other algorithms in terms of CPU time, accuracy and MSE.
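
The abstract only outlines the method, but its core idea — replacing gradient descent with a population-based Wolf Search over the flattened ERNN weight vector — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the network size, the WS parameters (visual range, escape probability, step sizes), and the toy sine-prediction task are all assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

H = 3                                # hidden units (assumption, not the paper's setting)
N_W = H + H * H + H + H + 1          # W_in, W_rec, b_h, W_out, b_out -> 19 weights

def unpack(w):
    """Split a flat weight vector into Elman RNN parameter arrays."""
    i = 0
    W_in = w[i:i + H].reshape(H, 1); i += H
    W_rec = w[i:i + H * H].reshape(H, H); i += H * H
    b_h = w[i:i + H]; i += H
    W_out = w[i:i + H]; i += H
    b_out = w[i]
    return W_in, W_rec, b_h, W_out, b_out

def fitness(w, xs, ys):
    """MSE of the Elman network over one pass of the sequence (lower is better)."""
    W_in, W_rec, b_h, W_out, b_out = unpack(w)
    h = np.zeros(H)                  # Elman context layer, carried across timesteps
    err = 0.0
    for x, y in zip(xs, ys):
        h = np.tanh(W_in[:, 0] * x + W_rec @ h + b_h)
        err += (W_out @ h + b_out - y) ** 2
    return err / len(xs)

def wolf_search(fit, dim, n_wolves=12, visual=1.0, escape_p=0.25, steps=200):
    """Simplified Wolf Search: each wolf moves toward a better wolf within its
    visual range, otherwise takes a small Brownian step; with probability
    escape_p it makes a long escape jump to avoid getting trapped in local minima."""
    wolves = rng.uniform(-1, 1, (n_wolves, dim))
    fits = np.array([fit(w) for w in wolves])
    for _ in range(steps):
        for i in range(n_wolves):
            d = np.linalg.norm(wolves - wolves[i], axis=1)
            better = (d < visual) & (fits < fits[i])
            if better.any():         # prey on the best wolf in visual range
                j = int(np.argmin(np.where(better, fits, np.inf)))
                cand = wolves[i] + rng.uniform(0, 1) * (wolves[j] - wolves[i])
            else:                    # no better neighbour: Brownian local search
                cand = wolves[i] + rng.normal(0.0, 0.1, dim)
            if rng.random() < escape_p:
                cand = wolves[i] + rng.uniform(-2 * visual, 2 * visual, dim)
            f = fit(cand)
            if f < fits[i]:          # greedy acceptance keeps fitness monotone
                wolves[i], fits[i] = cand, f
    best = int(np.argmin(fits))
    return wolves[best], fits[best]

# Toy task (assumed): one-step-ahead prediction of a sine sequence.
t = np.arange(20)
xs, ys = np.sin(0.5 * t), np.sin(0.5 * (t + 1))
best_w, best_f = wolf_search(lambda w: fitness(w, xs, ys), N_W)
```

Because candidate weights are only accepted when they lower the MSE, the search never regresses, and the escape jumps give it a chance to leave the basins where a pure gradient method would stall — which is the motivation the abstract gives for hybridizing WS with ERNN.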