WEIGHTS OPTIMIZATION OF NEURAL NETWORK VIA IMPROVED BCO APPROACH

Bibliographic Details
Published in: Electromagnetic waves (Cambridge, Mass.), Vol. 83, pp. 185-198
Main Authors: Zhang, Yu-Dong; Wu, Lenan
Format: Journal Article
Language: English
Published: Cambridge: Electromagnetics Academy, 2008
ISSN: 1070-4698, 1559-8985
DOI: 10.2528/PIER08051403


More Information
Summary: The feed-forward neural network (FNN) has been widely applied in many fields because of its ability to approximate an unknown function to any desired degree of accuracy. Back propagation (BP) is the most widely used learning algorithm, but it is subject to convergence to local optima and to poor out-of-sample forecasting performance even on simple problems. We therefore propose an improved Bacterial Chemotaxis Optimization (BCO) approach as a possible alternative to the problematic BP algorithm, along with a novel adaptive search strategy that improves the efficiency of traditional BCO. Comparisons were carried out on the classical XOR problem and on sinc-function approximation. The results demonstrate that our algorithm is clearly superior in convergence rate and precision to other training algorithms such as the Genetic Algorithm (GA) and Taboo Search (TS).
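The record contains no code, so as an illustrative sketch only (not the authors' implementation): a bacterial-chemotaxis-style optimizer trains a network by treating its flattened weights as a position in search space, "running" while a random direction keeps reducing the error and "tumbling" to a fresh direction when it stops helping. Below, a tiny 2-2-1 network is fitted to the XOR problem mentioned in the abstract; the shrinking step size is a hypothetical stand-in for the paper's adaptive search strategy, not the published scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training set for a 2-2-1 feed-forward network (9 weights in total).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, x):
    # Unpack the flat weight vector: 2x2 hidden weights, 2 hidden biases,
    # 2 output weights, 1 output bias.
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(x @ W1 + b1)                      # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def mse(w):
    return float(np.mean((forward(w, X) - y) ** 2))

def bco_train(steps=20000, step0=0.5):
    # Chemotaxis-style search: keep moving along a direction while the error
    # improves (run); otherwise draw a new random direction (tumble).
    # The step size decays linearly over time (assumed adaptive strategy).
    w = rng.normal(0.0, 1.0, 9)
    err = mse(w)
    direction = rng.normal(0.0, 1.0, 9)
    for t in range(steps):
        step = step0 * (1.0 - t / steps) + 1e-3
        cand = w + step * direction / np.linalg.norm(direction)
        cand_err = mse(cand)
        if cand_err < err:                        # run: direction still helps
            w, err = cand, cand_err
        else:                                     # tumble: pick a new direction
            direction = rng.normal(0.0, 1.0, 9)
    return w, err

w, err = bco_train()
print("final MSE:", err)
print("outputs:", forward(w, X).round(2))
```

Because the search is derivative-free, the same loop works for any error surface where gradient-based BP would stall; the trade-off is that many more function evaluations are needed per unit of progress.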