Avoiding Local Minima in Feedforward Neural Networks by Simultaneous Learning

Bibliographic Details
Published in AI 2007: Advances in Artificial Intelligence, Vol. 4830, pp. 100-109
Main Authors Atakulreka, Akarachai; Sutivong, Daricha
Format Book Chapter
Language English
Published Springer Berlin Heidelberg, Germany, 2007
Series Lecture Notes in Computer Science
ISBN 9783540769262; 3540769269
ISSN 0302-9743; 1611-3349
DOI 10.1007/978-3-540-76928-6_12

Summary: Feedforward neural networks are particularly useful for learning a training dataset without prior knowledge. However, adjusting the weights by gradient descent can leave the network trapped in a local minimum. Repeated training from random starting weights is a popular way to avoid this problem, but it demands extensive computation time. This paper proposes a simultaneous training method with removal criteria to eliminate less promising networks, which lowers the probability of ending in a local minimum while using computational resources efficiently. The experimental results demonstrate the effectiveness and efficiency of the proposed training method in comparison with conventional training.
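
The idea described in the summary maps naturally to a small amount of code. The following is a minimal sketch in Python/NumPy, assuming a toy XOR task, a pool of eight randomly initialized networks, and a simple rule that drops the worst-loss candidate at fixed checkpoints; the record gives only the abstract, so the pool size, checkpoint interval, and removal criterion are illustrative assumptions rather than the paper's actual settings.

import numpy as np

# A minimal sketch, assuming a plain NumPy MLP and a toy XOR task.
# The removal criterion (drop the worst-loss candidate at fixed
# checkpoints) is an illustrative assumption, not the paper's rule.

rng = np.random.default_rng(0)

# XOR: a small problem where gradient descent on a tiny MLP can stall
# in a local minimum, depending on the random starting weights.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_net(hidden=3):
    """One candidate: a 2-hidden-1 MLP with random starting weights."""
    return {"W1": rng.normal(0.0, 1.0, (2, hidden)), "b1": np.zeros(hidden),
            "W2": rng.normal(0.0, 1.0, (hidden, 1)), "b2": np.zeros(1)}

def loss(net):
    """Mean squared error of the candidate on the full dataset."""
    h = sigmoid(X @ net["W1"] + net["b1"])
    out = sigmoid(h @ net["W2"] + net["b2"])
    return float(np.mean((out - y) ** 2))

def train_step(net, lr=0.5):
    """One batch gradient-descent step on mean squared error."""
    h = sigmoid(X @ net["W1"] + net["b1"])
    out = sigmoid(h @ net["W2"] + net["b2"])
    d_out = 2.0 * (out - y) / len(X) * out * (1.0 - out)  # dL/dz2
    d_h = (d_out @ net["W2"].T) * h * (1.0 - h)           # dL/dz1
    net["W2"] -= lr * h.T @ d_out
    net["b2"] -= lr * d_out.sum(axis=0)
    net["W1"] -= lr * X.T @ d_h
    net["b1"] -= lr * d_h.sum(axis=0)

# Simultaneous learning: several random restarts trained in lockstep,
# pruning the least promising survivor at every checkpoint so the
# remaining budget concentrates on the better-placed candidates.
nets = [init_net() for _ in range(8)]
for epoch in range(1, 3501):
    for net in nets:
        train_step(net)
    if epoch % 500 == 0 and len(nets) > 1:
        nets.pop(int(np.argmax([loss(n) for n in nets])))

print("surviving network, final MSE: %.4f" % loss(nets[0]))

Training the candidates in lockstep, rather than one restart after another, lets a run that is heading toward a poor region of weight space be abandoned early, which is the resource saving the abstract describes.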