Avoiding Local Minima in Feedforward Neural Networks by Simultaneous Learning
| Published in | AI 2007: Advances in Artificial Intelligence, Vol. 4830, pp. 100-109 |
|---|---|
| Main Authors | , |
| Format | Book Chapter |
| Language | English |
| Published | Springer Berlin / Heidelberg, Germany, 2007 |
| Series | Lecture Notes in Computer Science |
| Subjects | |
| ISBN | 9783540769262 3540769269 |
| ISSN | 0302-9743 1611-3349 |
| DOI | 10.1007/978-3-540-76928-6_12 |
| Summary: | Feedforward neural networks are particularly useful in learning a training dataset without prior knowledge. However, weight adjusting with a gradient descent may result in the local minimum problem. Repeated training with random starting weights is among the popular methods to avoid this problem, but it requires extensive computational time. This paper proposes a simultaneous training method with removal criteria to eliminate less promising neural networks, which can decrease the probability of achieving a local minimum while efficiently utilizing resources. The experimental results demonstrate the effectiveness and efficiency of the proposed training method in comparison with conventional training. |
|---|---|