Adaptive hybrid optimization for backpropagation neural networks in image classification

Bibliographic Details
Published in: Proceedings of the Nigerian Society of Physical Sciences, Vol. 2, No. 1, p. 150
Main Authors: Essang, Samuel; Okeke, Stephen; Ante Effiong, Jackson; Francis, Runyi; Fadugba, Sunday E.; Ogbaji Otobi, Augustine; Auta, Jonathan T.; Chukwuka, Chikwe F.; Ogar-Abang, Michael O.; Moses, Aigberemhon
Format: Journal Article
Language: English
Published: FLAYOO PUBLISHING HOUSE LIMITED, 11.03.2025
ISSN: 1115-5876
DOI: 10.61298/pnspsc.2025.2.150

More Information
Summary: Image classification is essential in artificial intelligence, with applications in medical diagnostics, autonomous navigation, and industrial automation. Traditional training methods like stochastic gradient descent (SGD) often suffer from slow convergence and local minima. This research presents a hybrid Particle Swarm Optimization (PSO)-Genetic Algorithm (GA)-Backpropagation framework to enhance neural network training. By integrating AdaGrad and PSO for weight optimization, GA for refinement, and backpropagation for fine-tuning, the model improves performance. Results show a 97.5% accuracy on MNIST, a 5% improvement over Adam, and 40% faster convergence than SGD. This approach enhances efficiency, accuracy, and generalization, making it valuable for high-dimensional AI tasks.
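
The abstract describes a three-stage pipeline: a PSO search over the network's weights, GA refinement of the best candidates, and backpropagation for final fine-tuning. The following is a minimal sketch of that kind of hybrid loop, assuming a toy one-hidden-layer network on synthetic data; all names, sizes, and hyperparameters below are illustrative assumptions, not the paper's implementation, and the AdaGrad coupling it mentions is omitted.

```python
# Hypothetical sketch of a PSO -> GA -> backpropagation training loop on a tiny
# one-hidden-layer network with synthetic binary data. Not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                     # toy inputs
y = (X[:, 0] + X[:, 1] > 0).astype(float)         # toy binary labels

H = 8                                             # hidden units (assumed)
DIM = 4 * H + H + H + 1                           # total weights and biases

def unpack(w):
    """Split a flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + 4 * H].reshape(4, H); i += 4 * H
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H]; i += H
    return W1, b1, W2, w[i]

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output

def loss(w):                                       # binary cross-entropy
    p = forward(w, X)
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

# --- Stage 1: PSO over flat weight vectors ---------------------------------
n_particles, iters = 30, 40
pos = rng.normal(scale=0.5, size=(n_particles, DIM))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([loss(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

# --- Stage 2: GA refinement of the best particles --------------------------
elite = pbest[np.argsort(pbest_f)[:10]]
for _ in range(20):
    parents = elite[rng.integers(0, len(elite), size=(20, 2))]
    mask = rng.random((20, DIM)) < 0.5                        # uniform crossover
    children = np.where(mask, parents[:, 0], parents[:, 1])
    children += rng.normal(scale=0.05, size=children.shape)   # mutation
    pool = np.vstack([elite, children])
    elite = pool[np.argsort([loss(p) for p in pool])[:10]]
w = elite[0].copy()

# --- Stage 3: backpropagation fine-tuning (plain gradient descent) ---------
for _ in range(200):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    d_out = (p - y) / len(y)                       # dLoss/dLogit for sigmoid+BCE
    gW2, gb2 = h.T @ d_out, d_out.sum()
    d_h = np.outer(d_out, W2) * (1 - h ** 2)       # backprop through tanh layer
    gW1, gb1 = X.T @ d_h, d_h.sum(axis=0)
    w -= 0.5 * np.concatenate([gW1.ravel(), gb1, gW2, [gb2]])

print(f"final loss: {loss(w):.4f}")
```

In this sketch the PSO and GA stages operate on the same flat parameter vector and loss function that the backpropagation stage uses, which is the general shape of the hybrid the abstract describes; reproducing the reported MNIST results would require the actual dataset, a larger network, and the AdaGrad-augmented weight updates the paper specifies.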