Growing Artificial Neural Networks

Bibliographic Details
Published in: Advances in Artificial Intelligence and Applied Cognitive Computing, pp. 409-423
Main Authors: Mixter, John; Akoglu, Ali
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing, 2021
Series: Transactions on Computational Science and Computational Intelligence
ISBN: 9783030702953; 3030702952
ISSN: 2569-7072; 2569-7080
DOI: 10.1007/978-3-030-70296-0_31


More Information
Summary: Pruning is a legitimate method for reducing the size of a neural network to fit on low-SWaP (size, weight, and power) hardware, but the networks must be trained and pruned offline. We propose an algorithm, Artificial Neurogenesis (ANG), that grows rather than prunes the network and enables neural networks to be trained and executed on low-SWaP embedded hardware. ANG accomplishes this by using the training data to determine critical connections between layers before the actual training takes place. Our experiments use a modified LeNet-5 as the baseline neural network, which achieves a test accuracy of 98.74% with a total of 61,160 weights. An ANG-grown network achieves a test accuracy of 98.80% with only 21,211 weights.
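The abstract describes using training data to pick critical inter-layer connections before training begins. As an illustrative sketch only (the chapter's actual selection criterion is not given in the abstract), one simple pre-training heuristic is to rank every candidate input-to-output connection by the absolute correlation between its input feature and its target signal, and keep only the top-k; the function name and the correlation criterion below are assumptions, not the authors' ANG algorithm:

```python
import numpy as np

def select_critical_connections(x, y, k):
    """Hypothetical pre-training connection selection: score each
    candidate (input, output) connection by the absolute correlation
    between input feature and target signal, and keep only the top-k.
    This stands in for ANG's criterion, which the abstract does not
    specify."""
    # Standardize columns so the cross-product gives correlations.
    xs = (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)
    ys = (y - y.mean(axis=0)) / (y.std(axis=0) + 1e-8)
    corr = np.abs(xs.T @ ys) / len(x)          # (n_in, n_out) scores
    top = np.argsort(corr, axis=None)[::-1][:k]  # top-k flat indices
    mask = np.zeros_like(corr, dtype=bool)
    mask[np.unravel_index(top, corr.shape)] = True
    return mask                                 # sparse connectivity mask

# Toy usage: 100 samples, 8 inputs, 3 outputs; keep 6 of 24 connections.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 8))
y = x[:, :3] + 0.1 * rng.normal(size=(100, 3))  # targets track inputs 0-2
mask = select_critical_connections(x, y, k=6)
print(mask.sum())  # 6 connections kept out of 24
```

The returned mask could then seed a sparse layer that is trained as usual, so the network is grown from strong connections rather than pruned down from a dense one.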