Growing Artificial Neural Networks
| Published in | Advances in Artificial Intelligence and Applied Cognitive Computing, pp. 409 - 423 |
|---|---|
| Main Authors | , |
| Format | Book Chapter |
| Language | English |
| Published | Cham: Springer International Publishing, 2021 |
| Series | Transactions on Computational Science and Computational Intelligence |
| ISBN | 9783030702953; 3030702952 |
| ISSN | 2569-7072; 2569-7080 |
| DOI | 10.1007/978-3-030-70296-0_31 |
| Summary: | Pruning is a legitimate method for reducing the size of a neural network to fit in low SWaP hardware, but the networks must be trained and pruned offline. We propose an algorithm, Artificial Neurogenesis (ANG), that grows rather than prunes the network and enables neural networks to be trained and executed in low SWaP embedded hardware. ANG accomplishes this by using the training data to determine critical connections between layers before the actual training takes place. Our experiments use a modified LeNet-5 as a baseline neural network that achieves a test accuracy of 98.74% using a total of 61,160 weights. An ANG grown network achieves a test accuracy of 98.80% with only 21,211 weights. |
|---|---|
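The summary describes growing a sparse network by using the training data to select critical connections before training, rather than training a dense network and pruning it afterwards. A minimal sketch of that idea is below; the data, the layer sizes, and the use of feature-target correlation as the "critical connection" score are all illustrative assumptions, not the paper's actual criterion or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the training set (the paper uses a modified
# LeNet-5 on image data; this synthetic problem is only an illustration).
X = rng.normal(size=(500, 64))                 # 500 samples, 64 input features
y = (X[:, :8].sum(axis=1) > 0).astype(float)   # synthetic binary target

n_hidden = 16

# Score each candidate input feature by the absolute correlation of that
# feature with the target. This stands in for the paper's pre-training
# "critical connection" measurement (an assumption, not their metric).
corr = np.abs(np.array(
    [np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])]
))

# Grow: connect each hidden unit only to the top-k highest-scoring inputs,
# instead of instantiating a dense layer and pruning it after training.
k = 8
top = np.argsort(corr)[-k:]
mask = np.zeros((X.shape[1], n_hidden))
mask[top, :] = 1.0

# Sparse weight matrix: only the grown connections get trainable weights.
W = rng.normal(scale=0.1, size=mask.shape) * mask
print("dense weights:", mask.size, "grown weights:", int(mask.sum()))
```

Training would then update only the masked-in weights, which mirrors the reported effect in the summary: comparable accuracy with roughly a third of the dense weight count.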