Efficient convolutional neural networks on Raspberry Pi for image classification
| Published in | Journal of Real-Time Image Processing, Vol. 20, No. 2, p. 21 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Berlin/Heidelberg: Springer Berlin Heidelberg, 01.04.2023 (Springer Nature B.V) |
| ISSN | 1861-8200, 1861-8219 |
| DOI | 10.1007/s11554-023-01271-1 |
Summary: | Given the strong performance of deep learning in computer vision (CV), convolutional neural network (CNN) architectures have become the main backbones of image recognition tasks. With the widespread use of mobile devices, neural network models for platforms with low computing power are gradually receiving attention. However, due to limited computing power, deep learning algorithms are usually not practical on mobile devices. This paper proposes TripleNet, a lightweight convolutional neural network that runs easily on a Raspberry Pi. Adapting the concept of block connections from ThreshNet, the newly proposed model is compressed and accelerated: it reduces the number of network parameters and shortens the per-image inference time while maintaining accuracy. The proposed TripleNet and other state-of-the-art (SOTA) neural networks are evaluated on image classification with the CIFAR-10 and SVHN datasets on a Raspberry Pi. The experimental results show that, compared with GhostNet, MobileNet, ThreshNet, EfficientNet, and HarDNet, the per-image inference time of TripleNet is shorter by 15%, 16%, 17%, 24%, and 30%, respectively. The code for this work is available at https://github.com/RuiyangJu/TripleNet. |
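The headline comparison in the summary is average inference time per image on a Raspberry Pi. A minimal sketch of how such a per-image latency measurement is typically taken is shown below; the helper name `average_inference_ms`, the warm-up count, and the stand-in model are assumptions for illustration, not the paper's actual benchmarking code (which lives in the linked repository).

```python
import time

def average_inference_ms(model_fn, images, warmup=5):
    """Average per-image latency in milliseconds for a callable model."""
    # Warm-up passes stabilize caches and dynamic clock frequencies,
    # which matters on small boards like the Raspberry Pi.
    for img in images[:warmup]:
        model_fn(img)
    start = time.perf_counter()
    for img in images:
        model_fn(img)
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / len(images)

# Stand-in "model": in practice this would be a forward pass of
# TripleNet (or GhostNet, MobileNet, etc.) on one input image.
dummy_model = lambda x: sum(x)
latency = average_inference_ms(dummy_model, [[0.0] * 32] * 100)
```

Reported percentage reductions then follow directly, e.g. a 15% shorter time per image means TripleNet's average latency is 0.85 times that of the baseline.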