Pruning Convolutional Neural Network (SqueezeNet) Using Taylor Expansion-Based Criterion
| Published in | 2018 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), pp. 1-5 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.12.2018 |
| DOI | 10.1109/ISSPIT.2018.8705095 |
| Summary: | Recent research in the field of deep learning focuses on reducing the model size of the Convolutional Neural Network (CNN) through various compression techniques such as pruning, quantization, and encoding (e.g., Huffman encoding). This paper proposes a way to prune the CNN based on a Taylor expansion of the change in the model's cost function, ΔC. The proposed algorithm uses greedy, criterion-based pruning with fine-tuning by backpropagation on the SqueezeNet architecture. Transfer learning is used to train SqueezeNet on the CIFAR-10 dataset. The proposed algorithm achieves a 70% model-size reduction on the SqueezeNet architecture with only a 1% drop in accuracy. |
|---|---|
| DOI: | 10.1109/ISSPIT.2018.8705095 |
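The Taylor-expansion criterion named in the summary can be sketched as follows. This is a minimal illustrative NumPy implementation, not the paper's code: it assumes the common first-order form of the criterion, where the change in cost ΔC from zeroing an activation is approximated by |∂C/∂a · a|, averaged per filter, and the lowest-scoring filters are greedily selected for pruning. The function names and the per-layer L2 normalization are assumptions for this sketch.

```python
import numpy as np

def taylor_criterion(activations, gradients):
    """First-order Taylor estimate of |ΔC| for zeroing each filter.

    activations, gradients: arrays of shape (batch, channels, H, W),
    i.e. a layer's feature maps and the gradient of the cost w.r.t. them.
    Returns one non-negative score per channel (filter).
    """
    # |dC/da * a| averaged over batch and spatial dimensions
    scores = np.abs(activations * gradients).mean(axis=(0, 2, 3))
    # L2-normalize per layer so scores are comparable across layers
    # (a common convention; assumed here, not taken from the paper)
    return scores / (np.linalg.norm(scores) + 1e-8)

def select_filters_to_prune(scores, fraction=0.7):
    """Greedy selection: indices of the lowest-scoring fraction of filters."""
    k = int(len(scores) * fraction)
    return np.argsort(scores)[:k]
```

In a full pruning loop, these scores would be recomputed after each pruning step, followed by fine-tuning with backpropagation, matching the greedy prune-then-fine-tune procedure the summary describes.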