Self-Knowledge Distillation Method Using Adaptive Mixed Attention for Image Classification

Bibliographic Details
Published in: 2024 20th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD), pp. 1 - 5
Main Authors: Zhou, Yiping; Qiu, Zhen; Xiang, Hui; Tao, Jun; Liang, Chong; Guo, Ting
Format: Conference Proceeding
Language: English
Published: IEEE, 27.07.2024
DOI: 10.1109/ICNC-FSKD64080.2024.10702197

Summary: Model compression using self-knowledge distillation methods has achieved remarkable performance in tasks such as image classification and object detection. However, most current self-knowledge distillation approaches focus on the foreground of the image and overlook background knowledge, while their attention mechanisms may also attend to the wrong regions, omitting vital subject knowledge. This paper addresses these issues by proposing a self-knowledge distillation method based on an adaptive mixed attention mechanism. The mechanism adjusts attention according to the feature maps of the image, allowing both foreground and background knowledge to be extracted in a principled way. This improves the accuracy and stability of knowledge extraction, thereby boosting the precision of image classification. Experimental results show that, with a ResNet34 network, the method improves accuracy by 2.48% on the CIFAR100 dataset and 1.21% on the Tiny-ImageNet dataset. For fine-grained visual recognition tasks, the same network improves accuracy by 3.69% on the CUB200 dataset and 1.43% on the MIT67 dataset, respectively.
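
Since this record only summarizes the method, the sketch below is an illustrative assumption rather than the authors' implementation: a PyTorch-style module that mixes channel and spatial attention with a gate predicted from the feature map itself (the "adaptive mixed" idea described in the abstract), plus one common self-distillation loss that distills from the attention-refined branch into the plain branch. The names AdaptiveMixedAttention and self_distillation_loss, the gating scheme, and the loss weighting are hypothetical.

    # Minimal sketch, assuming a CBAM-like mix of channel and spatial
    # attention with a per-sample gate; not the paper's exact design.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class AdaptiveMixedAttention(nn.Module):
        """Mixes channel and spatial attention with a gate predicted
        from the feature map itself (hypothetical design)."""

        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            # Channel attention (squeeze-and-excitation style).
            self.channel_mlp = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
            )
            # Spatial attention from pooled channel statistics.
            self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)
            # Gate that decides, per sample, how to mix the two maps, so
            # foreground- and background-dominated images are weighted differently.
            self.gate = nn.Sequential(nn.Linear(channels, 2), nn.Softmax(dim=-1))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, _, _ = x.shape
            pooled = x.mean(dim=(2, 3))                               # (B, C)

            # Channel attention map, broadcast over spatial dimensions.
            ch_att = torch.sigmoid(self.channel_mlp(pooled)).view(b, c, 1, 1)

            # Spatial attention map from mean- and max-pooled channels.
            sp_in = torch.cat([x.mean(1, keepdim=True),
                               x.max(1, keepdim=True).values], dim=1)
            sp_att = torch.sigmoid(self.spatial_conv(sp_in))          # (B, 1, H, W)

            # Adaptive mixing weights predicted from the feature map.
            w = self.gate(pooled)                                     # (B, 2)
            mixed = (w[:, 0].view(b, 1, 1, 1) * ch_att
                     + w[:, 1].view(b, 1, 1, 1) * sp_att)
            return x * mixed


    def self_distillation_loss(student_logits, refined_logits, labels,
                               T: float = 4.0, alpha: float = 0.5):
        """Cross-entropy on the plain branch plus KL distillation from the
        attention-refined branch (one common self-distillation recipe)."""
        ce = F.cross_entropy(student_logits, labels)
        kd = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(refined_logits.detach() / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        return (1 - alpha) * ce + alpha * kd


    # Usage (shapes only): refine an intermediate ResNet feature map, e.g.
    #   feats = backbone_stage3(images)            # (B, 256, 14, 14)
    #   refined = AdaptiveMixedAttention(256)(feats)
    # then feed both the plain and refined branches to the classifier head
    # and combine their outputs with self_distillation_loss.

The per-sample softmax gate is one simple way to make the mixing "adaptive" to the feature map; the paper may well use a different weighting or apply the attention at several network stages.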