
Enhancing the Generalization Performance of Few-Shot Image Classification with Self-Knowledge Distillation

Liang LI1, Weidong JIN1,2, Yingkun HUANG3, Junxiao REN1*
1 School of Electrical Engineering, Southwest Jiaotong University, Chengdu 611756, China
liangli@my.swjtu.edu.cn, wdjin@home.swjtu.edu.cn
rjx19910911@my.swjtu.edu.cn (*Corresponding author)
2 ASEAN International Joint Laboratory of Integrated Transportation, Nanning University,
Nanning City, Guangxi Province, China
3 National Supercomputing Center in Shenzhen (Shenzhen Cloud Computing Center),
Shenzhen 518055, China
hykun@live.com

Abstract: Although deep learning has succeeded in various fields, its performance on tasks that lack a large-scale dataset is often unsatisfactory. Meta-learning-based few-shot learning has been employed to address this limited-data situation. Owing to its fast adaptation to new concepts, meta-learning makes full use of prior transferable knowledge to recognize unseen instances. The general belief is that meta-learning leverages a large number of few-shot tasks sampled from the base dataset to quickly adapt the learner to an unseen task. In this paper, a teacher model is distilled to transfer features to a student with the same architecture. Following the standard setting in few-shot learning, the proposed model was trained from scratch, and the teacher's output distribution was transferred to improve generalization. Feature similarity matching is proposed to compensate for the inner feature similarities, and the teacher model's predictions are further corrected during the self-knowledge distillation stage. The proposed approach was evaluated on several commonly used few-shot learning benchmarks and performed best among all prior works.
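As a hedged illustration of the two ideas the abstract names, the sketch below combines a standard self-knowledge-distillation objective (a frozen teacher with the same architecture providing soft targets) with one possible reading of feature similarity matching (aligning batch-wise feature-similarity matrices). The PyTorch framing, function names, and hyperparameters are assumptions for illustration, not the authors' exact formulation.

```python
# Minimal sketch of self-knowledge distillation, assuming a PyTorch setup.
# The teacher is a frozen copy of the student (same architecture); all
# names and hyperparameters below are illustrative assumptions.
import torch
import torch.nn.functional as F

def self_distillation_loss(student_logits, teacher_logits, labels,
                           temperature=4.0, alpha=0.5):
    """Cross-entropy on hard labels plus temperature-scaled KL divergence
    toward the teacher's soft predictions."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1.0 - alpha) * ce + alpha * kd

def similarity_matching_loss(student_feats, teacher_feats):
    """One reading of 'feature similarity matching': match the pairwise
    cosine-similarity matrices of the two feature batches (shape: batch x dim)."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    return F.mse_loss(s @ s.t(), t @ t.t())
```

In a training loop, the teacher's forward pass would run under `torch.no_grad()` and the two losses would be summed with a weighting coefficient; that coefficient and the temperature are tuning choices not specified in the abstract.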

Keywords: Self-knowledge distillation, Meta-learning, Few-shot learning, Similarity matching.


Cite this paper as:
Liang LI, Weidong JIN, Yingkun HUANG, Junxiao REN, "Enhancing the Generalization Performance of Few-Shot Image Classification with Self-Knowledge Distillation", Studies in Informatics and Control, ISSN 1220-1766, vol. 31(2), pp. 71-80, 2022. https://doi.org/10.24846/v31i2y202207