Lightweight Knowledge Distillation for Few-shot Learning
Author:
Affiliation:

Author biography:

Corresponding author:

CLC number: TP18

Fund projects: National Natural Science Foundation of China (62106100, 62192783, 62276128); Natural Science Foundation of Jiangsu Province (BK20221441); Jiangsu Shuangchuang Doctor Program (JSSCBS20210021)




    Abstract:

    Few-shot learning aims to mimic the human ability to learn new concepts quickly from only a few examples, which is of great significance for deep learning tasks where samples are scarce. However, in many real-world tasks with limited computing resources, model size may still restrict the wider application of few-shot learning, which creates a practical demand for lightweight few-shot learning models. As a widely used auxiliary strategy in deep learning, knowledge distillation transfers knowledge between models through additional supervision signals and has practical applications in both improving model accuracy and compressing model size. This study first verifies the effectiveness of the knowledge distillation strategy for building lightweight few-shot learning models. Then, according to the characteristics of few-shot learning tasks, two new distillation methods are designed: (1) distillation based on local image features; (2) distillation based on auxiliary classifiers. Experiments on the miniImageNet and TieredImageNet datasets demonstrate that the proposed distillation methods are significantly superior to traditional knowledge distillation on few-shot learning tasks. The source code is available at https://github.com/cjy97/FSLKD.
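The "traditional knowledge distillation" the abstract uses as a baseline is, in its standard form, a KL-divergence loss between temperature-softened teacher and student class distributions. Below is a minimal NumPy sketch of that generic baseline loss; it is illustrative only, not the paper's implementation, and the function names and temperature value are our own choices.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Standard KD loss: KL(teacher || student) on T-softened distributions,
    scaled by T^2 so gradient magnitudes stay comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels; the loss is zero when student and teacher agree and grows as their softened predictions diverge.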

Cite this article:

陈嘉言, 任东东, 李文斌, 霍静, 高阳. Lightweight knowledge distillation for few-shot learning. 软件学报 (Journal of Software): 1–16.

History
  • Received: 2022-11-08
  • Revised: 2022-12-15
  • Accepted:
  • Online: 2023-09-27
  • Published:
Copyright: Institute of Software, Chinese Academy of Sciences