Survey on Few-shot Learning
Author: Zhao Kailin, Jin Xiaolong, Wang Yuanzhuo
Affiliation:

Fund Project: National Key Research and Development Program of China; National Natural Science Foundation of China (U1836206, 61772501, 61572473, 61572469)

    Abstract:

    Few-shot learning aims to train models that can solve problems from only a small number of samples. In recent years, driven by the trend of training models on big data, machine learning and deep learning have achieved success in many fields. In many real-world application scenarios, however, there is no large amount of data, or of labeled data, available for model training, and labeling large numbers of unlabeled samples costs substantial manpower. How to learn from a small number of samples has therefore become a pressing problem. This paper systematically reviews current approaches to few-shot learning. It introduces representative models in three categories: fine-tuning based, data augmentation based, and transfer learning based approaches. The data augmentation based approaches are further subdivided into those based on unlabeled data, on data generation, and on feature augmentation; the transfer learning based approaches are subdivided into metric learning based, meta-learning based, and graph neural network based methods. The paper then summarizes the standard few-shot datasets and the experimental results of the aforementioned models, discusses the current state and open challenges of few-shot learning, and finally outlines prospective directions for its future development.
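    To make the few-shot setting concrete, the sketch below illustrates the episodic N-way K-shot setup used by the metric-learning family surveyed above (in the style of prototypical networks): each class is represented by the mean of its few support embeddings, and queries are classified by the nearest prototype. This is a minimal illustrative sketch on synthetic clustered vectors, not an implementation from any specific paper; in practice the embeddings would come from a learned network.

```python
import numpy as np

rng = np.random.default_rng(0)

def prototypical_classify(support, support_labels, query, n_way):
    """Classify query embeddings by their nearest class prototype.

    support:        (n_way * k_shot, d) embeddings of labeled support examples
    support_labels: (n_way * k_shot,)   integer class ids in [0, n_way)
    query:          (n_query, d)        embeddings to classify
    Returns predicted class ids of shape (n_query,).
    """
    # Prototype = mean embedding of each class's support examples.
    prototypes = np.stack(
        [support[support_labels == c].mean(axis=0) for c in range(n_way)]
    )
    # Squared Euclidean distance from every query to every prototype,
    # computed via broadcasting: (n_query, 1, d) - (1, n_way, d).
    dists = ((query[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

# Toy 3-way 5-shot episode: each class clusters tightly around its own center,
# standing in for the embeddings a trained feature extractor would produce.
n_way, k_shot, d = 3, 5, 8
centers = rng.normal(size=(n_way, d)) * 5.0
support_labels = np.repeat(np.arange(n_way), k_shot)
support = centers[support_labels] + rng.normal(scale=0.1, size=(n_way * k_shot, d))
query_labels = np.repeat(np.arange(n_way), 4)
query = centers[query_labels] + rng.normal(scale=0.1, size=(len(query_labels), d))

pred = prototypical_classify(support, support_labels, query, n_way)
print((pred == query_labels).mean())  # accuracy is high on well-separated clusters
```

    The same episodic structure underlies the matching-network and relation-network variants discussed in the survey; what changes is how the comparison between query and support examples is computed (cosine attention, a learned relation module, etc.) rather than the nearest-prototype rule used here.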

Citation: Zhao KL, Jin XL, Wang YZ. Survey on few-shot learning. Journal of Software, 2021, 32(2): 349-369 (in Chinese with English abstract).
History
  • Received: October 09, 2019
  • Revised: January 01, 2020
  • Online: September 10, 2020
  • Published: February 06, 2021