Online Class Incremental Contrastive Learning Based on Incremental Mixup-induced Universum
Author: LIU Yuwei, CHEN Songcan

CLC Number: TP18

    Abstract:

    Online class-incremental learning aims to learn new classes effectively from data streams while keeping the model within small-cache and small-batch constraints. However, owing to the one-pass nature of data streams, the class information in a small batch cannot be exploited through repeated passes as in offline learning. To alleviate this problem, current studies combine multiple data augmentations with contrastive learning for model training. Nevertheless, under the small-cache and small-batch constraints, existing methods that select and store data randomly struggle to obtain diverse negative samples, which restricts the discriminability of the model. Previous studies have shown that hard negative samples are the key to improving contrastive learning performance, but they are rarely explored in online learning scenarios. The Universum data introduced in traditional Universum learning, i.e., samples that belong to none of the target classes, provides a simple yet intuitive way to construct hard negative samples. Specifically, our earlier work proposed the mixup-induced Universum (MIU), generated with appropriately chosen mixing coefficients, which effectively improves the performance of offline contrastive learning. Inspired by this, this study introduces MIU into online scenarios. Unlike the previously statically generated Universum, the data stream setting brings additional challenges. First, as the number of classes keeps growing, the conventional approach of statically generating the Universum from globally given classes becomes inapplicable, so the Universum must be redefined and generated dynamically. Accordingly, this study recursively generates the MIU with maximum entropy relative to the classes seen so far (incremental MIU, IMIU) and equips it with an additional small cache to satisfy the overall memory limit. Second, the generated IMIU is mixed up again with the positive samples in a small batch to produce diverse and high-quality hard negative samples. Finally, combining the above steps yields the IMIU-based contrastive learning (IUCL) algorithm. Comparison experiments on the standard datasets CIFAR-10, CIFAR-100, and Mini-ImageNet verify the effectiveness of the proposed algorithm.
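The two generation steps described in the abstract — recursively folding each newly seen class into a maximum-entropy Universum, then mixing that Universum with batch positives — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names (`update_imiu`, `hard_negatives`), the recursive coefficient 1/k (which keeps the mixture uniform, hence maximum-entropy, over the k classes seen so far), and the mixing weight `mu` are all assumptions made for this sketch.

```python
import numpy as np

def update_imiu(u_prev, x_new, k):
    """Recursively fold a sample of the k-th newly seen class into the
    Universum so the mixture stays uniform (maximum entropy) over all
    k seen classes:  u_k = (1 - 1/k) * u_{k-1} + (1/k) * x_new.
    """
    if u_prev is None:          # first class: Universum starts as the sample itself
        return x_new.copy()
    lam = 1.0 / k               # illustrative coefficient keeping the mixture uniform
    return (1.0 - lam) * u_prev + lam * x_new

def hard_negatives(x_pos, universum, mu=0.6):
    """Mix batch positives with the IMIU again to produce hard negatives
    that stay close to the positives yet belong to no seen class.
    `mu` is an assumed mixing weight."""
    return mu * x_pos + (1.0 - mu) * universum

# Toy stream: three class prototypes arriving one at a time.
protos = [np.array([1.0, 0.0, 0.0]),
          np.array([0.0, 1.0, 0.0]),
          np.array([0.0, 0.0, 1.0])]
u = None
for k, x in enumerate(protos, start=1):
    u = update_imiu(u, x, k)
# u is now the uniform mixture [1/3, 1/3, 1/3] of the three prototypes.
negs = hard_negatives(protos[0], u)
```

In an actual online run, `update_imiu` would be invoked whenever a new class appears in the stream, the resulting IMIU samples would live in the small auxiliary cache mentioned in the abstract, and `hard_negatives` would be applied per small batch before computing the contrastive loss.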

Get Citation

Liu YW, Chen SC. Online class incremental contrastive learning based on IMIU. Ruan Jian Xue Bao/Journal of Software, 2024, 35(12): 5544-5557 (in Chinese).

History
  • Received: July 25, 2023
  • Revised: September 16, 2023
  • Online: June 12, 2024
  • Published: December 06, 2024
Copyright: Institute of Software, Chinese Academy of Sciences Beijing ICP No. 05046678-4