Confidence-weighted Learning for Feature Evolution
Author: Liu YF, Li WB, Gao Y

Abstract:

    In contrast to traditional online learning over a fixed feature space, feature evolvable learning assumes that features do not vanish or appear in an arbitrary way; instead, old features disappear and new features emerge at the same time, as the hardware devices that gather them are exchanged. However, existing feature evolvable algorithms exploit only the first-order information of the data streams, ignoring the second-order information that captures correlations between features and can significantly improve classification performance. To address this problem, a confidence-weighted learning for feature evolution (CWFE) algorithm is proposed. First, second-order confidence-weighted learning is introduced to update the prediction model on the data stream. Next, to benefit the learned model, a linear mapping is learned during the overlap period to recover the old features. The existing model is then updated with the recovered old features while, at the same time, a new prediction model is learned on the new features. Furthermore, two ensemble methods are introduced to combine these two models. Finally, empirical studies show superior performance over state-of-the-art feature evolvable algorithms.
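    The pipeline described above can be illustrated with a minimal Python sketch. This is not the authors' implementation: the second-order confidence-weighted step is shown here as an AROW-style update (one common variant in the confidence-weighted family), and all function names and the regularization parameter r are hypothetical.

    ```python
    import numpy as np

    def cw_update(mu, sigma, x, y, r=1.0):
        """One AROW-style second-order update of the Gaussian weight
        distribution N(mu, sigma); y is a label in {-1, +1}."""
        v = x @ sigma @ x            # variance (confidence) of the margin
        m = y * (mu @ x)             # signed margin on this example
        loss = max(0.0, 1.0 - m)     # hinge loss
        if loss > 0.0:
            beta = 1.0 / (v + r)
            alpha = loss * beta
            mu = mu + alpha * y * (sigma @ x)
            sigma = sigma - beta * np.outer(sigma @ x, x @ sigma)
        return mu, sigma

    def learn_recovery_map(X_new, X_old):
        """Least-squares linear mapping M with X_old ~ X_new @ M, fitted
        on the examples observed during the overlap period."""
        M, *_ = np.linalg.lstsq(X_new, X_old, rcond=None)
        return M

    def ensemble_predict(score_old, score_new, w_old, w_new):
        """Weighted vote of the model kept alive on recovered old
        features and the model trained on the new features."""
        return (w_old * score_old + w_new * score_new) / (w_old + w_new)
    ```

    After the feature space switches, each new instance x_new would be mapped back via x_new @ M, so the old model keeps receiving inputs in its own feature space while a second model is trained from scratch on the new features; the two scores are then combined by a weighted ensemble.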

Get Citation

Liu YF, Li WB, Gao Y. Confidence-weighted learning for feature evolution. Ruan Jian Xue Bao/Journal of Software, 2022, 33(4): 1315-1325 (in Chinese with English abstract).
History
  • Received: May 31, 2021
  • Revised: July 16, 2021
  • Online: October 26, 2021
  • Published: April 06, 2022
Copyright: Institute of Software, Chinese Academy of Sciences