Selective Boosting Algorithm for Maximizing the Soft Margin
Author: FANG Yu-Ke, FU Yan, ZHOU Jun-Lin, SHE Li, SUN Chong-Jing
    Abstract:

Research on traditional boosting algorithms mainly focuses on maximizing the hard or soft margin of the convex combination of weak hypotheses. All of the weak learners are typically used in the combination, even though some of them are strongly correlated with others. This increases the time complexity of both training and testing the hypotheses. To reduce the redundancy among the base hypotheses, this paper presents a selective boosting algorithm, called SelectedBoost, for classifying binary-labeled samples; it is based on LPBoost. The main idea of the algorithm is to discard as many hypotheses as possible according to their relevance and diversity. Furthermore, this paper introduces an edge constraint for every strong hypothesis to speed up convergence when maximizing the soft margin of the combination of weak hypotheses. The experimental results show that this algorithm achieves both better performance and lower generalization error than several representative boosting algorithms.
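The selection step described above — keeping weak hypotheses that are accurate but discarding those too similar to hypotheses already retained — can be illustrated with a minimal sketch. This is not the paper's SelectedBoost algorithm (which solves an LPBoost-style linear program with an additional edge constraint); it is only a hedged, greedy approximation of relevance/diversity pruning, with the function name `diversity_prune` and the agreement threshold `max_agree` chosen here for illustration.

```python
import numpy as np

def diversity_prune(preds, y, max_agree=0.9):
    """Greedy relevance/diversity pruning (illustrative, not SelectedBoost).

    preds: (n_hypotheses, n_samples) array of {-1, +1} predictions.
    y:     (n_samples,) array of {-1, +1} true labels.

    Hypotheses are visited in decreasing order of their edge
    (mean margin y * h(x), a relevance measure); a hypothesis is
    discarded if its prediction agreement with any already-kept
    hypothesis reaches max_agree (a redundancy measure).
    Returns the indices of the retained hypotheses.
    """
    edges = (preds * y).mean(axis=1)      # per-hypothesis edge on the sample
    order = np.argsort(-edges)            # most relevant first
    kept = []
    for i in order:
        # agreement rate in [0, 1] with each retained hypothesis
        if all((preds[i] == preds[j]).mean() < max_agree for j in kept):
            kept.append(int(i))
    return kept

# Hypothesis 1 duplicates hypothesis 0, so it is pruned as redundant;
# hypothesis 2 is weaker but diverse, so it is retained.
preds = np.array([[ 1,  1, -1, -1],
                  [ 1,  1, -1, -1],
                  [ 1, -1,  1, -1]])
y = np.array([1, 1, -1, -1])
print(diversity_prune(preds, y))  # → [0, 2]
```

A full implementation would re-weight the retained hypotheses by solving the soft-margin linear program, rather than combining them uniformly.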

Get Citation

FANG Yu-Ke, FU Yan, ZHOU Jun-Lin, SHE Li, SUN Chong-Jing. Selective Boosting Algorithm for Maximizing the Soft Margin. Journal of Software, 2012,23(5):1132-1147

History
  • Received: September 10, 2010
  • Revised: May 18, 2011
  • Online: April 29, 2012
Copyright: Institute of Software, Chinese Academy of Sciences