    Abstract:

    To improve the predictive accuracy of inductive learning, this paper analyzes the shortcomings of C4.5 in detail and explains why there has been so much debate, and so much compromise, between the standard method and meta-algorithms. By estimating the probability distribution of the training examples, a new and simple decision-tree method is derived. Experimental results on UCI data sets show that the proposed method achieves good accuracy and runs faster than the C4.5 algorithm.
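    The abstract contrasts the proposed sample-distribution criterion with C4.5's entropy-based split criterion, whose details it does not reproduce. As background, here is a minimal sketch of the baseline being criticized, C4.5's gain ratio; the function names are illustrative, and the paper's own distribution-estimation method is not specified in this abstract.

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy (in bits) of a multiset of class labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def gain_ratio(labels, groups):
        """C4.5's split criterion: information gain divided by split information.

        `labels` is the label list at the node; `groups` is the partition of
        those labels induced by a candidate split.
        """
        n = len(labels)
        gain = entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)
        # Split information: entropy of the branch sizes themselves,
        # which penalizes splits with many small branches.
        split_info = entropy([i for i, g in enumerate(groups) for _ in g])
        return gain / split_info if split_info > 0 else 0.0
    ```

    For example, a split that separates `['a', 'a', 'b', 'b']` perfectly into `[['a', 'a'], ['b', 'b']]` yields a gain ratio of 1.0. The abstract's critique is that ranking splits by such entropy means is what the sample-distribution approach replaces.
    
    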

Citation:

He JS, Zheng HR, Wang XF. From decision by entropy mean to decision by sample distribution. Journal of Software, 2003,14(3):479-483.

History
  • Received: November 14, 2001
  • Revised: April 10, 2002
Copyright: Institute of Software, Chinese Academy of Sciences