    Abstract:

    An on-line structure-learning algorithm for belief networks is proposed. The basic idea is to incrementally update both the structure and the parameters of a belief network after each group of data samples is received. The algorithm consists of two steps. The first step updates the current belief network on the newly received data samples using incremental updating rules: a parameter incremental updating rule and three structure incremental updating rules, namely adding an edge, deleting an edge, and reversing an edge. The second step applies a result-selection criterion to choose the most appropriate result from the set of candidates produced by the first step. The selection criterion balances the consistency of the result with the newly received data against the distance between the result and the previous model. Experimental results show that the algorithm can efficiently perform on-line learning of belief network structure. Since on-line learning requires no historical data and can adapt to variation in the problem domain, this algorithm is well suited to modeling domains that change over time.
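    The two-step update described above can be sketched in code. This is a minimal illustration, not the authors' implementation: candidate structures are generated by adding, deleting, or reversing one edge (plus keeping the current structure); the result-selection criterion is approximated as batch log-likelihood minus a structural-distance penalty with an assumed trade-off weight `lam`. All function names, the Laplace-smoothed likelihood, and the symmetric edge difference as "distance" are assumptions made for the sketch.

    ```python
    import itertools
    import math
    from collections import defaultdict

    def is_acyclic(nodes, edges):
        # Kahn's algorithm: a valid belief network must be a DAG.
        indeg = {n: 0 for n in nodes}
        for _, v in edges:
            indeg[v] += 1
        queue = [n for n in nodes if indeg[n] == 0]
        seen = 0
        while queue:
            n = queue.pop()
            seen += 1
            for u, v in edges:
                if u == n:
                    indeg[v] -= 1
                    if indeg[v] == 0:
                        queue.append(v)
        return seen == len(nodes)

    def candidates(nodes, edges):
        # One-step neighbours of the current structure:
        # keep, add an edge, delete an edge, or reverse an edge.
        out = [set(edges)]
        for u, v in itertools.permutations(nodes, 2):
            if (u, v) in edges:
                out.append(set(edges) - {(u, v)})          # delete edge
                rev = set(edges) - {(u, v)} | {(v, u)}
                if is_acyclic(nodes, rev):
                    out.append(rev)                        # reverse edge
            elif (v, u) not in edges:
                add = set(edges) | {(u, v)}
                if is_acyclic(nodes, add):
                    out.append(add)                        # add edge
        return out

    def log_likelihood(nodes, edges, batch):
        # Log-likelihood of the new batch under CPTs estimated from the
        # batch itself (Laplace smoothing; binary variables assumed).
        parents = {n: sorted(u for u, v in edges if v == n) for n in nodes}
        ll = 0.0
        for n in nodes:
            counts = defaultdict(lambda: defaultdict(int))
            for row in batch:
                cfg = tuple(row[p] for p in parents[n])
                counts[cfg][row[n]] += 1
            for row in batch:
                c = counts[tuple(row[p] for p in parents[n])]
                ll += math.log((c[row[n]] + 1) / (sum(c.values()) + 2))
        return ll

    def update_structure(nodes, edges, batch, lam=1.0):
        # Result-selection criterion (sketch): balance fit to the newly
        # received data against distance from the previous model.
        def score(cand):
            dist = len(cand ^ set(edges))  # symmetric edge difference
            return log_likelihood(nodes, cand, batch) - lam * dist
        return max(candidates(nodes, edges), key=score)
    ```

    With a strong dependence in the new batch (for example, one variable copying another), the penalized score favors adding the corresponding edge; with `lam` large, the previous structure is retained, which is the balancing behavior the abstract describes.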

Get Citation

Liu Qi-Yuan, Zhang Cong, Shen Yi-Dong, Wang Cheng-Liang. An on-line structure learning algorithm for belief networks. Journal of Software, 2002, 13(12): 2297-2304.

History
  • Received: February 20, 2001
  • Revised: April 24, 2001