The authors proposed the concepts of reduce probability and derive probability. With these modifications to the inside-outside (I/O) algorithm, the probability parameters of a general context-free grammar can be trained from an unannotated corpus.
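The reduce/derive-probability modification itself is not detailed in this passage, so for orientation only, the sketch below shows the inside pass on which the conventional inside-outside algorithm is built, for a grammar in Chomsky normal form. It is a minimal illustration of the standard algorithm, not the authors' method; the function name, grammar encoding, and toy grammar are illustrative assumptions.

```python
from collections import defaultdict

def inside_probabilities(words, lexical_rules, binary_rules):
    """Inside pass of the standard inside-outside algorithm for a PCFG
    in Chomsky normal form.

    lexical_rules: {(A, terminal): P(A -> terminal)}
    binary_rules:  {(A, B, C): P(A -> B C)}
    Returns beta with beta[(A, i, j)] = P(A derives words[i..j]).
    """
    n = len(words)
    beta = defaultdict(float)

    # Base case: length-1 spans are generated by lexical rules A -> w_i.
    for i, w in enumerate(words):
        for (A, terminal), p in lexical_rules.items():
            if terminal == w:
                beta[(A, i, i)] += p

    # Recurrence: a span [i, j] is split at k and covered by a rule A -> B C.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for (A, B, C), p in binary_rules.items():
                for k in range(i, j):
                    beta[(A, i, j)] += p * beta[(B, i, k)] * beta[(C, k + 1, j)]

    return beta

# Toy grammar (hypothetical, for illustration only).
lexical = {("Det", "the"): 1.0, ("N", "dog"): 0.5, ("N", "cat"): 0.5,
           ("V", "sees"): 1.0}
binary = {("S", "NP", "VP"): 1.0, ("NP", "Det", "N"): 1.0,
          ("VP", "V", "NP"): 1.0}
sentence = "the dog sees the cat".split()
beta = inside_probabilities(sentence, lexical, binary)
print(beta[("S", 0, len(sentence) - 1)])  # sentence probability: 0.25
```

The outside pass and the EM re-estimation of rule probabilities, which the unsupervised training described above relies on, are computed from these inside quantities in the usual way.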