Document Type
Conference Proceeding
Publication Date
2003
Abstract
We describe a unified probabilistic framework for statistical language modeling, the latent maximum entropy principle, which can effectively incorporate various aspects of natural language, such as local word interactions, syntactic structure, and semantic document information. Unlike previous work on maximum entropy methods for language modeling, which allow only explicit features to be modeled, our framework also allows relationships over hidden features to be captured, resulting in a more expressive language model. We describe efficient algorithms for marginalization, inference, and normalization in our extended models. We then present experimental results for our approach on the Wall Street Journal corpus.
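As a brief illustrative sketch (not drawn from the paper itself), the latent maximum entropy principle is commonly stated as the following constrained optimization, where x denotes observed data, y denotes hidden variables, \tilde{p}(x) is the empirical distribution, and the f_i are feature functions; these symbols are assumptions chosen here for exposition:

\[
\max_{p}\; H(p) = -\sum_{x,y} p(x,y)\,\log p(x,y)
\quad \text{subject to} \quad
\sum_{x,y} p(x,y)\, f_i(x,y) \;=\; \sum_{x} \tilde{p}(x) \sum_{y} p(y \mid x)\, f_i(x,y),
\qquad i = 1,\dots,N.
\]

Unlike standard maximum entropy, the constraints here depend on the model itself through the posterior p(y | x) over hidden variables, which is what lets relationships over hidden features enter the model.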
Repository Citation
Wang, S., Schuurmans, D., Peng, F., & Zhao, Y. (2003). Semantic N-gram Language Modeling with the Latent Maximum Entropy Principle. Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, 1, I-376–I-379.
https://corescholar.libraries.wright.edu/knoesis/279
DOI
10.1109/ICASSP.2003.1198796
Included in
Bioinformatics Commons, Communication Technology and New Media Commons, Databases and Information Systems Commons, OS and Networks Commons, Science and Technology Studies Commons
Comments
This paper was presented at the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) in 2003.