N-gram language models

Scientific report: "Improved Smoothing for N-gram Language Models Based on Ordinary Counts" (doc)

... new method eliminating most of the gap between Kneser-Ney and those methods. 1 Introduction Statistical language models are potentially useful for any language technology task that produces natural-language ... currently be the best approach when language models based on ordinary counts are desired. References Chen, Stanley F., and Joshua Goodman. 1998. An empirical study of smoothing techniques for language ... a given context cannot be increased just by discounting observed counts, as long as all N-grams with the same count receive the same discount. Interpolation models address quantization error...
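
The snippet contrasts discounting observed counts with interpolation. As an illustrative sketch of the general idea (interpolated absolute discounting, not the paper's own method; the constant d = 0.75 and the toy corpus are assumptions), a bigram model can subtract a fixed discount from each observed count and give the freed mass to the unigram distribution:

```python
from collections import Counter

def train_bigram_absolute_discount(tokens, d=0.75):
    """Interpolated absolute discounting for a bigram model (illustrative sketch).

    Every observed bigram count is discounted by a constant d, and the freed
    mass lambda(h) is redistributed via the unigram distribution, so unseen
    continuations of a context get nonzero probability.
    """
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    total = len(tokens)

    def prob(w, h):
        c_h = unigrams[h]
        if c_h == 0:                      # unseen context: back off entirely
            return unigrams[w] / total
        n_types = sum(1 for (a, _) in bigrams if a == h)  # distinct continuations of h
        lam = d * n_types / c_h           # probability mass freed by discounting
        return max(bigrams[(h, w)] - d, 0) / c_h + lam * unigrams[w] / total

    return prob

prob = train_bigram_absolute_discount("a b a b a c".split())
# for any seen context h, the probabilities over the vocabulary sum to 1
```

Because the same constant d is subtracted from every observed count, this sketch also illustrates the snippet's point: the mass assigned to unseen words can only come from a uniform per-count discount unless the scheme is refined.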

Uploaded: 20/02/2014, 09:20
Scientific report: "Faster and Smaller N-Gram Language Models" (pptx)

... Each node in the tree encodes a word, and paths in the tree correspond to n-grams in the collection. Tries ensure that each n-gram prefix is represented only once, and are very efficient when n-grams ... Methods in Natural Language Processing. Marcello Federico and Mauro Cettolo. 2007. Efficient handling of n-gram language models for statistical machine translation. In Proceedings of the Second Workshop ... Catalog Number LDC2006T13. Thorsten Brants, Ashok C. Popat, Peng Xu, Franz J. Och, and Jeffrey Dean. 2007. Large language models in machine translation. In Proceedings of the Conference on Empirical...
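
The trie layout described in the snippet can be sketched with nested dictionaries (a minimal illustration, not any toolkit's actual representation; the reserved "#count" key is an assumption of this sketch):

```python
def build_ngram_trie(sentences, n=3):
    """Store n-grams in a nested-dict trie (illustrative sketch).

    Each key is a word; a path from the root spells out an n-gram, so a
    shared prefix such as "the cat" is stored only once however many
    n-grams extend it. Counts live under the reserved key "#count".
    """
    root = {}
    for sent in sentences:
        words = sent.split()
        for i in range(len(words) - n + 1):
            node = root
            for w in words[i:i + n]:          # walk/extend the path for this n-gram
                node = node.setdefault(w, {})
            node["#count"] = node.get("#count", 0) + 1
    return root

def lookup(trie, ngram):
    """Return the stored count of an n-gram, or 0 if absent."""
    node = trie
    for w in ngram:
        if w not in node:
            return 0
        node = node[w]
    return node.get("#count", 0)

trie = build_ngram_trie(["the cat sat", "the cat ran"], n=2)
# lookup(trie, ["the", "cat"]) -> 2
```

The prefix sharing is what makes tries compact for n-gram collections: both sentences above contribute the bigram "the cat", but the path for it exists only once.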

Uploaded: 07/03/2014, 22:20
Scientific report: "A Succinct N-gram Language Model" (ppt)

... Compressing trigram language models with Golomb coding. In Proc. of EMNLP-CoNLL 2007. O. Delpratt, N. Rahman, and R. Raman. 2006. Engineering the LOUDS succinct tree representation. In Proc. ... N-gram counts. By using 8-bit floating point quantization, N-gram language models are compressed into 10 GB, which is comparable to a lossy representation (Talbot and Brants, 2008). 2 N-gram ... N-gram compression tasks achieved a significant compression rate without any loss. 1 Introduction There has been an increase in available N-gram data and a large amount of web-scaled N-gram data...

Uploaded: 20/02/2014, 09:20
Scientific report: "Enhancing Language Models in Statistical Machine Translation with Backward N-grams and Mutual Information Triggers" (ppt)

... language models in machine translation. In Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning (EMNLP-CoNLL), ... from training data to enhance conventional n-gram language models and extend their ability to capture richer contexts and long-distance dependencies. In particular, we integrate backward n-grams and ... information (MI) triggers into language models in SMT. In conventional n-gram language models, we look at the preceding n − 1 words when calculating the probability of the current word. We henceforth...
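
A backward n-gram conditions a word on the words that follow it rather than precede it. One minimal way to sketch this (an assumption of this sketch, not the paper's implementation) is to count bigrams over reversed sentences, so an ordinary forward count table yields the backward statistics:

```python
from collections import Counter

def backward_bigram_counts(sentences):
    """Backward bigrams via sentence reversal (illustrative sketch).

    Counting bigrams over each reversed sentence records, for every word,
    which word FOLLOWS it in the original order -- the statistics a
    backward bigram model needs.
    """
    counts = Counter()
    for sent in sentences:
        rev = sent.split()[::-1]
        counts.update(zip(rev, rev[1:]))
    return counts

def backward_prob(counts, w, next_word):
    """P(w | next_word): unsmoothed probability of w given the word after it."""
    denom = sum(c for (h, _), c in counts.items() if h == next_word)
    return counts[(next_word, w)] / denom if denom else 0.0

counts = backward_bigram_counts(["the cat sat", "a cat ran"])
# backward_prob(counts, "the", "cat") -> 0.5  (half the "cat" tokens follow "the")
```

In practice a backward model would be smoothed like any forward model; reversal just lets the same training machinery be reused.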

Uploaded: 07/03/2014, 22:20
Scientific report: "Creating Robust Supervised Classifiers via Web-Scale N-gram Data" (pdf)

... Workshop on Natural Language Generation. Natalia N. Modjeska, Katja Markert, and Malvina Nissim. 2003. Using the Web in machine learning for other-anaphora resolution. In EMNLP. Preslav Nakov and ... Search engine statistics beyond the n-gram: Application to noun compound bracketing. In CoNLL. Preslav Ivanov Nakov. 2007. Using the Web as an Implicit Training Set: Application to Noun Compound Syntax ... Large-scale supervised models for noun phrase bracketing. In PACLING. Xiaofeng Yang, Jian Su, and Chew Lim Tan. 2005. Improving pronoun resolution using statistics-based semantic compatibility information. In ACL. ...

Uploaded: 20/02/2014, 04:20
Scientific report: "Incremental Syntactic Language Models for Phrase-based Translation" (pptx)

... 2009. Quadratic-time dependency parsing for machine translation. In Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language ... Language modeling with tree substitution grammars. In NIPS workshop on Grammar Induction, Representation of Language, and Language Learning. Arjen Poutsma. 1998. Data-oriented translation. In Ninth ... Henderson. 2004. Lookahead in deterministic left-corner parsing. In Proceedings of the Workshop on Incremental Parsing: Bringing Engineering and Cognition Together, pages 26–33. Liang Huang and...

Uploaded: 20/02/2014, 04:20
Scientific report: "The impact of language models and loss functions on repair disfluency detection" (pptx)

... of Training Corpus Size on Classifier Performance for Natural Language Processing. In Proceedings of the First International Conference on Human Language Technology Research. Eugene Charniak and Mark ... impact of language models and loss functions on repair disfluency detection. Simon Zwarts and Mark Johnson, Centre for Language Technology, Macquarie University, {simon.zwarts|mark.johnson}@mq.edu.au. Abstract Unrehearsed ... 2002. SRILM - An Extensible Language Modeling Toolkit. In Proceedings of the International Conference on Spoken Language Processing, pages 901–904. Qi Zhang, Fuliang Weng, and Zhe Feng. 2006. ...

Uploaded: 20/02/2014, 04:20
Scientific report: "An Empirical Investigation of Discounting in Cross-Domain Language Models" (ppt)

... since frequent n-grams will appear in more of the rejected sentences, and nonuniform discounting over n-grams of each count, since the sentences are chosen according to a likelihood criterion. Although we ... median (1.0) are very different. constant function of n-gram counts. In Figure 2, we investigate the second assumption, namely that the distribution over discounts for a given n-gram count is ... divergence yields higher discounting. One explanation for the remaining variance is that the trigram discount curve depends on the difference between the number of bigram types in the train and...

Uploaded: 20/02/2014, 04:20
Scientific report: "Reading Level Assessment Using Support Vector Machines and Statistical Language Models" (pdf)

... text and “adult” text. This resulted in 12 LM perplexity features per article based on trigram, bigram and unigram LMs trained on Britannica (adult), Britannica Elementary, CNN (adult) and CNN ... based on the observed frequency in a training corpus and smoothed using modified Kneser-Ney smoothing (Chen and Goodman, 1999). We used the SRI Language Modeling Toolkit (Stolcke, 2002) for language ... Engineering, University of Washington, Seattle, WA 98195-2500, mo@ee.washington.edu. Abstract Reading proficiency is a fundamental component of language competency. However, finding topical texts at an...

Uploaded: 20/02/2014, 15:20
Scientific report: "A Unified Framework for Automatic Evaluation using N-gram Co-Occurrence Statistics" (pptx)

... P(n) = [ Σ_{C ∈ {Candidates}} Σ_{ngram ∈ C} Count_clip(ngram) ] / [ Σ_{C ∈ {Candidates}} Σ_{ngram ∈ C} Count(ngram) ], where Count(ngram) is the number of n-gram counts, and Count_clip(ngram) is the maximum number of co-occurrences of ngram ... R(n) = [ Σ_{R ∈ {References}} Σ_{ngram ∈ R} Count_clip(ngram) ] / [ Σ_{R ∈ {References}} Σ_{ngram ∈ R} Count(ngram) ], where, as before, Count(ngram) is the number of n-gram counts, and Count_clip(ngram) is the maximum number of co-occurrences of ngram in the reference answer and its corresponding ... using ST and eliminating the unigrams found in SW. We therefore define a recall score as: R(n) = [ Σ_{R ∈ {References}} Σ_{ngram ∈ R} Count_clip(ngram) ] / [ Σ_{R ∈ {References}} Σ_{ngram ∈ R} Count(ngram) ] ...
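
The clipped counts Count_clip(ngram) in the snippet can be sketched directly (a minimal illustration with one reference per candidate; real evaluation metrics such as BLEU combine several n-gram orders and multiple references):

```python
from collections import Counter

def clipped_ngram_precision(candidates, references, n=1):
    """Modified (clipped) n-gram precision (illustrative sketch).

    Count(ngram) is the raw count in the candidate; Count_clip(ngram) caps
    it at the count of that n-gram in the corresponding reference, so a
    candidate cannot score well by repeating one correct word many times.
    """
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    clipped = total = 0
    for cand, ref in zip(candidates, references):
        c_counts, r_counts = ngrams(cand.split()), ngrams(ref.split())
        for g, c in c_counts.items():
            clipped += min(c, r_counts.get(g, 0))  # Count_clip(ngram)
            total += c                             # Count(ngram)
    return clipped / total if total else 0.0

# "the the the" against reference "the cat": only one "the" is credited,
# so the clipped unigram precision is 1/3 rather than 3/3.
```

Swapping the roles of candidates and references in the outer loop gives the recall score R(n) of the snippet.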

Uploaded: 20/02/2014, 16:20
Scientific report: "Generating statistical language models from interpretation grammars in dialogue systems" (potx)

... Contextual Information and Specific Language Models for Spoken Language Understanding. In Proceedings of SPECOM’97, Cluj-Napoca, Romania, pp. 51–56. Bangalore S. and Johnston M. 2004. Balancing ... of generating Nuance recognition grammars from the interpretation grammar and the possibility of generating corpora from the grammars. The interpretation grammar for the domain, written in GF, ... falling off in in-grammar performance. It is interesting that the SLM that only models the grammar (MP3GFLM), although being more robust and giving a significant reduction in WER, does not degrade...

Uploaded: 22/02/2014, 02:20
Scientific report: "Web augmentation of language models for continuous speech recognition of SMS text messages" (docx)

... chunk the in-domain text into n-gram “islands” consisting of only content words and excluding frequently occurring stop words. An island such as “stock fund portfolio” is then extended by adding ... On growing and pruning Kneser-Ney smoothed n-gram models. IEEE Transactions on Audio, Speech and Language Processing, 15(5):1617–1624. A. Stolcke. 1998. Entropy-based pruning of backoff language ... documents containing an x % relevant n-gram, x % are relevant. When the n-grams have been ranked into a presumed order of relevance, we decide that the most relevant n-gram is 100 % relevant and...
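
The island chunking described in the snippet amounts to splitting text on stop words and keeping the runs of content words in between. A minimal sketch (the stopword list, the multi-word filter, and all names here are assumptions of this sketch, not the paper's procedure):

```python
def extract_islands(text, stopwords):
    """Chunk text into content-word "islands" (illustrative sketch).

    Stop words act as separators: each maximal run of consecutive
    content words becomes an island. Only multi-word runs are kept here,
    since single words make poor web-query seeds.
    """
    islands, current = [], []
    for word in text.lower().split():
        if word in stopwords:
            if len(current) > 1:
                islands.append(" ".join(current))
            current = []
        else:
            current.append(word)
    if len(current) > 1:                  # flush a trailing island
        islands.append(" ".join(current))
    return islands

stops = {"the", "a", "of", "in", "my", "is", "and", "to"}
extract_islands("the stock fund portfolio of my bank is growing", stops)
# -> ["stock fund portfolio"]
```

Each island would then be extended with neighboring words and sent as a web query, as the snippet goes on to describe.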

Uploaded: 22/02/2014, 02:20
Scientific report: "The use of formal language models in the typology of the morphology of Amerindian languages" (potx)

... two native Argentinean languages ... expressive power equal to the noncyclic components of generative grammars representing the morphophonology of natural languages. However, these works make no considerations ... regular grammars for modeling agglutination in this language, but first we will present the former class of languages and its acceptor automata. 3.1 Linear context free languages and two-taped nondeterministic ... natural representation in terms of linear context-free languages. 2 Quichua Santiagueño The Quichua santiagueño is a language of the Quechua language family. It is spoken in the Santiago del...

Uploaded: 07/03/2014, 22:20
Scientific report: "An Efficient Indexer for Large N-Gram Corpora" (docx)

... Melbourne, Australia. R. Kneser and H. Ney. 1995. Improved backing-off for n-gram language modeling. In 1995 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-95), ... corpus of (Brants and Franz, 2006), present a major challenge. The language models built from these sets cannot fit in memory, hence efficient accessing of the N-gram frequencies becomes an issue. ... definition, each internal node except the root can have any number of keys in the range [v, 2v], and the root must have at least one key. Finally, an internal node with k keys has k + 1 children. 4.2 ...
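
The node invariants quoted in the snippet can be expressed as a small validity check. The dict layout with "keys" and "children" is a made-up representation for this sketch, not the indexer's actual on-disk format:

```python
def check_btree_node(node, v, is_root=False):
    """Check the key/child invariants quoted above (illustrative sketch).

    `node` is assumed to be a dict with a sorted "keys" list and a
    "children" list (empty for a leaf). Per the quoted definition, a
    non-root internal node holds between v and 2v keys, the root holds at
    least one key, and an internal node with k keys has k + 1 children.
    """
    k = len(node["keys"])
    if node["children"]:                      # internal node
        if is_root:
            if k < 1:
                return False
        elif not (v <= k <= 2 * v):
            return False
        if len(node["children"]) != k + 1:    # k keys => k + 1 children
            return False
    return node["keys"] == sorted(node["keys"])

leaf = {"keys": [1, 2], "children": []}
root = {"keys": [5], "children": [leaf, {"keys": [7, 8], "children": []}]}
# check_btree_node(root, v=2, is_root=True) -> True
```

A full validator would recurse over the children and also check that keys separate the child ranges; this sketch covers only the counting invariants the snippet states.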

Uploaded: 07/03/2014, 22:20
Scientific report: "Randomized Language Models via Perfect Hash Functions" (pptx)

... Encoding an n-gram's value in the array. function for a given set of n-grams is a significant challenge described in the following sections. 3.3 Encoding n-grams in the model All addresses in ... IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2007, Hawaii, USA. J. Goodman and J. Gao. 2000. Language model size reduction by pruning and clustering. In ICSLP’00, ... to finding an ordered matching in a bipartite graph whose LHS nodes correspond to n-grams in S and RHS nodes correspond to locations in A. The graph initially contains edges from each n-gram to...

Uploaded: 08/03/2014, 01:20
