References
1. J. R. Bellegarda, “Statistical language model adaptation: Review and perspectives,” Speech Communication, vol. 42, no. 1, pp. 93–108, 2004.
2. Y.-Y. Wang, M. Mahajan, and X. Huang, “A unified context-free grammar and n-gram model for spoken language processing,” in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. III, Istanbul, Turkey, pp. 1639–1642, 2000.
TENDENCIES OF DEVELOPMENT SCIENCE AND PRACTICE
3. L. Zhou and D. Zhang, “NLPIR: a theoretical framework for applying natural language processing to information retrieval,” J. Am. Soc. Inf. Sci. Technol., vol. 54, no. 2, pp. 115–123, 2003.
4. T. Mikolov, K. Chen, G. Corrado, and J. Dean, “Efficient estimation of word representations in vector space,” in International Conference on Learning Representations (ICLR), 2013.
5. Y. Bengio, R. Ducharme, P. Vincent, and C. Jauvin, “A neural probabilistic language model,” Journal of Machine Learning Research, vol. 3, pp. 1137–1155, 2003.
6. https://proglib.io/p/obzor-chetyreh-populyarnyh-nlp-modeley-2020-04-21.
7. https://towardsdatascience.com/a-no-frills-guide-to-most-natural-language-processing-models-part-1-the-pre-lstm-ice-age-86055dd5d67c