Tendencies of Development Science and Practice, 330
Algorithms for interpreting word vectors



METHODS AND APPROACHES 
Overview of four popular NLP models: 


Graph 1. Timeline highlighting the four described models
1. Recurrent Neural Network Language Model (RNNLM)
An idea proposed in 2001 gave rise to one of the first embedding models.
Embeddings 
In language modeling, individual words and groups of words are mapped to 
vectors, numerical representations that preserve semantic relationships. 
A compressed vector representation of a word is called an embedding. 
The model takes vector representations of the n previous words as input and can 
"understand" the semantics of a sentence. Training is based on the continuous 
bag-of-words algorithm: the contextual (neighboring) words are fed into the 
neural network, which predicts the central word. 
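The continuous bag-of-words idea described above can be sketched in plain Python. The toy vocabulary, embedding size, and random weights below are illustrative assumptions, not values from the paper; a real model would learn the embedding tables by gradient descent:

```python
import math
import random

random.seed(0)

# Toy vocabulary and embedding size (illustrative, not from the paper).
vocab = ["the", "cat", "sat", "on", "mat"]
dim = 4

# Input and output embedding tables, randomly initialised.
emb_in = {w: [random.uniform(-0.5, 0.5) for _ in range(dim)] for w in vocab}
emb_out = {w: [random.uniform(-0.5, 0.5) for _ in range(dim)] for w in vocab}

def cbow_predict(context):
    """Average the context-word embeddings, score every vocabulary word
    against the average, and normalise the scores with softmax."""
    avg = [sum(emb_in[w][i] for w in context) / len(context) for i in range(dim)]
    scores = {w: sum(a * b for a, b in zip(avg, emb_out[w])) for w in vocab}
    z = sum(math.exp(s) for s in scores.values())
    return {w: math.exp(s) / z for w, s in scores.items()}

# Predict the central word from its neighbours ("the ___ sat").
probs = cbow_predict(["the", "sat"])
```

The result is a probability distribution over the whole vocabulary; training would push probability mass toward the true central word.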
Bag of words 
A bag of words is a model that represents a text as a vector (a set of words): 
each word in the text is mapped to the number of its occurrences. 
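A minimal illustration of this representation, using the standard-library `Counter` (the sample sentence is an assumption for demonstration):

```python
from collections import Counter

def bag_of_words(text):
    """Represent a text as word -> occurrence count, discarding word order."""
    return Counter(text.lower().split())

vec = bag_of_words("The cat sat on the mat")
# "the" occurs twice; every other word occurs once
```

Note that word order is lost: "cat sat on mat" and "mat sat on cat" yield the same vector, which is exactly what the richer models in this overview try to improve on.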
The compressed vectors are combined and passed to the hidden layer, where a 
softmax activation function is applied to determine which signals pass further 
(if this topic is difficult, read our Illustrative Introduction to Neural Networks). 
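The softmax step mentioned above can be written in a few lines of plain Python; the input scores here are arbitrary example values:

```python
import math

def softmax(signals):
    """Turn raw scores into probabilities that sum to 1;
    larger scores receive larger probabilities ("pass further")."""
    m = max(signals)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in signals]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
```

Subtracting the maximum before exponentiating does not change the result but prevents overflow for large scores.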
One of the tasks of language modeling is to predict the next word based on 
knowledge of the previous text. This is useful for correcting typos, 
auto-completion, chatbots, etc. There is a lot of scattered information on the 
Internet about natural language processing models. We have collected four 
popular NLP models in one place and compared them based on documentation and 
scientific sources. 
The original version was based on feed-forward neural networks: the signal 
went strictly from the input layer to the output layer. Later, an alternative 
based on recurrent neural networks (RNNs) was proposed, specifically on the 
"vanilla" RNN rather than on gated recurrent units (GRUs) or long short-term 
memory (LSTM) networks. 


Recurrent Neural Networks (RNN) 
Neural networks with directed connections between elements: the output of a 
neuron can be fed back into its input. Such a structure gives the network a 
kind of "memory" and lets it process data sequences, for example natural 
language texts. 
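A single step of such a "vanilla" RNN can be sketched as follows. The tiny weight matrices and the two-element input sequence are illustrative assumptions; the key point is that the hidden state `h` is fed back into the next step, which is the network's "memory":

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One vanilla RNN step: h_new = tanh(W_xh @ x + W_hh @ h + b)."""
    size = len(h)
    return [
        math.tanh(
            sum(W_xh[i][j] * x[j] for j in range(len(x)))
            + sum(W_hh[i][j] * h[j] for j in range(size))
            + b[i]
        )
        for i in range(size)
    ]

# Tiny illustrative weights: 2-dim input, 2-dim hidden state.
W_xh = [[0.5, -0.2], [0.1, 0.3]]
W_hh = [[0.4, 0.0], [-0.1, 0.2]]
b = [0.0, 0.0]

h = [0.0, 0.0]                          # initial (empty) memory
for x in [[1.0, 0.0], [0.0, 1.0]]:      # a sequence of two input vectors
    h = rnn_step(x, h, W_xh, W_hh, b)   # previous state influences the next
```

After the loop, `h` depends on both inputs in order, unlike a feed-forward network, which would see each input in isolation.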
