Modern word embedding methods

Free access

Word embedding, or the vector representation of words, is derived from unsupervised machine learning and has recently become widely used for solving various natural language processing tasks. In this approach, each word corresponds to a particular set of meaningful numerical parameters. Because these parameters convey the meaning of a word, they can be used in place of the raw text. The resulting vectors can serve as features for many applications, such as information retrieval, document classification, question answering, named entity recognition, and text parsing.
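The idea described above (each word mapped to a numerical vector whose geometry reflects meaning) can be sketched with a deliberately simple count-based model. This is an illustrative toy, not word2vec or GloVe: it represents each word by its co-occurrence counts with the rest of the vocabulary and compares words by cosine similarity.

```python
import math

def cooccurrence_vectors(sentences, window=2):
    """Map each word to a vector of co-occurrence counts with every
    vocabulary word (a toy count-based embedding, not word2vec/GloVe)."""
    vocab = sorted({w for s in sentences for w in s})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = {w: [0.0] * len(vocab) for w in vocab}
    for sent in sentences:
        for i, w in enumerate(sent):
            # Count every word within `window` positions of w.
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    vectors[w][index[sent[j]]] += 1.0
    return vectors

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]
vecs = cooccurrence_vectors(sentences)
# Words used in similar contexts ("cat" and "dog") get similar vectors,
# so they can stand in for the text itself as numerical features.
print(cosine(vecs["cat"], vecs["dog"]))
```

Methods such as word2vec and GloVe replace these raw counts with dense, low-dimensional vectors learned from large corpora, but the principle is the same: words that appear in similar contexts receive similar vectors.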

GloVe, word2vec, word embedding

Short address: https://sciup.org/140270827

IDR: 140270827

Scientific article