Modern word embedding methods
Authors: Shelonik A.A., Koldobskiy V.I.
Journal: Теория и практика современной науки @modern-j
Section: Mathematics, computer science and engineering
Article in issue: 2 (20), 2017
Free access
Word embedding, or the vector representation of words, is a product of unsupervised machine learning and has recently become widely used for solving various natural language processing tasks. In this approach, each word is mapped to a vector of meaningful numerical parameters. Because these parameters convey the word's meaning, they can be used in place of the raw text. The resulting vectors serve as features for a range of applications, such as information retrieval, document classification, question answering, named entity recognition, and text parsing.
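The core idea above can be illustrated with a minimal sketch: words become numeric vectors, and semantic relatedness is measured geometrically, typically by cosine similarity. The three-dimensional vectors below are invented toy values for illustration only; real models such as word2vec or GloVe learn vectors of 100-300 dimensions from large corpora.

```python
import math

# Hypothetical toy embeddings (hand-picked values, NOT learned by any model).
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.75, 0.70, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should have vectors pointing in similar
# directions, so king/queen scores higher than king/apple here.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

Downstream tasks listed in the abstract (classification, retrieval, and so on) typically feed such vectors into a standard machine learning model instead of one-hot word indicators.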
Keywords: GloVe, word2vec, word embedding
Короткий адрес: https://sciup.org/140270827
IDR: 140270827