On applicability of recurrent neural networks to language modelling for inflective languages
Author: Kudinov Mikhail S.
Journal: Journal of Siberian Federal University. Engineering & Technologies
Issue: vol. 9, no. 8, 2016.
Free access
The standard recurrent neural network language model (RNNLM) has shown modest results in language modelling of Russian. In this paper we present a modification of the RNNLM that makes separate predictions of lemmas and morphology. The new model shows superior results compared to a Kneser-Ney language model both in perplexity and in a ranking experiment. At the same time, morphology integration has not shown any improvement.
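The abstract mentions separate predictions of lemmas and morphology. A minimal sketch of one way such a factored output layer could work, assuming two independent softmax heads over a shared recurrent hidden state (the dimensions, weight names, and the independence assumption are illustrative, not taken from the paper):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
hidden_dim, n_lemmas, n_tags = 8, 5, 3

# Hypothetical output heads: one over lemmas, one over morphological tags.
W_lemma = rng.standard_normal((n_lemmas, hidden_dim))
W_tag = rng.standard_normal((n_tags, hidden_dim))

h = rng.standard_normal(hidden_dim)  # RNN hidden state at the current step

p_lemma = softmax(W_lemma @ h)
p_tag = softmax(W_tag @ h)

# Under the (assumed) independence of the two heads, the probability of a
# word form is the product p(lemma) * p(tag); the joint table sums to 1.
p_word = np.outer(p_lemma, p_tag)
print(np.isclose(p_word.sum(), 1.0))
```

In practice the morphology head would typically be conditioned on the predicted lemma as well; the fully independent factorization above is only the simplest variant.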
Language models, recurrent neural network, inflected languages, speech recognition
Short address: https://sciup.org/146115162
IDR: 146115162 | DOI: 10.17516/1999-494X-2016-9-8-1291-1301