Information entropy as a cause of converging syntactic structures in typologically different languages (based on Russian and English data)
Author: Amatov Alexander Mikhailovich
Journal: Вестник Волгоградского государственного университета. Серия 2: Языкознание (Science Journal of Volgograd State University. Linguistics) @jvolsu-linguistics
Section: Intercultural communication and comparative study of languages
Issue: 2 (8), 2008
Free access
The paper deals with information entropy as a systemic property that underlies natural language, and its syntax in particular, where it can cause the convergence of surface syntactic structures. It is shown that the amount of entropy varies across languages, as follows from a comparison of Russian and English data. A method of calculating the entropy of a natural language is proposed, as well as means of bypassing it in a rigorous analysis of syntactic structures.
Keywords: entropy, convergence, syntax, semantics, content, form
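The paper's own method of calculating language entropy is not reproduced in this record; as a minimal illustration of the underlying notion, a first-order (character-level) Shannon entropy estimate over a text sample can be sketched as follows (the function name and the toy samples are hypothetical, and reliable per-language estimates require large corpora):

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Per-character Shannon entropy of a text sample, in bits."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Toy comparison of two samples; a real cross-language comparison
# would use corpora of comparable size and genre.
en = "the cat sat on the mat"
ru = "кот сидел на ковре"
print(f"English sample: {shannon_entropy(en):.3f} bits/char")
print(f"Russian sample: {shannon_entropy(ru):.3f} bits/char")
```

Higher-order models (word-level or n-gram entropy) give tighter estimates, but the character-level version shows the basic frequency-based calculation.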
Short address: https://sciup.org/14969306
IDR: 14969306