Information entropy as a cause of converging syntactic structures in typologically different languages (based on Russian and English data)

Free access

The paper deals with information entropy as a systemic property of natural language, and of its syntax in particular, where it can cause the convergence of surface syntactic structures. A comparison of Russian and English data shows that the amount of entropy varies across languages. A method of calculating the entropy of a natural language is proposed, as well as means of circumventing it in a rigorous analysis of syntactic structures.
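The abstract does not spell out the authors' proposed calculation, so as a rough illustration only, the sketch below estimates the standard Shannon entropy H = -Σ p(x) log₂ p(x) from character frequencies in a text sample; the function name shannon_entropy and the sample strings are hypothetical and are not taken from the paper.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Estimate Shannon entropy (bits per character) from the
    empirical character distribution of a text sample.
    Illustrative only; not the paper's proposed method."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical comparison of small Russian and English samples
print(shannon_entropy("the cat sat on the mat"))
print(shannon_entropy("кот сидел на коврике"))
```

On realistic corpora such estimates would be computed over much larger samples (and often over words or n-grams rather than characters), since short strings badly underestimate a language's entropy.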

Entropy, convergence, syntax, semantics, content, form

Short address: https://sciup.org/14969306

IDR: 14969306

Brief communication