Neural ordinary differential equations and their probabilistic extension

Author: Margasov A.O.

Journal: Известия Коми научного центра УрО РАН

Published in issue: 6 (52), 2021.

Free access

This paper describes the transition from a neural network architecture to ordinary differential equations and an initial value problem. Two neural network architectures are compared: the classical RNN and ODE-RNN, which uses neural ordinary differential equations. The paper proposes a new architecture, p-ODE-RNN, which achieves quality comparable to ODE-RNN but is trained much faster. Furthermore, the derivation of the proposed architecture in terms of random process theory is discussed.
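To make the ODE-RNN idea concrete, the sketch below shows the generic hidden-state update used by such models: between observations the hidden state evolves according to a learned vector field (an initial value problem), and at each observation a standard RNN cell is applied. This is a minimal illustration with a fixed-step Euler solver and toy placeholder networks, not the exact formulation or the p-ODE-RNN architecture proposed in the paper.

```python
import numpy as np

def ode_rnn_step(h, f, delta_t, x, rnn_cell, n_euler=10):
    """One ODE-RNN update: evolve the hidden state h under the learned
    dynamics f over the time gap delta_t, then apply an RNN cell at the
    new observation x."""
    # Solve the initial value problem dh/dt = f(h) with fixed-step Euler
    # (any ODE solver could be substituted here).
    dt = delta_t / n_euler
    for _ in range(n_euler):
        h = h + dt * f(h)
    # Standard RNN update at the observation time.
    return rnn_cell(h, x)

# Toy dynamics and RNN cell standing in for trained networks.
rng = np.random.default_rng(0)
W_f = 0.1 * rng.standard_normal((4, 4))
W_h = 0.1 * rng.standard_normal((4, 4))
W_x = 0.1 * rng.standard_normal((4, 3))
f = lambda h: np.tanh(W_f @ h)
rnn_cell = lambda h, x: np.tanh(W_h @ h + W_x @ x)

h = np.zeros(4)
h = ode_rnn_step(h, f, delta_t=0.5, x=rng.standard_normal(3), rnn_cell=rnn_cell)
```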

Ordinary differential equations, neural networks, stochastic processes, probability distributions

Short address: https://sciup.org/149139082

IDR: 149139082   |   DOI: 10.19110/1994-5655-2021-6-14-19

Scientific article