Elements of the statistical learning concept for a neural network and accurate prediction of its operation

Authors: Malychina G.F., Merkusheva A.V.

Journal: Научное приборостроение (Nauchnoe Priborostroenie)

Section: Reviews

Article in issue: No. 1, Vol. 15, 2005.

Free access

The learning of neural networks (NN) for many problems (pattern recognition, nonlinear multi-parameter regression, probability distribution identification) is considered in a generalized form on the basis of a concept that includes a probabilistic interpretation of the NN input-output transfer function and basic notions with a mathematically formalized foundation: the diversity (set) of mappings realized by the NN (and the set of loss functions isomorphic to it); characteristics of that diversity based on entropy and the Vapnik-Chervonenkis dimension; the risk functional (RF) and a condition allowing the RF to be approximated by an empirical risk functional (ERF); and bounds on the departure of the actual RF from the ERF. The elements of the statistical learning theory described here provide prediction and correction ("control") of the NN performance index after learning, i.e., at the stage of testing the NN on data that did not participate in learning.
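For reference, the bound on the departure of the actual RF from the ERF mentioned in the abstract is conventionally stated via the Vapnik-Chervonenkis dimension; the sketch below gives a standard form of such a bound, with notation assumed here for illustration rather than taken from the article itself.

% Standard VC-type bound (illustrative, not quoted from the article):
% with probability at least $1-\eta$, uniformly over the class of mappings $f$,
\[
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\;
\sqrt{\frac{h\!\left(\ln\frac{2N}{h}+1\right)-\ln\frac{\eta}{4}}{N}},
\]
where $R(f)$ is the risk functional (RF), $R_{\mathrm{emp}}(f)$ is the empirical risk functional (ERF) computed on $N$ learning examples, and $h$ is the Vapnik-Chervonenkis dimension of the mapping class.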


Short address: https://sciup.org/14264368

IDR: 14264368

Review article