Sufficient conditions of optimality for optimal control problems of logic-dynamic systems

Author: Maltugueva Nadezhda Stanislavovna

Journal: Программные системы: теория и приложения (Program Systems: Theory and Applications) @programmnye-sistemy

Article in issue: 1 (5), vol. 2, 2011.

Free access

This article deals with logic-dynamic systems, a special class of discrete-continuous control systems. The discrete component of such a system is an integer-valued function with a finite number of discontinuity points. An optimal control problem is formulated for this class of systems. It differs from the classical optimal control problem in that the right-hand sides of the differential equations and the cost functional depend on the discrete variables. In the works of A. S. Bortakovskii, sufficient conditions of optimality are proved for the Bellman function; the author shows that the corresponding theorem holds for an arbitrary Krotov function. The article also describes an approach to constructing computational procedures for this problem.
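For orientation only, here is a generic sketch of the kind of problem statement and Krotov-type sufficient condition the abstract refers to; the notation ($f$, $f^0$, $F$, $\varphi$, $R$, $G$) is illustrative and not taken from the article itself. Assume a system on $[t_0, t_1]$ with continuous state $x(t)$, an integer-valued component $v(t) \in V \subset \mathbb{Z}$ that is piecewise constant with finitely many switching points, and a control $u(t) \in U$:

\[
\dot{x}(t) = f\bigl(t, x(t), v(t), u(t)\bigr), \qquad x(t_0) = x_0,
\]
\[
J = F\bigl(x(t_1)\bigr) + \int_{t_0}^{t_1} f^0\bigl(t, x(t), v(t), u(t)\bigr)\,dt \;\to\; \min .
\]

A Krotov-type sufficient condition then reads: pick any smooth function $\varphi(t, x)$ and set

\[
R(t, x, v, u) = \varphi_x(t, x)\, f(t, x, v, u) + \varphi_t(t, x) - f^0(t, x, v, u),
\qquad
G(x) = F(x) + \varphi(t_1, x),
\]

(the constant term $-\varphi(t_0, x_0)$ is dropped since $x_0$ is fixed). Because $J = G\bigl(x(t_1)\bigr) - \varphi(t_0, x_0) - \int_{t_0}^{t_1} R\,dt$ along any admissible process, a process $(x^*, v^*, u^*)$ is optimal whenever

\[
R\bigl(t, x^*(t), v^*(t), u^*(t)\bigr) = \sup_{x,\, v,\, u} R(t, x, v, u) \ \text{for a.e. } t,
\qquad
G\bigl(x^*(t_1)\bigr) = \inf_{x} G(x).
\]

The point emphasized in the abstract is that such a verification works for any Krotov function $\varphi$, not only for the Bellman function, and that the freedom in choosing $\varphi$ can be exploited when building iterative (nonlocal improvement) procedures.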


Keywords: control systems, nonlocal improvement

Short address: https://sciup.org/14335898

IDR: 14335898

Scientific article