Gradient methods for optimization problems that allow for the existence of an inexact strongly convex model of the objective function
Authors: Agafonov A.D., Stonyakin F.S.
Journal: Proceedings of Moscow Institute of Physics and Technology (Trudy MIPT) @trudy-mipt
Section: Computer Science and Control
In issue: No. 3 (43), Vol. 11, 2019.
Free access
In this paper, some analogs of the Devolder-Glineur-Nesterov (δ, L, μ)-oracle are introduced for optimization problems. Various types of relative smoothness and relative strong convexity conditions on the objective function are distinguished. Examples of convex and strongly convex optimization problems admitting inexact models of this type are given; in particular, these include strongly convex composite optimization problems and the optimization problem arising in Y. Nesterov's electoral model. Adaptive and non-adaptive gradient methods for optimization problems that admit such an inexact model are studied. A linear rate of convergence is established for these methods, and it is shown that errors do not accumulate over iterations. Numerical experiments comparing the convergence rates of the adaptive and non-adaptive methods are presented. An approach to the problem of error accumulation in the fast gradient method is proposed, based on a special restart technique.
Gradient descent, fast gradient descent, restarts, function model, composite optimization
Short address: https://sciup.org/142223075
IDR: 142223075