Salzo, S. "The Variable Metric Forward-Backward Splitting Algorithm under Mild Differentiability Assumptions." SIAM Journal on Optimization 27(4), 2017, pp. 2153-2181. ISSN 1052-6234. DOI: 10.1137/16M1073741
THE VARIABLE METRIC FORWARD-BACKWARD SPLITTING ALGORITHM UNDER MILD DIFFERENTIABILITY ASSUMPTIONS
Salzo, S.
2017
Abstract
We study the variable metric forward-backward splitting algorithm for convex minimization problems without the standard assumption of the Lipschitz continuity of the gradient. In this setting, we prove that, by requiring only mild assumptions on the smooth part of the objective function and using several types of line search procedures for determining either the gradient descent stepsizes or the relaxation parameters, one still obtains weak convergence of the iterates and convergence in the objective function values. Moreover, the o(1/k) convergence rate in the function values is obtained if slightly stronger differentiability assumptions are added. We also illustrate several applications including problems that involve Banach spaces and functions of divergence type.
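To make the setting concrete, the following is a minimal sketch of forward-backward splitting with a backtracking line search on the stepsize, in the spirit the abstract describes: no Lipschitz constant of the gradient is used, and the stepsize is shrunk until a sufficient-decrease test holds. This is a generic illustration (identity metric, classical quadratic-upper-bound test), not the paper's exact variable-metric scheme; all function names and parameters here are our own.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximity operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def forward_backward(f, grad_f, prox_g, x0, gamma0=1.0, beta=0.5,
                     max_iter=500, tol=1e-10):
    """Forward-backward splitting for min_x f(x) + g(x), where f is smooth
    and g has an easy proximity operator. Instead of fixing the stepsize
    from a global Lipschitz constant of grad f, each iteration backtracks
    gamma until the classical sufficient-decrease test holds:
        f(y) <= f(x) + <grad f(x), y - x> + ||y - x||^2 / (2 * gamma).
    """
    x = x0.astype(float).copy()
    gamma = gamma0
    for _ in range(max_iter):
        gx = grad_f(x)
        while True:
            y = prox_g(x - gamma * gx, gamma)   # forward then backward step
            d = y - x
            if f(y) <= f(x) + gx @ d + (d @ d) / (2.0 * gamma):
                break                           # test satisfied: accept y
            gamma *= beta                       # otherwise shrink stepsize
        if np.linalg.norm(d) <= tol:
            return y
        x = y
    return x

# Toy usage: min 0.5 * ||x - b||^2 + ||x||_1; the minimizer is
# soft_threshold(b, 1), reached here without supplying a Lipschitz constant.
b = np.array([3.0, -0.5, 1.0])
x_star = forward_backward(lambda x: 0.5 * np.sum((x - b) ** 2),
                          lambda x: x - b,
                          lambda z, t: soft_threshold(z, t),
                          np.zeros(3))
```

For the toy problem above the gradient happens to be 1-Lipschitz, so the line search accepts the initial stepsize; the point of the paper is that the same iteration, with a suitable line search, still converges when no such constant exists.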