
Variance Reduction Techniques for Stochastic Proximal Point Algorithms / Traore, Cheik; Apidopoulos, Vassilis; Salzo, Saverio; Villa, Silvia. - In: JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS. - ISSN 1573-2878. - (2024). [10.1007/s10957-024-02502-6]

Variance Reduction Techniques for Stochastic Proximal Point Algorithms

Saverio Salzo;
2024

Abstract

In the context of finite-sum minimization, variance reduction techniques are widely used to improve the performance of state-of-the-art stochastic gradient methods. Their practical impact is clear, as are their theoretical properties. Stochastic proximal point algorithms have been studied as an alternative to stochastic gradient algorithms since they are more stable with respect to the choice of the step size. However, their variance-reduced versions are not as well studied as the gradient ones. In this work, we propose the first unified study of variance reduction techniques for stochastic proximal point algorithms. We introduce a generic stochastic proximal-based algorithm that can be specified to give the proximal version of SVRG, SAGA, and some of their variants. For this algorithm, in the smooth setting, we provide several convergence rates for the iterates and the objective function values, which are faster than those of the vanilla stochastic proximal point algorithm. More specifically, for convex functions, we prove a sublinear convergence rate of O(1/k). In addition, under the Polyak-Łojasiewicz condition, we obtain linear convergence rates. Finally, our numerical experiments demonstrate the advantages of the proximal variance reduction methods over their gradient counterparts in terms of the stability with respect to the choice of the step size in most cases, especially for difficult problems.
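To illustrate the flavor of the methods the abstract describes, below is a minimal, hedged sketch of one plausible SVRG-style variance-reduced stochastic proximal point iteration on a least-squares finite sum, where the prox of each summand has a closed form. This is an illustration only, not the paper's generic algorithm or its exact update; all function names, step sizes, and the specific correction term are assumptions made for this sketch.

```python
import numpy as np

# Illustrative sketch (NOT the paper's exact scheme): an SVRG-style
# variance-reduced stochastic proximal point iteration for the finite sum
# f(x) = (1/n) * sum_i 0.5 * (a_i @ x - b_i)**2 (least squares), for which
# the prox of each summand f_i has a closed form.

def prox_fi(v, a, b_i, gamma):
    # Closed-form prox of gamma * 0.5 * (a @ x - b_i)**2 at the point v.
    t = (a @ v - b_i) / (1.0 + gamma * (a @ a))
    return v - gamma * t * a

def objective(x, A, b):
    return 0.5 * np.mean((A @ x - b) ** 2)

def vr_prox_point(A, b, gamma=0.05, epochs=30, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        x_ref = x.copy()                        # SVRG snapshot point
        full_grad = A.T @ (A @ x_ref - b) / n   # full gradient at snapshot
        for _ in range(n):
            i = rng.integers(n)
            # Variance-reduction shift: grad of f_i at the snapshot
            # minus the full gradient at the snapshot (mean-zero over i).
            e = A[i] * (A[i] @ x_ref - b[i]) - full_grad
            # Proximal (implicit) step on the shifted point.
            x = prox_fi(x - gamma * e, A[i], b[i], gamma)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
b = A @ rng.standard_normal(5)      # consistent system, so the optimum is 0
x0_obj = objective(np.zeros(5), A, b)
x_hat = vr_prox_point(A, b)
print("final objective:", objective(x_hat, A, b), "initial:", x0_obj)
```

Because the per-sample step is taken implicitly through the prox, the iteration tolerates a wider range of step sizes than the corresponding explicit gradient update, which is the stability property the abstract highlights.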
Stochastic optimization; Proximal point algorithm; Variance reduction techniques; SVRG; SAGA
01 Journal publication::01a Journal article
Files attached to this item

Traore_Variance_2024.pdf

Open access

Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 1.41 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1722251