
Proximal Gradient Methods for Machine Learning and Imaging / Salzo, Saverio; Villa, Silvia. - (2021), pp. 149-244. - APPLIED AND NUMERICAL HARMONIC ANALYSIS. [10.1007/978-3-030-86664-8_4].

Proximal Gradient Methods for Machine Learning and Imaging

Salzo, Saverio; Villa, Silvia
2021

Abstract

Convex optimization plays a key role in data science and image processing. Indeed, on the one hand it provides theoretical frameworks, such as duality theory and the theory of nonexpansive operators, which are indispensable for the formal analysis of many problems arising in those fields. On the other hand, convex optimization supplies a plethora of algorithmic solutions covering a broad range of applications. In particular, the last decades have witnessed an unprecedented development of optimization methods, which are now capable of addressing structured and large-scale problems effectively. An important class of such methods, at the core of modern nonlinear convex optimization, is that of proximal gradient splitting algorithms. They are first-order methods tailored to optimization problems with a composite structure given by the sum of smooth and nonsmooth terms. These methods are splitting algorithms in the sense that, along the iterations, they process each term separately, exploiting gradient information when available and the so-called proximity operator for nonsmooth terms. Even though there is a rich literature on proximal gradient algorithms, in this contribution we pay particular attention to presenting a self-contained and unifying analysis of the various algorithms, unveiling a common theoretical basis. We give state-of-the-art results treating the convergence of both the iterates and the objective function values in an infinite-dimensional setting. This work is based on the lecture notes written for the PhD course “Introduction to Convex Optimization” given by the authors at the University of Genoa during the last five years.
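To illustrate the splitting idea the abstract describes — a gradient step on the smooth term followed by a proximity-operator step on the nonsmooth term — here is a minimal sketch of the proximal gradient (forward–backward) iteration applied to the lasso problem, minimizing 0.5‖Ax − b‖² + λ‖x‖₁. This example is not taken from the chapter; the problem instance and the fixed step size 1/L are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by splitting:
    # a gradient step on the smooth term, then a prox step on the l1 term.
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth term.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)          # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)  # prox of the l1 term
    return x
```

The same template covers any composite objective: only the gradient of the smooth part and the proximity operator of the nonsmooth part change.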
2021
Harmonic and Applied Analysis. From Radon Transforms to Machine Learning.
978-3-030-86663-1
Convex optimization; proximal gradient method; machine learning
02 Publication in a volume::02a Chapter or Article
Files attached to this product
File | Size | Format
Salzo_Proximal Gradient_2021.pdf

Access: archive administrators only (contact the author)

Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 998.79 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1654454
Citations
  • PubMed Central: ND
  • Scopus: 3
  • Web of Science: ND