Salzo, Saverio; Villa, Silvia (2021). Proximal Gradient Methods for Machine Learning and Imaging. Applied and Numerical Harmonic Analysis, pp. 149–244. doi:10.1007/978-3-030-86664-8_4
Proximal Gradient Methods for Machine Learning and Imaging
Salzo, Saverio; Villa, Silvia
2021
Abstract
Convex optimization plays a key role in data science and image processing. On the one hand, it provides theoretical frameworks, such as duality theory and the theory of nonexpansive operators, which are indispensable for formally analyzing many problems arising in those fields. On the other hand, it supplies a plethora of algorithmic solutions covering a broad range of applications. In particular, the last decades have witnessed an unprecedented development of optimization methods, which are now capable of addressing structured and large-scale problems effectively. An important class of such methods, at the core of modern nonlinear convex optimization, is that of proximal gradient splitting algorithms. These are first-order methods tailored to optimization problems with a composite structure given by the sum of smooth and nonsmooth terms. They are splitting algorithms in the sense that, along the iterations, they process each term separately, exploiting gradient information when available and the so-called proximity operator for the nonsmooth terms. Even though there is a rich literature on proximal gradient algorithms, in this contribution we pay particular attention to presenting a self-contained and unifying analysis of the various algorithms, unveiling a common theoretical basis. We give state-of-the-art results treating the convergence of both the iterates and the objective function values in an infinite-dimensional setting. This work is based on the lecture notes written for the PhD course “Introduction to Convex Optimization” given by the authors at the University of Genoa over the last five years.
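To make the scheme concrete: for minimizing f + g, with f smooth and g nonsmooth, the prototypical proximal gradient iteration is x_{k+1} = prox_{γg}(x_k − γ∇f(x_k)). The following minimal Python sketch (illustrative only, not taken from the chapter; all function names and parameter choices are our own assumptions) instantiates it for ℓ1-regularized least squares, where the proximity operator of the scaled ℓ1 norm is componentwise soft-thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via the iteration
    #   x_{k+1} = prox_{gamma*g}(x_k - gamma * grad_f(x_k)),
    # with constant step size gamma = 1/L, where L = ||A||_2^2 is the
    # Lipschitz constant of the gradient of the smooth term.
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)  # gradient of the smooth term f
        x = soft_threshold(x - gamma * grad, gamma * lam)
    return x
```

Accelerated variants of the same splitting pattern replace x_k in the gradient step with an extrapolated point; the separate treatment of the smooth and nonsmooth terms is unchanged.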
File | Type | License | Size | Format | Access
---|---|---|---|---|---
Salzo_Proximal Gradient_2021.pdf | Publisher's version (published layout) | All rights reserved | 998.79 kB | Adobe PDF | Archive managers only (contact the author)