An Alternating Augmented Lagrangian method for constrained nonconvex optimization / Galvan, G.; Lapucci, M.; Levato, T.; Sciandrone, M. - In: OPTIMIZATION METHODS & SOFTWARE. - ISSN 1055-6788. - 35:3 (2020), pp. 502-520. [doi: 10.1080/10556788.2019.1576177]

An Alternating Augmented Lagrangian method for constrained nonconvex optimization

Galvan G.; Sciandrone M.
2020

Abstract

We consider the problem of minimizing a smooth nonconvex function over a structured convex feasible set, that is, a set defined by two families of constraints that are easy to handle when considered separately. To exploit the structure of the problem, we derive an equivalent formulation by duplicating the variables and consider the augmented Lagrangian of the reformulated problem. Following the idea of the Alternating Direction Method of Multipliers (ADMM), we propose an algorithm in which a two-block decomposition method is embedded within an augmented Lagrangian framework. The distinguishing features of the proposed algorithm are the following: (1) computing the exact solution of a possibly nonconvex subproblem is not required; (2) the penalty parameter is updated once an approximate stationary point of the augmented Lagrangian is determined. Global convergence results are established under mild assumptions, without requiring convexity of the objective function. Although the primary aim of the paper is theoretical, we perform numerical experiments on a nonconvex problem arising in machine learning, and the results show the practical advantages of the proposed approach over classical ADMM.
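The variable-splitting scheme the abstract builds on can be illustrated with a minimal sketch of classical ADMM (not the paper's algorithm): to minimize f(x) over x in X ∩ Y, duplicate the variable (x = y, x ∈ X, y ∈ Y), form the augmented Lagrangian, and alternate a minimization in each block with a multiplier update. The toy objective, the box X, the ball Y, and the penalty value rho below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch of the classical ADMM splitting scheme described in the
# abstract, on an illustrative convex instance:
#   min f(x)  s.t.  x in X ∩ Y   ⇒   min f(x)  s.t.  x = y, x in X, y in Y
# Augmented Lagrangian: L(x, y, lam) = f(x) + lam^T(x - y) + (rho/2)||x - y||^2

def x_update(y, lam, rho):
    # First block: argmin_x L(x, y, lam) over x in X.
    # With the toy objective f(x) = 0.5||x - 2||^2 and X = box [0, 1]^n,
    # the unconstrained minimizer is closed form, then projected onto X.
    x = (2.0 + rho * y - lam) / (1.0 + rho)
    return np.clip(x, 0.0, 1.0)          # projection onto the box X

def y_update(x, lam, rho):
    # Second block: argmin_y L(x, y, lam) over y in Y reduces to the
    # projection of x + lam/rho onto Y (here, the unit Euclidean ball).
    v = x + lam / rho
    n = np.linalg.norm(v)
    return v if n <= 1.0 else v / n      # projection onto the ball Y

def admm(n=5, rho=1.0, iters=200):
    x = np.zeros(n); y = np.zeros(n); lam = np.zeros(n)
    for _ in range(iters):
        x = x_update(y, lam, rho)        # first block
        y = y_update(x, lam, rho)        # second block
        lam = lam + rho * (x - y)        # multiplier (dual) update
    return x, y

x, y = admm()
print(np.linalg.norm(x - y))  # primal residual x - y vanishes at the limit
```

The paper departs from this classical scheme in the two ways the abstract lists: the block subproblems (which may be nonconvex in general) are solved only approximately, and rho is increased only after an approximate stationary point of the augmented Lagrangian has been reached, rather than at every iteration.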
2020
49M27; 49M37; 65K05; 90C30; Alternating direction method of multipliers; augmented Lagrangian; decomposition; global convergence; nonconvex objective function; penalty method
01 Journal publication :: 01a Journal article
Files attached to this record

Galvan_An-Alternating-Augmented_2020.pdf
  Access: archive administrators only
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 2.11 MB
  Format: Adobe PDF
  Contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1625420
Citations
  • PubMed Central: ND
  • Scopus: 3
  • Web of Science: 3