Flexible selective parallel algorithms for big data optimization

FACCHINEI, Francisco
2014

Abstract

We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth, separable, convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each iteration, a subset of (block) variables is updated at the same time by minimizing local convex approximations of the original nonconvex function. To tackle huge-scale problems, the (block) variables to be updated are chosen according to a mixed random and deterministic procedure, which captures the advantages of both purely deterministic and purely random update-based schemes. Almost sure convergence of the proposed scheme is established. Numerical results on huge-scale problems show that the proposed algorithm outperforms current schemes.
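To make the described scheme concrete, below is a minimal Python sketch of the kind of hybrid greedy/random parallel block update the abstract outlines, specialized to a LASSO-type instance (least-squares loss plus an l1 term) with scalar blocks. It is an illustration under stated assumptions, not the algorithm from the paper: the function and parameter names (hybrid_block_update, tau, step, n_greedy, n_random) and the specific greedy selection rule are assumptions made here for exposition.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def hybrid_block_update(A, b, lam, n_iters=100, tau=1.0, step=0.9,
                        n_greedy=50, n_random=50, rng=None):
    # Hypothetical sketch of a hybrid greedy/random parallel block scheme for
    #     min_x 0.5*||A x - b||^2 + lam*||x||_1        (scalar blocks)
    # Names, defaults, and the selection rule are illustrative assumptions,
    # not the paper's notation.
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = np.sum(A * A, axis=0)            # per-coordinate curvature proxy
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        # Best response of every block to its strongly convex local surrogate:
        #   min_y grad_i*(y - x_i) + 0.5*(col_sq_i + tau)*(y - x_i)^2 + lam*|y|
        c = col_sq + tau
        x_hat = soft_threshold(x - grad / c, lam / c)
        err = np.abs(x_hat - x)               # blockwise distance to a fixed point
        # Deterministic part: blocks with the largest expected progress.
        greedy = np.argpartition(err, -n_greedy)[-n_greedy:]
        # Random part: keeps every block reachable at each iteration.
        random_picks = rng.choice(n, size=n_random, replace=False)
        S = np.union1d(greedy, random_picks)  # hybrid working set
        x[S] += step * (x_hat[S] - x[S])      # damped parallel update on S only
    return x

A toy call might look like x = hybrid_block_update(A, b, lam=0.1) with a random A of shape (m, n) and n_greedy + n_random well below n. The design intuition, as suggested by the abstract, is that the random picks keep every block selectable (the kind of property almost-sure convergence arguments for such hybrid schemes typically lean on), while the greedy picks concentrate work on the blocks expected to make the most progress.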
48th Asilomar Conference on Signals, Systems and Computers
Signal Processing; Computer Networks and Communications
04 Publication in conference proceedings::04b Conference paper in volume
Flexible selective parallel algorithms for big data optimization / Daneshmand, Amir; Facchinei, Francisco; Kungurtsev, Vyacheslav; Scutari, Gesualdo. - PRINT. - (2014), pp. 3-7. (Paper presented at the 48th Asilomar Conference on Signals, Systems and Computers, held in Pacific Grove, United States, 02-05 November 2014) [10.1109/ACSSC.2014.7094384].
Files attached to this product

Daneshmand_Flexible_2014.pdf
  Access: archive administrators only (contact the author)
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 311.74 kB
  Format: Adobe PDF

VE_2014_11573-951031.pdf
  Access: archive administrators only (contact the author)
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 312.78 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/951031
Citations
  • PMC: ND
  • Scopus: 5
  • ISI (Web of Science): 5