Hybrid random/deterministic parallel algorithms for convex and nonconvex Big Data optimization / Daneshmand, Amir; Facchinei, Francisco; Kungurtsev, Vyacheslav; Scutari, Gesualdo. - In: IEEE TRANSACTIONS ON SIGNAL PROCESSING. - ISSN 1053-587X. - Print. - 63:15(2015), pp. 3914-3929. [10.1109/TSP.2015.2436357]

Hybrid random/deterministic parallel algorithms for convex and nonconvex Big Data optimization

FACCHINEI, Francisco; 2015

Abstract

We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth (possibly nonseparable), convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each iteration, a subset of (block) variables is updated at the same time by minimizing a convex surrogate of the original nonconvex function. To tackle huge-scale problems, the (block) variables to be updated are chosen according to a mixed random and deterministic procedure, which captures the advantages of both pure deterministic and random update-based schemes. Almost sure convergence of the proposed scheme is established. Numerical results show that on huge-scale problems the proposed hybrid random/deterministic algorithm compares favorably to random and deterministic schemes on both convex and nonconvex problems.
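
A minimal Python sketch of the hybrid selection idea described in the abstract, applied to a LASSO-type instance (smooth least-squares term plus an l1 regularizer): at every iteration a random subset of coordinates is drawn, each sampled coordinate computes its best response from a convex proximal-linear surrogate, and only the sampled coordinates promising the largest progress are actually updated in parallel. The function name hybrid_block_lasso, its parameters (tau, gamma0, sample_frac, greedy_frac), the surrogate, and the step-size rule are illustrative assumptions for this sketch, not the algorithm or the convergence conditions analyzed in the paper.

import numpy as np

def soft_threshold(v, t):
    # Elementwise proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def hybrid_block_lasso(A, b, lam, tau=1.0, gamma0=0.9, sample_frac=0.1,
                       greedy_frac=0.5, iters=200, seed=0):
    # Illustrative hybrid random/greedy coordinate scheme for
    #   min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Parameter names, the proximal-linear surrogate, and the step-size
    # rule are assumptions of this sketch, not taken from the paper.
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = np.sum(A * A, axis=0)      # per-coordinate curvature estimates
    r = A @ x - b                       # residual Ax - b, kept up to date
    gamma = gamma0
    for _ in range(iters):
        # 1) Random pre-selection of a subset of (block) variables.
        S = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
        # 2) Best response of each sampled coordinate, from a convex
        #    (proximal-linear) surrogate of the smooth term.
        grad_S = A[:, S].T @ r
        hess_S = col_sq[S] + tau
        best = soft_threshold(x[S] - grad_S / hess_S, lam / hess_S)
        # 3) Deterministic (greedy) refinement: keep only the sampled
        #    coordinates promising the largest progress.
        k_keep = max(1, int(greedy_frac * len(S)))
        idx = np.argsort(np.abs(best - x[S]))[-k_keep:]
        keep, best_keep = S[idx], best[idx]
        # 4) Parallel update of the selected blocks with a diminishing step.
        step = gamma * (best_keep - x[keep])
        r += A[:, keep] @ step          # incremental residual update
        x[keep] += step
        gamma *= (1.0 - 1e-3 * gamma)   # slowly diminishing step size
    return x

Setting greedy_frac = 1.0 keeps every sampled coordinate (a purely random selection), while shrinking it lets the deterministic, greedy part of the rule dominate; this knob exposes the trade-off between the two selection strategies that the hybrid procedure combines.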
2015
Nonconvex problems; Parallel and distributed methods; Random selections; Jacobi method; Sparse solution;
01 Journal publication::01a Journal article
Files attached to this product

File: Daneshmand_Hybrid_2015.pdf (access restricted to archive administrators)
Note: https://ieeexplore.ieee.org/abstract/document/7113892/
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 4.19 MB
Format: Adobe PDF
Access: Contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/845067
Citations
  • PMC: not available
  • Scopus: 57
  • Web of Science (ISI): 51