
Global optimization issues in deep network regression: an overview / Palagi, Laura. - In: JOURNAL OF GLOBAL OPTIMIZATION. - ISSN 0925-5001. - 73:2(2019), pp. 239-277. [10.1007/s10898-018-0701-7]

Global optimization issues in deep network regression: an overview

Palagi, Laura
2019

Abstract

The paper presents an overview of global issues in optimization methods for training feedforward neural networks (FNN) in a regression setting. We first recall the learning optimization paradigm for FNN and briefly discuss global schemes for the joint choice of the network topology and the network parameters. The main part of the paper focuses on the core subproblem, namely the continuous unconstrained (regularized) weights optimization problem, with the aim of reviewing global methods specifically arising both in multi-layer perceptron/deep networks and in radial basis networks. We review some recent results on the existence of non-global stationary points of the unconstrained nonlinear problem and the role of determining a global solution in a supervised learning paradigm. Local algorithms widely used to solve the continuous unconstrained problems are addressed, with a focus on possible improvements to exploit global properties. Hybrid global methods specifically devised for FNN training optimization problems, which embed local algorithms, are also discussed.
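The abstract's core subproblem — unconstrained (regularized) weights optimization for an FNN regressor, tackled by hybrid schemes that embed a local solver — can be illustrated with a minimal sketch. This code is not from the paper: the network size, step length, and multistart strategy are illustrative assumptions, and plain gradient descent stands in for the local algorithms the paper surveys.

```python
import numpy as np

# Hypothetical sketch: regularized least-squares training of a one-hidden-layer
# FNN for regression, with a simple multistart scheme (random restarts around
# a local gradient-descent solver) to mitigate non-global stationary points.

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (100, 1))          # training inputs
y = np.sin(3 * X[:, 0])                   # regression targets

H, lam = 8, 1e-3                          # hidden units, regularization weight

def unpack(w):
    # flat weight vector -> layer parameters
    W1 = w[:H].reshape(H, 1); b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H]; b2 = w[3 * H]
    return W1, b1, W2, b2

def loss(w):
    # mean squared error plus an L2 (ridge) regularization term
    W1, b1, W2, b2 = unpack(w)
    out = np.tanh(X @ W1.T + b1) @ W2 + b2
    return 0.5 * np.mean((out - y) ** 2) + 0.5 * lam * np.dot(w, w)

def num_grad(w, eps=1e-6):
    # central finite-difference gradient, to keep the sketch dependency-free
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w); e[i] = eps
        g[i] = (loss(w + e) - loss(w - e)) / (2 * eps)
    return g

def local_descent(w, iters=300, step=0.1):
    # plain gradient descent: the "local algorithm" embedded in the scheme
    for _ in range(iters):
        w = w - step * num_grad(w)
    return w

# multistart: run the local solver from several random initializations and
# keep the best stationary point found
best = min((local_descent(rng.normal(0, 1, 3 * H + 1)) for _ in range(5)),
           key=loss)
print(f"best regularized loss: {loss(best):.4f}")
```

Swapping the multistart loop for a smarter restart rule, or the descent step for a quasi-Newton update, recovers the flavor of the hybrid global methods the paper reviews.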
Supervised learning; Deep networks; Feedforward neural networks; Global optimization; Weights optimization; Hybrid algorithms
01 Journal publication::01a Journal article
File attached to this record

Palagi_Global-Optimization_2019.pdf
Access: archive administrators only
Type: publisher's version (published with the publisher's layout)
License: all rights reserved
Size: 1.02 MB
Format: Adobe PDF
Contact the author for access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1180428
Citations
  • PMC: not available
  • Scopus: 12
  • Web of Science: 6