
Fast learning algorithms for feedforward neural networks / Parisi, Raffaele; Di Claudio, Elio; Orlandi, Gianni. - PRINT. - (1995), pp. 58-78. (Paper presented at the VII Italian Workshop on Neural Nets - WIRN Vietri '95, held in Vietri sul Mare (SA), Italy, May 1995).

Fast learning algorithms for feedforward neural networks

PARISI, Raffaele; DI CLAUDIO, Elio; ORLANDI, Gianni
1995

Abstract

This paper presents a review of fast-learning algorithms for multilayer neural networks. Since the introduction of the back-propagation algorithm, several efforts have been made to improve the speed of convergence of learning. A general approach is to regard the training of a neural network as a nonlinear optimization problem; this makes available a number of techniques already tested and well known in other fields. Recently, some methods drawn from the signal processing field have been introduced; these solutions are closely connected to the optimization-theory point of view. In particular, we show the feasibility of Least Squares and Total Least Squares solutions to the learning problem; these approaches lead to fast and robust algorithms whose performance can be justified by recasting them in the optimization framework.
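As a rough illustration of the Least Squares and Total Least Squares viewpoint mentioned in the abstract, the sketch below fits the weight matrix of a single linear layer from its input patterns and target pre-activations. This is only a minimal, hypothetical example (names X, T, fit_ls, fit_tls are assumptions), not the recursive algorithm proposed in the paper: ordinary LS assumes errors only in the targets, while TLS also allows errors in the inputs and is obtained from the SVD of the stacked data matrix.

```python
# Minimal sketch (hypothetical, not the authors' algorithm): batch LS and TLS
# estimates of a single linear layer's weights.
import numpy as np

def fit_ls(X, T):
    """Ordinary Least Squares: minimize ||X W - T||_F, errors assumed only in T."""
    W, *_ = np.linalg.lstsq(X, T, rcond=None)
    return W

def fit_tls(X, T):
    """Total Least Squares: errors allowed in both X and T, via SVD of [X | T]."""
    n = X.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([X, T]), full_matrices=False)
    V = Vt.T
    V12, V22 = V[:n, n:], V[n:, n:]   # partition of the right singular vectors
    return -V12 @ np.linalg.inv(V22)  # classical TLS solution W = -V12 V22^{-1}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))                        # input patterns (rows)
    W_true = rng.standard_normal((5, 3))
    T = X @ W_true + 0.01 * rng.standard_normal((200, 3))    # noisy target pre-activations
    print(np.allclose(fit_ls(X, T), W_true, atol=0.1))
    print(np.allclose(fit_tls(X, T), W_true, atol=0.1))
```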
1995
VII Italian Workshop on Neural Nets - WIRN Vietri '95
FEEDFORWARD NEURAL NETWORKS; RECURSIVE LEAST SQUARES TECHNIQUE; PERTURBATION METHODS; BACKPROPAGATION
04 Publication in conference proceedings::04b Conference paper in volume

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/202633