
Block-recursive least squares technique for training multilayer perceptrons / DI CLAUDIO, Elio; Parisi, Raffaele; Orlandi, Gianni. - PRINT. - III:(1994), pp. 528-532. (Paper presented at the 1994 World Conference on Neural Networks (WCNN'94), held in San Diego, USA).

Block-recursive least squares technique for training multilayer perceptrons

DI CLAUDIO, Elio; PARISI, Raffaele; ORLANDI, Gianni
1994

Abstract

A novel learning technique is described as a faster and more reliable alternative to the classical back-propagation method. The approach is based on the application of the least squares criterion to a linearized system at each step of the learning procedure. The squared error at the output of each layer, immediately before the nonlinearity, is minimized over the entire training set by a block-recursive least squares algorithm. The optimal weights (in the sense of the minimal 2-norm of the error) are computed for each layer using the QR decomposition. The high performance of the new algorithm has been verified in several experimental trials, yielding considerable improvements in both accuracy and speed of convergence.
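The core per-layer step described in the abstract — computing least-squares weights via the QR decomposition — can be sketched as follows. This is a minimal illustration, not the authors' block-recursive algorithm: the function name `layer_ls_weights` is invented here, the update is done in one batch rather than recursively over blocks, and obtaining pre-nonlinearity targets by inverting the activation on the desired outputs is an assumption about how the linearized system is set up.

```python
import numpy as np

def layer_ls_weights(X, T):
    """Least-squares weights W minimizing ||X W - T||_F via QR.

    X: (samples, inputs) layer inputs over the training set.
    T: (samples, outputs) desired outputs *before* the nonlinearity.
    Assumes X has full column rank, so the LS solution is unique
    (and hence trivially the minimal 2-norm one).
    """
    Q, R = np.linalg.qr(X, mode='reduced')   # X = Q R, Q orthonormal
    return np.linalg.solve(R, Q.T @ T)       # solve R W = Q^T T

# Toy usage: recover the weights of a single tanh layer.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))            # 100 samples, 5 inputs
W_true = 0.1 * rng.standard_normal((5, 2))   # small weights keep tanh invertible
Y = np.tanh(X @ W_true)                      # observed post-nonlinearity outputs
T = np.arctanh(np.clip(Y, -0.999, 0.999))    # invert the nonlinearity
W = layer_ls_weights(X, T)
print(np.allclose(W, W_true))                # → True
```

In a full block-recursive scheme the QR factorization would be updated incrementally as each block of training samples arrives, rather than recomputed from scratch as above.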
1994
1994 World Conference on Neural Networks (WCNN’94)
FEEDFORWARD NEURAL NETWORKS; RECURSIVE LEAST SQUARES TECHNIQUE; FAST LEARNING ALGORITHMS; PERTURBATION TECHNIQUES; BACKPROPAGATION ALGORITHMS
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this item
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/390752
Warning! The displayed data have not been validated by the university.
