
Training recurrent neural networks by the recursive least squares algorithm / Parisi, Raffaele; Di Claudio, Elio; Rapagnetta, A.; Orlandi, Gianni. - Print. - (1996), pp. 283-286. (Paper presented at the World Congress on Neural Networks, WCNN'96, held in San Diego, CA, USA, September 15-18, 1996).

Training recurrent neural networks by the recursive least squares algorithm

Parisi, Raffaele; Di Claudio, Elio; Orlandi, Gianni
1996

Abstract

In this work a novel approach to the training of recurrent neural networks is presented. The algorithm exploits the separability of each neuron into its linear and nonlinear parts. Each learning iteration consists of two steps: first, descent of the error functional is performed in the space of the linear outputs of the neurons (descent in the neuron space); then the weights are updated by solving a linear system with a recursive least squares (RLS) technique. The main properties of the new approach are high speed of convergence, favorable numerical conditioning and robustness. Numerical stability is assured by the use of robust LS linear system solvers operating directly on the data.
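The two-step iteration described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration for a single tanh neuron, not the authors' actual method: step 1 approximates the descent in the neuron space by moving the linear outputs against the error gradient, and step 2 refits the weights to those corrected linear outputs with a standard RLS recursion. All function names, the learning rate `eta`, and the forgetting factor `lam` are assumptions for the sketch.

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One recursive-least-squares step: update the weight vector w and the
    inverse correlation matrix P so that w @ x tracks the desired linear
    output d, with forgetting factor lam."""
    Px = P @ x
    k = Px / (lam + x @ Px)            # RLS gain vector
    e = d - w @ x                      # a priori error on the linear output
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam
    return w, P

def train_neuron(X, t, epochs=20, eta=0.5):
    """Hypothetical two-step iteration for a single tanh neuron:
    1) gradient step on the squared error in the space of the linear
       outputs s (descent in the neuron space),
    2) RLS fit of the weights to the corrected linear outputs."""
    n, m = X.shape
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=m)
    P = np.eye(m) * 100.0              # large initial inverse correlation
    for _ in range(epochs):
        s = X @ w                      # linear outputs of the neuron
        y = np.tanh(s)                 # nonlinear outputs
        # step 1: move s against the error gradient d/ds (y - t)^2 / 2
        s_target = s - eta * (y - t) * (1.0 - y**2)
        # step 2: solve for weights matching the corrected linear outputs
        for x_i, d_i in zip(X, s_target):
            w, P = rls_update(w, P, x_i, d_i)
    return w
```

Because step 2 solves a linear system rather than back-propagating through the nonlinearity, each weight update benefits from the conditioning and stability of the least-squares solver, which is the property the abstract emphasizes.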
ISBN: 0805826084
Files in this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/242745
Warning! The displayed data have not been validated by the university.
