
Block-recursive least squares technique for training multilayer perceptrons / DI CLAUDIO, Elio; Parisi, Raffaele; Orlandi, Gianni. - PRINT. - (1994), pp. 565-568. (Paper presented at the 1994 International Conference on Artificial Neural Networks (ICANN'94), held in Sorrento).

Block-recursive least squares technique for training multilayer perceptrons

DI CLAUDIO, Elio;PARISI, Raffaele;ORLANDI, Gianni
1994

Abstract

The multilayer perceptron is one of the most commonly used types of feedforward neural networks and appears in a large number of applications. Its strength resides in its ability to map arbitrarily complex nonlinear functions through a suitable number of layers of sigmoidal nonlinearities (Rumelhart et al., 1986). The backpropagation algorithm is still the most widely used learning algorithm; it minimizes the mean squared error (MSE) at the network output by gradient descent on the error surface in the space of weights. However, backpropagation suffers from a number of shortcomings, above all a relatively slow rate of convergence and a final misadjustment that cannot guarantee the success of the training procedure in real applications. The choice of a different learning algorithm, based on a different minimization criterion, can help to overcome these drawbacks (see Azimi-Sajadi et al., 1992, and Scalero et al., 1992, for examples of LS-based fast learning algorithms). The BRLS training algorithm described next yields considerable improvements in both numerical accuracy and speed of convergence.
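The BRLS algorithm itself is not reproduced in this record. As a rough illustration of the block-recursive least squares update that underlies LS-based training schemes of this kind, the sketch below applies the update to a plain linear model rather than the paper's per-layer MLP formulation; the function name and block size are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def block_rls_update(w, P, X, y):
    """One block-recursive least squares step (illustrative sketch).

    w: (d,)   current weight vector
    P: (d, d) inverse input-correlation matrix estimate
    X: (n, d) block of n input samples
    y: (n,)   block of n target outputs
    """
    # Gain matrix for the whole block of n samples at once
    K = P @ X.T @ np.linalg.inv(np.eye(len(y)) + X @ P @ X.T)
    w = w + K @ (y - X @ w)   # correct the weights by the block residual
    P = P - K @ X @ P         # update the inverse correlation estimate
    return w, P

# Toy linear problem standing in for one (linearized) layer
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0, 0.5])
w = np.zeros(3)
P = 1e3 * np.eye(3)           # large initial P = weak prior on the weights
for _ in range(20):
    X = rng.normal(size=(8, 3))   # process samples in blocks of 8
    y = X @ w_true
    w, P = block_rls_update(w, P, X, y)
```

Because each step absorbs a whole block of samples through one small matrix inversion, the estimate converges in far fewer iterations than sample-by-sample gradient descent on the same data, which is the kind of speed-up the abstract attributes to LS-based training.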
1994
1994 International Conference on Artificial Neural Networks (ICANN’94)
BACKPROPAGATION ALGORITHMS; FEEDFORWARD NEURAL NETWORKS; BLOCK RECURSIVE LEAST SQUARES ALGORITHM; PERTURBATION TECHNIQUES
04 Conference proceedings publication::04b Conference paper in volume


Use this identifier to cite or link to this document: https://hdl.handle.net/11573/390746
