Block-recursive least squares technique for training multilayer perceptrons / DI CLAUDIO, Elio; Parisi, Raffaele; Orlandi, Gianni. - PRINT. - (1994), pp. 565-568. (Paper presented at the 1994 International Conference on Artificial Neural Networks (ICANN'94), held in Sorrento.)
Block-recursive least squares technique for training multilayer perceptrons
DI CLAUDIO, Elio; PARISI, Raffaele; ORLANDI, Gianni
1994
Abstract
The multilayer perceptron is one of the most commonly used types of feedforward neural networks and is employed in a large number of applications. Its strength resides in its capacity to map arbitrarily complex nonlinear functions through a convenient number of layers of sigmoidal nonlinearities (Rumelhart et al., 1986). The backpropagation algorithm is still the most widely used learning algorithm; it consists in the minimization of the mean squared error (MSE) at the network output, performed by means of a gradient descent on the error surface in the space of weights. The backpropagation algorithm suffers from a number of shortcomings, above all the relatively slow rate of convergence and the final misadjustment, which cannot guarantee the success of the training procedure in real applications. The choice of a different learning algorithm, based on a different minimization criterion, can help to overcome these drawbacks (see Azimi-Sajadi et al., 1992, and Scalero et al., 1992, for some examples of LS-based fast learning algorithms). The block-recursive least squares (BRLS) training algorithm described next yields considerable improvements in both numerical accuracy and speed of convergence.
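The abstract does not spell out the BRLS recursion, which is given in the full paper. Purely as an illustrative sketch of the general LS-based approach it alludes to (in the spirit of Scalero et al., 1992), the following Python fragment shows a per-sample recursive least squares update for the weights of one layer, where the layer's targets are mapped through the inverse sigmoid so that each layer solves a linear least squares problem. The function name, the per-sample (rather than block-recursive) form, and the initialization are assumptions made here for illustration, not the authors' algorithm.

    import numpy as np

    def rls_layer_step(W, P, x, d, lam=0.99):
        # One recursive least squares step for a single layer's weights.
        # (Illustrative sketch only; the paper's BRLS processes blocks.)
        # W  : (m, n) weight matrix; n includes a bias input
        # P  : (n, n) estimate of the inverse input-correlation matrix
        # x  : (n,) layer input, with a trailing 1 for the bias
        # d  : (m,) desired linear outputs, i.e. the layer targets passed
        #      through the inverse sigmoid so the problem is linear in W
        # lam: forgetting factor, 0 < lam <= 1
        Px = P @ x
        k = Px / (lam + x @ Px)          # RLS gain vector
        e = d - W @ x                    # a-priori error at the linear output
        W = W + np.outer(e, k)           # rank-one update of all m rows at once
        P = (P - np.outer(k, Px)) / lam  # update of the inverse correlation matrix
        return W, P

    # Typical (assumed) initialization: small random W and P = (1.0 / delta) * np.eye(n)
    # with a small delta such as 1e-2; rls_layer_step is then called once per sample.

Because the regressor x is shared by all neurons of the layer, a single P matrix serves every row of W, which is what makes LS-based trainers of this kind substantially faster per effective iteration than plain gradient descent on the MSE.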