Block-recursive least squares technique for training multilayer perceptrons / Di Claudio, Elio; Parisi, Raffaele; Orlandi, Gianni. - Print. - III:(1994), pp. 528-532. (Paper presented at the 1994 World Conference on Neural Networks (WCNN'94), held in San Diego, USA.)
Block-recursive least squares technique for training multilayer perceptrons
Di Claudio, Elio; Parisi, Raffaele; Orlandi, Gianni
1994
Abstract
A novel learning technique is described as a faster and more reliable alternative to the classical back-propagation method. The approach is based on the application of the least-squares criterion to a linearized system at each step of the learning procedure. The squared error at the output of each layer, measured immediately before the nonlinearity, is minimized over the entire training set by a block-recursive least squares algorithm. The optimal weights (in the sense of the minimal 2-norm of the error) are computed for each layer using the QR decomposition. The high performance of the new algorithm has been verified in several experimental trials, yielding considerable improvements in both accuracy and speed of convergence.
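The core step the abstract describes — fitting a layer's weights by least squares on the error before the nonlinearity, solved with a QR decomposition — can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function name `fit_layer_weights`, the use of tanh units, and the noiseless single-layer setup are assumptions for the example.

```python
import numpy as np

def fit_layer_weights(X, T):
    """Least-squares weights minimizing ||X @ W - T||_F via QR decomposition.

    X : (patterns x inputs) matrix of layer inputs over the training set.
    T : (patterns x units) targets for the layer output BEFORE the
        nonlinearity, i.e. the inverse activation of the desired outputs.
    Assumes X has full column rank, so W = R^{-1} Q^T T is the unique
    (hence minimum 2-norm) least-squares solution.
    """
    Q, R = np.linalg.qr(X)              # economy-size QR of the input block
    return np.linalg.solve(R, Q.T @ T)  # back-substitution on the triangular factor

# Toy example: 8 training patterns, 3 inputs, 2 tanh output units.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
W_true = rng.standard_normal((3, 2))
Y = np.tanh(X @ W_true)   # desired outputs after the nonlinearity
T = np.arctanh(Y)         # pre-nonlinearity targets (inverse activation)
W = fit_layer_weights(X, T)
print(np.allclose(W, W_true))  # True: exact recovery in this noiseless case
```

Because the error is formed before the nonlinearity, each layer's problem is linear in its weights, which is what makes a direct (block-recursive) least-squares solution possible instead of gradient descent.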