Total least squares approach for fast learning in multilayer neural networks / Parisi, Raffaele; Di Claudio, Elio; Orlandi, Gianni. - Print. - 1:(1995), pp. 474-477. (Paper presented at the IEEE 1995 International Symposium on Circuits and Systems (ISCAS'95), held in Seattle, WA, USA, April 29-May 3, 1995) [10.1109/ISCAS.1995.521553].
Total least squares approach for fast learning in multilayer neural networks
Parisi, Raffaele; Di Claudio, Elio; Orlandi, Gianni
1995
Abstract
Classical methods for training feedforward neural networks suffer from a number of shortcomings, most notably a slow rate of convergence and the occurrence of local minima. In this work a new learning algorithm is presented as a faster alternative to the Backpropagation method. The algorithm is based on the solution of a linearized system for each layer of the network, performed by a block Total Least Squares technique. Simulation results are reported showing the high convergence speed of the new algorithm and its high degree of accuracy.
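The abstract's core idea — solving a linearized system for each layer in the Total Least Squares sense rather than the ordinary least-squares sense — can be illustrated with the classical SVD-based TLS solution of an overdetermined system. The sketch below is not the authors' block algorithm; it is a minimal single-system example, assuming the standard formulation in which errors are allowed in both the coefficient matrix and the target vector.

```python
import numpy as np

def tls_solve(A, b):
    """Solve A w ~= b in the Total Least Squares sense.

    Unlike ordinary least squares, TLS tolerates errors in A as well
    as in b. The classical solution uses the SVD of the augmented
    matrix [A | b]: the TLS weight vector is read off from the right
    singular vector associated with the smallest singular value.
    """
    n = A.shape[1]
    C = np.hstack([A, b.reshape(-1, 1)])   # augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(C)            # rows of Vt are right singular vectors
    v = Vt[-1]                             # vector for the smallest singular value
    return -v[:n] / v[n]                   # TLS solution w

# Hypothetical usage: recover the weights of a noisy linear layer.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))           # layer inputs
w_true = np.array([1.0, -2.0, 0.5])        # "true" layer weights
b = A @ w_true + 0.01 * rng.standard_normal(50)  # noisy targets
w = tls_solve(A, b)
```

In a layer-by-layer training scheme such as the one the paper describes, a (block) version of this solve would replace the gradient step of Backpropagation for each layer's weight matrix.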