FAST NEURAL NETWORKS WITHOUT MULTIPLIERS / Marchesi, Michele; Orlandi, Gianni; Piazza, Francesco; Uncini, Aurelio. - In: IEEE TRANSACTIONS ON NEURAL NETWORKS. - ISSN 1045-9227. - 4:1(1993), pp. 53-62. [10.1109/72.182695]
FAST NEURAL NETWORKS WITHOUT MULTIPLIERS
ORLANDI, Gianni; UNCINI, Aurelio
1993
Abstract
The paper introduces multilayer perceptrons with weight values restricted to powers of two or sums of powers of two. In a digital implementation, these neural networks need no multipliers, only shift registers, when computing in forward mode, thus saving chip area and computation time. A learning procedure based on back-propagation is presented for such neural networks. This learning procedure requires full real arithmetic and therefore must be performed off-line. Some test cases are presented, concerning MLPs with hidden layers of different sizes, on pattern recognition problems. These tests demonstrate the validity and generalization capability of the method and give some insight into the behavior of the learning algorithm.
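The core idea — that a weight of the form ±2^e (or a short sum of such terms) turns each multiplication into bit shifts and additions — can be illustrated with a minimal sketch. This is not the paper's actual algorithm (its training uses back-propagation in full real arithmetic); the greedy quantization, the function names, and the fixed-point format below are all illustrative assumptions.

```python
import math

def quantize_to_powers_of_two(w, n_terms=2, max_exp=0, min_exp=-7):
    """Greedily approximate a real weight w as a sum of signed powers
    of two: w ~= sum of s * 2**e with s in {+1, -1}.
    (Illustrative scheme, not the paper's training procedure.)"""
    terms = []
    r = w
    for _ in range(n_terms):
        if r == 0:
            break
        e = round(math.log2(abs(r)))        # nearest power-of-two exponent
        e = max(min(e, max_exp), min_exp)   # clamp to the allowed range
        s = 1 if r > 0 else -1
        terms.append((s, e))
        r -= s * (2.0 ** e)                 # quantization residue
    return terms

def neuron_forward(x_fixed, weight_terms, frac_bits=8):
    """Accumulate sum_i w_i * x_i using only shifts and adds.
    x_fixed: integer inputs in fixed point with `frac_bits` fractional
    bits; weight_terms: one list of (sign, exponent) pairs per input."""
    acc = 0
    for xi, terms in zip(x_fixed, weight_terms):
        for s, e in terms:
            # multiplying by s * 2**e is a left shift (e >= 0)
            # or an arithmetic right shift (e < 0), no multiplier needed
            acc += s * (xi << e if e >= 0 else xi >> -e)
    return acc
```

For example, 0.75 quantizes to 2^0 - 2^-2, so multiplying an input by 0.75 costs one copy, one right shift by two, and one subtraction — the hardware saving the abstract refers to.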