Approximation capabilities of adaptive spline neural networks / Vecci, L.; Campolucci, P.; Piazza, F.; Uncini, A. - Vol. 1 (1997), pp. 260-265. DOI: 10.1109/ICNN.1997.611675
Approximation capabilities of adaptive spline neural networks
Uncini, Aurelio
1997
Abstract
In this paper, we study the properties of neural networks based on adaptive spline activation functions. Using results from regularization theory, we show how the proposed architecture produces smooth approximations of unknown functions; to reduce hardware complexity, a particular implementation of the kernels prescribed by the theory is suggested. This solution, although sub-optimal, greatly reduces the number of neurons and connections, since it gives each neuron increased expressive power: a neuron can produce a smooth activation function by controlling a single fixed parameter of a Catmull-Rom cubic spline. Experimental results demonstrate that there is also an advantage in the number of free parameters which, together with smoothness, leads to improved generalization capability.
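The abstract describes a neuron whose activation is a Catmull-Rom cubic spline with adaptable control points. As a rough illustration of the idea (not the paper's actual implementation), the sketch below evaluates a standard Catmull-Rom spline over learned control-point ordinates; the uniform knot spacing, the centering at zero, and all names are assumptions for illustration.

```python
import numpy as np

def catmull_rom_activation(x, q, step=1.0):
    """Illustrative spline activation: evaluate a Catmull-Rom cubic
    through control-point ordinates `q` (the per-neuron adaptive
    parameters), assumed uniformly spaced with spacing `step` and
    centred at the origin. This is a sketch, not the paper's design."""
    n = len(q)
    # Map the input to a span index i and a local coordinate u in [0, 1).
    s = x / step + (n - 1) / 2.0
    i = int(np.clip(np.floor(s), 1, n - 3))  # keep a full 4-point window
    u = s - i
    p0, p1, p2, p3 = q[i - 1], q[i], q[i + 1], q[i + 2]
    # Standard Catmull-Rom basis (tension 1/2): a C1-continuous cubic
    # that interpolates p1 at u=0 and p2 at u=1.
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * u
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * u ** 2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * u ** 3)
```

Because the spline is a polynomial in the four local control points, gradients with respect to those points are readily available for backpropagation, which is what makes the activation adaptive; only the (few) ordinates in the active span receive updates for a given input.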