An Effective Criterion for Pruning Reservoir’s Connections in Echo State Networks / Scardapane, Simone; Nocco, G.; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio. - (2014), pp. 1205-1212. (Paper presented at the International Joint Conference on Neural Networks, held in Beijing, China, 6-11 July) [10.1109/IJCNN.2014.6889600].
An Effective Criterion for Pruning Reservoir’s Connections in Echo State Networks
Scardapane, Simone; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio
2014
Abstract
Echo State Networks (ESNs) were introduced to simplify the design and training of Recurrent Neural Networks (RNNs) by explicitly separating the recurrent part of the network, the reservoir, from the non-recurrent part. A standard practice in this context is the random initialization of the reservoir, subject to a few loose constraints. Although this results in a simple-to-solve optimization problem, it is in general suboptimal, and several additional criteria have been devised to improve its design. In this paper we provide an effective algorithm for removing redundant connections inside the reservoir during training. The algorithm is based on the correlation between the states of the nodes; hence, it depends only on the input signal, it is efficient to implement, and it is local. By applying it, we can obtain an optimally sparse reservoir in a robust way. We present the performance of our algorithm on two synthetic datasets, which demonstrate its effectiveness in terms of better generalization and lower computational complexity of the resulting ESN. This behavior is also investigated for increasing levels of memory and non-linearity required by the task. © 2014 IEEE.
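To illustrate the kind of criterion the abstract describes, the following is a minimal sketch, not the exact procedure of the paper: it drives a small random reservoir with an input signal, collects the node states, and zeroes out connections between pairs of nodes whose states are almost perfectly correlated. The reservoir size, spectral radius, sparsity level, and the 0.95 correlation threshold are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Toy setup: a small, sparse, randomly initialized reservoir (illustrative values only).
rng = np.random.default_rng(0)
n_reservoir, n_steps = 50, 500
W_in = rng.uniform(-0.5, 0.5, size=n_reservoir)            # input weights
W = rng.uniform(-1.0, 1.0, size=(n_reservoir, n_reservoir))  # reservoir weights
W *= rng.random((n_reservoir, n_reservoir)) < 0.2            # keep ~20% of connections
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))              # scale spectral radius below 1
u = rng.standard_normal(n_steps)                             # input signal

# Run the standard ESN state update and collect the node states over time.
X = np.zeros((n_steps, n_reservoir))
x = np.zeros(n_reservoir)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Correlation-based pruning sketch (not the paper's exact criterion):
# if the states of nodes i and j are almost perfectly correlated, the
# connections between them carry little new information, so zero them out.
C = np.corrcoef(X.T)                # pairwise correlations of node states
threshold = 0.95                    # illustrative threshold, an assumption
redundant = np.abs(C) > threshold
np.fill_diagonal(redundant, False)  # a node is never redundant with itself
W_pruned = np.where(redundant, 0.0, W)

print("non-zero connections before:", np.count_nonzero(W))
print("non-zero connections after: ", np.count_nonzero(W_pruned))
```

Because the pruning decision uses only the collected states, it depends solely on the driving input signal and can be evaluated locally for each pair of nodes, which is the property the abstract emphasizes.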