
An Effective Criterion for Pruning Reservoir’s Connections in Echo State Networks / Scardapane, Simone; Nocco, G.; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio. - (2014), pp. 1205-1212. (Paper presented at the International Joint Conference on Neural Networks, held in Beijing, China, 6-11 July) [10.1109/IJCNN.2014.6889600].

An Effective Criterion for Pruning Reservoir’s Connections in Echo State Networks

Scardapane, Simone; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio
2014

Abstract

Echo State Networks (ESNs) were introduced to simplify the design and training of Recurrent Neural Networks (RNNs) by explicitly separating the recurrent part of the network, the reservoir, from the non-recurrent part. A standard practice in this context is to initialize the reservoir randomly, subject to a few loose constraints. Although this yields a simple-to-solve optimization problem, it is in general suboptimal, and several additional criteria have been devised to improve the reservoir's design. In this paper we provide an effective algorithm for removing redundant connections inside the reservoir during training. The algorithm is based on the correlation of the node states; hence it depends only on the input signal, is efficient to implement, and is local. By applying it, we can obtain an optimally sparse reservoir in a robust way. We evaluate our algorithm on two synthetic datasets, showing its effectiveness in terms of better generalization and lower computational complexity of the resulting ESN. This behavior is also investigated for increasing levels of memory and non-linearity required by the task. © 2014 IEEE.
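The paper's exact pruning criterion is detailed in the full text; as a rough illustrative sketch of the idea stated in the abstract (pruning reservoir connections between nodes whose states are highly correlated under the input signal), one might write something like the following. All parameter values here (reservoir size, spectral radius, correlation threshold, the sinusoidal input) are hypothetical choices for demonstration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, not taken from the paper.
n_reservoir = 50
spectral_radius = 0.9
threshold = 0.95  # correlation above which a connection is deemed redundant

# Standard ESN initialization: random reservoir and input weights,
# with the reservoir rescaled to the target spectral radius.
W = rng.uniform(-1, 1, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, (n_reservoir, 1))

# Drive the reservoir with an input signal and collect the node states.
u = np.sin(0.1 * np.arange(500)).reshape(-1, 1)
states = np.zeros((len(u), n_reservoir))
x = np.zeros(n_reservoir)
for t, u_t in enumerate(u):
    x = np.tanh(W @ x + W_in @ u_t)
    states[t] = x

# Correlation of node states: connections between node pairs whose states
# are almost perfectly correlated carry redundant information, so zero them.
C = np.corrcoef(states.T)
redundant = np.abs(C) > threshold
np.fill_diagonal(redundant, False)
W_pruned = np.where(redundant, 0.0, W)

print("connections removed:", int(redundant.sum()))
```

Note that this criterion needs only the reservoir states produced by the input signal, which is why the abstract can describe it as input-dependent, local, and cheap to compute.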
International Joint Conference on Neural Networks
Echo State Network; Reservoir Computing; Pruning; Significance
04 Conference proceedings publication::04b Conference paper in a volume
Files attached to this item
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/601592
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 13
  • Web of Science: 11