
Significance-based pruning for reservoir's neurons in echo state networks / Scardapane, Simone; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio. - PRINT. - 37(2015), pp. 31-38. [10.1007/978-3-319-18164-6_4].

Significance-based pruning for reservoir’s neurons in echo state networks

Scardapane, Simone; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio
2015

Abstract

Echo State Networks (ESNs) are a family of Recurrent Neural Networks (RNNs) that can be trained efficiently and robustly. Their main characteristic is the separation of the recurrent part of the network, the reservoir, from the non-recurrent part, the latter being the only component that is explicitly trained. To ensure good generalization capabilities, the reservoir is generally built from a large number of neurons, whose connectivity should follow a sparse pattern. Recently, we proposed an unsupervised online criterion for performing this sparsification process, based on the idea of the significance of a synapse, i.e., an approximate measure of its importance in the network. In this paper, we extend our criterion to the direct pruning of neurons inside the reservoir by defining the significance of a neuron in terms of the significance of its neighboring synapses. Our experimental validation shows that, by combining the pruning of neurons and synapses, we are able to obtain an optimally sparse ESN in an efficient way. In addition, we briefly investigate the reservoir topologies resulting from the application of our procedure.
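The abstract does not reproduce the paper's formulas, but the core idea it describes — scoring each reservoir neuron by aggregating the significances of its neighboring synapses, then removing the lowest-scoring neurons — can be sketched as follows. This is a minimal, hypothetical illustration: the function `prune_reservoir`, the `keep_ratio` parameter, the sum-over-neighbors aggregation, and the use of absolute weight as a stand-in for synapse significance are all assumptions of this sketch, not the criterion actually defined in the paper.

```python
import numpy as np

def prune_reservoir(W, synapse_significance, keep_ratio=0.9):
    """Illustrative neuron pruning for an ESN reservoir.

    Scores each neuron by summing the significance of its incoming and
    outgoing synapses (a hypothetical aggregation rule), then retains
    only the top-scoring fraction `keep_ratio` of neurons.
    """
    n = W.shape[0]
    # Neuron score = total significance of synapses touching that neuron
    score = synapse_significance.sum(axis=0) + synapse_significance.sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * n)))
    keep = np.argsort(score)[-n_keep:]  # indices of neurons to retain
    keep.sort()
    # Restrict the recurrent weight matrix to the surviving neurons
    return W[np.ix_(keep, keep)], keep

# Toy usage: a sparse random reservoir with |weight| as a stand-in significance
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 10)) * (rng.random((10, 10)) < 0.2)
W_pruned, kept = prune_reservoir(W, np.abs(W), keep_ratio=0.7)
```

In practice one would combine such a neuron-level step with synapse-level pruning, as the abstract indicates, and re-estimate the readout weights afterwards.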
Advances in Neural Networks: Computational and Theoretical Issues
978-3-319-18163-9
Echo State Networks; recurrent neural networks; pruning; least-squares
02 Publication in a volume::02a Chapter or Article
Files attached to this item
Scardapane_Significance-based-pruning_2015.pdf
Access: archive administrators only
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 195.46 kB
Format: Adobe PDF
Contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/785980
Citations
  • PMC: not available
  • Scopus: 6
  • Web of Science (ISI): not available