
Hebbian dreaming for small datasets

Agliari, Elena; Aquaro, Miriam; Barra, Adriano
2024

Abstract

The dreaming Hopfield model constitutes a generalization of the Hebbian paradigm for neural networks that is able to perform on-line learning when "awake" and also to account for off-line "sleeping" mechanisms. The latter have been shown to enhance storage in such a way that, in the long sleep-time limit, this model can reach the maximal storage capacity achievable by networks equipped with symmetric pairwise interactions. In this paper, we inspect the minimal amount of information that must be supplied to such a network to guarantee successful generalization, and we test it on both random synthetic and standard structured datasets (i.e., MNIST, Fashion-MNIST and Olivetti). By comparing these minimal information thresholds with those required by the standard (i.e., always "awake") Hopfield model, we prove that the present network can save up to ∼90% of the dataset size while preserving the same performance as its standard counterpart. This suggests that sleep may play a pivotal role in explaining the gap between the large volumes of data required to train artificial neural networks and the relatively small volumes needed by their biological counterparts. Further, we prove that the model's Cost function (typically used in statistical mechanics) admits a representation in terms of a standard Loss function (typically used in machine learning), and this allows us to analyze its emergent computational skills both theoretically and computationally: a quantitative picture of its capabilities as a function of its control parameters is achieved, and consistency between the two approaches is highlighted. The resulting network is an associative memory for pattern-recognition tasks that learns from examples on-line, generalizes correctly (in suitable regions of its control parameters) and optimizes its storage capacity by off-line sleeping: such a reduction of the training cost can be valuable for sustainable AI and in situations where data are relatively scarce.
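As a concrete illustration of the ingredients discussed in the abstract, the following is a minimal Python sketch (not the authors' implementation) of a dreaming Hopfield network. It assumes the coupling kernel J(t) = (1/N) ξᵀ (1+t)(I + t C)⁻¹ ξ used in related dreaming-Hopfield work, where ξ collects the P stored patterns, C is their correlation matrix and t is the sleep time (t = 0 recovers the standard Hebbian kernel), together with zero-temperature asynchronous retrieval dynamics; all function and variable names are purely illustrative.

```python
# Minimal, illustrative sketch of a dreaming Hopfield network (assumed kernel:
# J(t) = (1/N) xi^T (1+t)(I + t C)^{-1} xi, with C the pattern correlation matrix
# and t the "sleep time"; t = 0 gives the standard Hebbian couplings).
import numpy as np

def dreaming_couplings(xi, t):
    """xi: (P, N) array of +/-1 patterns; t: sleep time (t >= 0)."""
    P, N = xi.shape
    C = xi @ xi.T / N                              # pattern correlation matrix (P x P)
    A = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
    J = xi.T @ A @ xi / N                          # symmetric couplings (N x N)
    np.fill_diagonal(J, 0.0)                       # no self-interactions
    return J

def retrieve(J, sigma, sweeps=20):
    """Zero-temperature asynchronous dynamics starting from the cue sigma."""
    sigma = sigma.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(sigma.size):
            sigma[i] = 1 if J[i] @ sigma >= 0 else -1
    return sigma

# Toy usage: store random patterns and retrieve one from a 15%-corrupted cue.
rng = np.random.default_rng(0)
N, P, t = 200, 40, 10.0
xi = rng.choice([-1, 1], size=(P, N))
J = dreaming_couplings(xi, t)
cue = xi[0] * rng.choice([1, -1], size=N, p=[0.85, 0.15])
print("overlap with the stored pattern:", retrieve(J, cue) @ xi[0] / N)
```

In the long sleep-time limit (t → ∞) this assumed kernel approaches the projection (pseudo-inverse) rule, consistently with the maximal-capacity behaviour recalled in the abstract.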
Hebbian learning; Hopfield model; sleeping phenomena; statistical mechanics
01 Journal publication::01a Journal article
Hebbian dreaming for small datasets / Agliari, Elena; Alemanno, Francesco; Aquaro, Miriam; Barra, Adriano; Durante, Fabrizio; Kanter, Ido. - In: NEURAL NETWORKS. - ISSN 0893-6080. - 173:(2024). [10.1016/j.neunet.2024.106174]
Files attached to this item

File: Agliari_Hebbian_dreaming_2024.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 1.56 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1705865
Citations
  • PubMed Central: 0
  • Scopus: 1
  • Web of Science: ND