Daydreaming Hopfield networks and their surprising effectiveness on correlated data

Serricchio L.; Bocchi D.; Chilin C.; Marino R.; Negri M.; Cammarota C.; Ricci-Tersenghi F.
2025

Abstract

To improve the storage capacity of the Hopfield model, we develop a version of the dreaming algorithm that perpetually reinforces the patterns to be stored (as in the Hebb rule) and erases the spurious memories (as in dreaming algorithms). For this reason, we call it Daydreaming. Daydreaming is not destructive and converges asymptotically to stationary retrieval maps. When trained on random uncorrelated examples, the model shows optimal performance in terms of the size of the basins of attraction of stored examples and the quality of reconstruction. We also train the Daydreaming algorithm on correlated data obtained via the random-features model and argue that it spontaneously exploits the correlations, thus increasing the storage capacity and the size of the basins of attraction even further. Moreover, the Daydreaming algorithm is able to stabilize the features hidden in the data. Finally, we test Daydreaming on the MNIST dataset and show that it still works surprisingly well, producing attractors that are close to unseen examples and class prototypes.
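The abstract describes the Daydreaming rule only at a high level. The sketch below is a minimal illustration in Python of that general idea (reinforce a stored pattern as in the Hebb rule, and unlearn the attractor actually reached by the network dynamics); the specific update rule, network size, number of patterns, learning rate and iteration count are illustrative assumptions, not the implementation used in the paper.

# Minimal sketch of a daydreaming-style learning loop for a Hopfield network.
# All hyperparameters and the exact form of the update are illustrative
# assumptions: each step reinforces a stored pattern (Hebbian term) and
# unlearns the fixed point reached by zero-temperature dynamics started
# from that pattern (anti-Hebbian term).
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 20                          # neurons and stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(P, N))   # random binary patterns to store
J = np.zeros((N, N))                    # symmetric couplings, zero diagonal


def relax(J, s, max_sweeps=100):
    # Zero-temperature asynchronous dynamics until a fixed point (or max_sweeps).
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            h = J[i] @ s                # local field on neuron i
            new = s[i] if h == 0 else (1 if h > 0 else -1)
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s


lam = 0.02                              # learning rate (illustrative)
for _ in range(5000):
    mu = rng.integers(P)                # pick a pattern to "daydream" about
    s_star = relax(J, xi[mu])           # attractor reached from that pattern
    # Reinforce the pattern and unlearn the reached attractor; the two terms
    # cancel once the pattern itself is a fixed point, so the update vanishes.
    J += (lam / N) * (np.outer(xi[mu], xi[mu]) - np.outer(s_star, s_star))
    np.fill_diagonal(J, 0.0)            # keep self-couplings at zero

# Rough check: how many stored patterns are now fixed points of the dynamics?
stable = sum(np.array_equal(relax(J, p), p) for p in xi)
print(f"{stable}/{P} patterns are fixed points")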
Hopfield networks; statistical mechanics; structured data; unlearning
01 Journal publication::01a Journal article
Daydreaming Hopfield networks and their surprising effectiveness on correlated data / Serricchio, L.; Bocchi, D.; Chilin, C.; Marino, R.; Negri, M.; Cammarota, C.; Ricci-Tersenghi, F.. - In: NEURAL NETWORKS. - ISSN 0893-6080. - 186:(2025), pp. 1-12. [10.1016/j.neunet.2025.107216]
Files attached to this item

Serricchio_Daydreaming-Hopfield-networks_2025.pdf

Open access

Note: Journal article
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 2.5 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1736930
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science: 0