
Storage and learning phase transitions in the random-features Hopfield model

M. Negri; C. Lauditi; G. Perugini; C. Lucibello; E. Malatesta
2023

Abstract

The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities. Inspired by the manifold hypothesis in machine learning, we propose and investigate a generalization of the standard setting that we name the random-features Hopfield model. Here, P binary patterns of length N are generated by applying a random projection followed by a nonlinearity to Gaussian vectors sampled in a latent space of dimension D. Using the replica method from statistical physics, we derive the phase diagram of the model in the limit P, N, D → ∞ with fixed ratios α = P/N and α_D = D/N. Besides the usual retrieval phase, where the patterns can be dynamically recovered from some initial corruption, we uncover a new phase where the features characterizing the projection can be recovered instead. We call this phenomenon the learning phase transition, as the features are not explicitly given to the model but rather are inferred from the patterns in an unsupervised fashion.
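
As a rough illustration of the data-generating process described in the abstract, the following Python sketch (using numpy) builds P binary patterns of length N from Gaussian latent vectors of dimension D via a random projection and a nonlinearity, and then forms standard Hebbian couplings for a Hopfield network. The sign nonlinearity, the 1/sqrt(D) normalization, and the Hebbian rule are assumptions made for illustration only; the paper's exact conventions may differ.

    import numpy as np

    rng = np.random.default_rng(0)

    # Sizes: pattern length N, latent dimension D, number of patterns P.
    N, D, P = 1000, 100, 200
    alpha, alpha_D = P / N, D / N   # ratios held fixed in the large-size limit

    # Random projection (feature matrix) and Gaussian latent vectors, one per pattern.
    # The 1/sqrt(D) scaling is an illustrative normalization choice.
    F = rng.standard_normal((N, D)) / np.sqrt(D)
    C = rng.standard_normal((D, P))

    # Binary +/-1 patterns: a sign nonlinearity applied to the projected latent vectors
    # (assumed form; the record only states "a random projection followed by a nonlinearity").
    patterns = np.sign(F @ C)

    # Hebbian coupling matrix of the resulting Hopfield network (illustrative).
    J = (patterns @ patterns.T) / N
    np.fill_diagonal(J, 0.0)
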
associative memories; artificial neural networks; disordered systems
01 Journal publication::01a Journal article
Storage and learning phase transitions in the random-features Hopfield model / Negri, M.; Lauditi, C.; Perugini, G.; Lucibello, C.; Malatesta, E.. - In: PHYSICAL REVIEW LETTERS. - ISSN 1079-7114. - 131:25(2023), pp. 1-6. [10.1103/PhysRevLett.131.257301]
Files attached to this record
File: Negri_Storage-and-learning_2023.pdf (access restricted to archive administrators; contact the author)
Note: Journal article
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 364.55 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1709756
Citations
  • PMC: 0
  • Scopus: 9
  • Web of Science: 7