
Nearest-neighbours neural network architecture for efficient sampling of statistical physics models / Del Bono, Luca Maria; Ricci-Tersenghi, Federico; Zamponi, Francesco. - In: MACHINE LEARNING: SCIENCE AND TECHNOLOGY. - ISSN 2632-2153. - 6:2(2025), pp. 1-12. [10.1088/2632-2153/adcdc1]

Nearest-neighbours neural network architecture for efficient sampling of statistical physics models

Luca Maria Del Bono; Federico Ricci-Tersenghi; Francesco Zamponi
2025

Abstract

Efficiently sampling the Gibbs-Boltzmann distribution of disordered systems is important both for the theoretical understanding of these models and for the solution of practical optimization problems. Unfortunately, this task is known to be hard, especially for spin-glass-like problems at low temperatures. Recently, many attempts have been made to tackle the problem by combining classical Monte Carlo schemes with newly devised neural networks that learn to propose smart moves. In this article, we introduce the nearest-neighbors neural network (4N) architecture, a physically interpretable deep architecture whose number of parameters scales linearly with the system size and that can be applied to a large variety of topologies. We show that the 4N architecture can accurately learn the Gibbs-Boltzmann distribution for a prototypical spin-glass model, the two-dimensional Edwards-Anderson model, and specifically for some of its hardest instances. In particular, it captures properties such as the energy, the correlation function, and the overlap probability distribution. Finally, we show that the 4N performance improves with the number of layers in a way that clearly connects to the correlation length of the system, thus providing a simple and interpretable criterion for choosing the optimal depth.
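The "smart moves" idea mentioned in the abstract — a learned model proposing Monte Carlo updates whose bias is corrected by a Metropolis-Hastings acceptance test — can be illustrated with a minimal, generic sketch. This is not the 4N architecture: the names are illustrative, and a plain single-spin-flip proposal stands in for a trained network.

```python
import math
import random

# Generic sketch of neural-assisted Metropolis-Hastings sampling
# (hypothetical illustration, NOT the 4N architecture from the paper).
# A learned proposal q would replace flip_one/flip_prob; detailed balance
# is restored by the correction factor q(old|new)/q(new|old).

def energy(spins, couplings):
    """Edwards-Anderson-style energy on a 1D ring: E = -sum_i J_i s_i s_{i+1}."""
    n = len(spins)
    return -sum(couplings[i] * spins[i] * spins[(i + 1) % n] for i in range(n))

def mh_step(spins, couplings, beta, propose, proposal_prob):
    """One Metropolis-Hastings step with an arbitrary proposal distribution."""
    new_spins = propose(spins)
    d_energy = energy(new_spins, couplings) - energy(spins, couplings)
    # Acceptance ratio: Boltzmann factor times the proposal correction.
    ratio = math.exp(-beta * d_energy) * (
        proposal_prob(new_spins, spins) / proposal_prob(spins, new_spins)
    )
    if random.random() < min(1.0, ratio):
        return new_spins, True
    return spins, False

# Stand-in "learned" proposal: a uniform single-spin flip. It is symmetric,
# so the correction factor equals 1; a trained network would not be.
def flip_one(spins):
    i = random.randrange(len(spins))
    return spins[:i] + [-spins[i]] + spins[i + 1:]

def flip_prob(src, dst):
    return 1.0 / len(src)  # uniform over the n possible single flips

random.seed(0)
n = 16
couplings = [random.choice([-1.0, 1.0]) for _ in range(n)]  # +/- J disorder
spins = [random.choice([-1, 1]) for _ in range(n)]
for _ in range(1000):
    spins, _ = mh_step(spins, couplings, beta=1.5,
                       propose=flip_one, proposal_prob=flip_prob)
print(energy(spins, couplings))
```

Swapping `flip_one`/`flip_prob` for a generative model's sampler and likelihood keeps the chain exact: whatever bias the model has is compensated by the acceptance ratio, which is the principle behind the hybrid schemes the abstract refers to.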
disordered systems; machine learning; Monte Carlo sampling; spin glass
01 Journal publication::01a Journal article
Files attached to this record

DelBono_Nearest-neighbors-neural_2025.pdf

Access: open access
Note: journal article
Type: Post-print (version after peer review, accepted for publication)
License: Creative Commons
Size: 772.81 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11573/1743920
Citations
  • PMC: n/a
  • Scopus: 1
  • Web of Science: 1