Albanese, Linda; Alessandrelli, Andrea; Barra, Adriano; Sollich, Peter. Yet another exponential Hopfield model. Physica A, 683 (2026). ISSN 0378-4371. DOI: 10.1016/j.physa.2025.131223
Yet another exponential Hopfield model
Albanese, Linda; Alessandrelli, Andrea; Barra, Adriano; Sollich, Peter
2026
Abstract
We propose and analyze a new variation of the so-called exponential Hopfield model, a recently introduced family of associative neural networks with unprecedented storage capacity. Our construction is based on a cost function defined through exponentials of the standard Mean Squared Error (MSE) loss function per pattern, which naturally favors configurations corresponding to perfect recall. Despite not being a mean-field system, the model admits a tractable mathematical analysis of its dynamics and retrieval properties, which agree with those of the original exponential model introduced by Ramsauer and coworkers. By means of a signal-to-noise approach, we demonstrate that stored patterns remain stable fixed points of the zero-temperature dynamics up to a number of patterns that is exponentially large in the system size. We further quantify the basins of attraction of the retrieved memories, showing that while enlarging their radius reduces the overall load, the storage capacity nonetheless retains its exponential scaling. An independent derivation in the perfect-recall regime confirms these results and provides an estimate of the relevant prefactors. We also compare typical-case (standard in statistical mechanics) and worst-case (standard in machine learning) recall criteria, finding an exponential storage capacity even in the latter case. Our findings thus complement and extend previous studies on exponential Hopfield networks, establishing that these models preserve their exceptional storage capabilities even under robustness constraints. Beyond their theoretical interest, such networks point towards principled mechanisms for massively scalable associative memory, potentially offering a theoretical way out of the storage-bottleneck problem caused by the current trend of digital data production doubling roughly every two years. As an illustration, we show that storing 150 zettabytes, i.e. approximately all digital data stored worldwide at present, would require an exponential Hopfield model of the proposed type with fewer than a hundred neurons.
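To make the closing zettabyte estimate concrete, here is a minimal back-of-envelope sketch, not taken from the paper: it assumes a storage capacity of the form P(N) = exp(αN) patterns of N bits each, with a purely illustrative exponent α = ln 2 (the paper itself derives the actual exponent and prefactors), and finds the smallest network size whose total storable information covers 150 zettabytes.

```python
import math

# 150 zettabytes = 150e21 bytes, expressed in bits.
TARGET_BITS = 150e21 * 8

# Assumed capacity exponent: P(N) = exp(ALPHA * N) patterns.
# ALPHA = ln 2 is an illustrative choice, not the paper's derived value.
ALPHA = math.log(2)

def storable_bits(n: int) -> float:
    """Total information held by n neurons storing exp(ALPHA * n) patterns of n bits each."""
    return math.exp(ALPHA * n) * n

# Smallest network whose assumed capacity covers the target.
n = 1
while storable_bits(n) < TARGET_BITS:
    n += 1
print(f"Neurons needed under these assumptions: {n}")  # 74 for alpha = ln 2
```

Even under this crude assumption the required network size lands well below one hundred neurons, consistent with the abstract's illustration; different exponents or prefactors would shift the answer only by a handful of units, since the target enters just logarithmically.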


