
Neural Networks with a Redundant Representation: Detecting the Undetectable / Agliari, Elena; Alemanno, Francesco; Barra, Adriano; Centonze, Martino; Fachechi, Alberto. - In: PHYSICAL REVIEW LETTERS. - ISSN 0031-9007. - 124:2(2020). [10.1103/PhysRevLett.124.028301]

Neural Networks with a Redundant Representation: Detecting the Undetectable

Agliari, Elena; Fachechi, Alberto
2020

Abstract

We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P = 4. The latter is known to be able to store, via Hebbian learning, a number of patterns scaling as N^(P-1), where N denotes the number of constituent binary neurons interacting P-wise. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing for a number of patterns scaling only linearly with N, while P > 2), such a system is able to perform pattern recognition far below the standard signal-to-noise threshold. In particular, a network with P = 4 is able to retrieve information whose intensity is O(1) even in the presence of noise O(√N) in the large-N limit. This striking skill stems from a redundant representation of patterns, which is afforded by the (relatively) low information load, and it helps explain the impressive pattern-recognition abilities exhibited by new-generation neural networks. The whole theory is developed rigorously, at the replica-symmetric level of approximation, and corroborated by signal-to-noise analysis and Monte Carlo simulations.
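As context for the abstract, the energy function of a dense associative memory with P-wise interactions can be sketched as follows. The normalization shown is the conventional one from the dense-associative-memory literature and is an assumption here; the paper's exact conventions may differ.

```latex
% Sketch of a dense associative memory of order P.
% N binary neurons \sigma_i = \pm 1; K stored patterns \xi^\mu \in \{-1,+1\}^N.
% Illustrative normalization (assumed; the paper's conventions may differ).
H_N(\boldsymbol{\sigma}) = -\frac{1}{P!\,N^{P-1}}
    \sum_{\mu=1}^{K} \left( \sum_{i=1}^{N} \xi_i^{\mu}\,\sigma_i \right)^{\!P}
% Expanding the P-th power yields P-wise couplings
% \xi_{i_1}^{\mu} \cdots \xi_{i_P}^{\mu}\, \sigma_{i_1} \cdots \sigma_{i_P}.
% Hebbian storage saturates at K \sim N^{P-1}; the low-load regime studied
% in the abstract instead takes K = \alpha N, with P = 4.
```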
machine learning, neural networks, statistical mechanics, disordered systems
01 Journal publication::01a Journal article
Files attached to this record

File: Agliari_Neural-networks_2020.pdf (archive administrators only)
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 308.31 kB
Format: Adobe PDF
Access: Contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1356738
Citations
  • PMC: n/a
  • Scopus: 21
  • Web of Science: 19