
A walk in the statistical mechanical formulation of neural networks

Agliari, Elena; Barra, Adriano
2014

Abstract

Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, error-correcting codes) and complex theoretical models at the focus of scientific investigation. On the research side, neural networks are studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In theoretical physics, in particular, the key instrument for the quantitative analysis of neural networks is statistical mechanics. From this perspective, we review attractor networks here: starting from ferromagnets and spin-glass models, we discuss the underlying philosophy and retrace the path paved by Hopfield and by Amit, Gutfreund and Sompolinsky. As a sideline, along this walk we derive an alternative way (with respect to Hebb's original proposal) to recover the Hebbian paradigm, stemming from the mixing of ferromagnets with spin glasses. Further, as these notes are intended for an engineering audience, we also highlight the mapping between ferromagnets and operational amplifiers, hoping that such a bridge serves as a concrete prescription for capturing the beauty of robotics from the statistical-mechanical perspective.
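To make the Hebbian prescription and the attractor (retrieval) mechanism mentioned in the abstract concrete, here is a minimal Python sketch, not taken from the paper itself: it assumes random binary patterns, builds the couplings via Hebb's rule J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, and runs zero-temperature asynchronous dynamics. The sizes N and P, the noise level and the helper name retrieve are illustrative choices.

```python
# Minimal Hopfield-network sketch (illustrative only, not the paper's code):
# Hebbian couplings and zero-temperature asynchronous spin updates.
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 5                                # neurons and stored patterns (arbitrary)
xi = rng.choice([-1, 1], size=(P, N))        # random binary patterns xi^mu in {-1,+1}^N

J = (xi.T @ xi).astype(float) / N            # Hebbian coupling matrix J_ij
np.fill_diagonal(J, 0.0)                     # no self-interaction

def retrieve(sigma, sweeps=10):
    """Asynchronous updates sigma_i <- sign(sum_j J_ij sigma_j)."""
    sigma = sigma.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h_i = J[i] @ sigma               # local field acting on neuron i
            sigma[i] = 1 if h_i >= 0 else -1
    return sigma

# Start from a corrupted copy of pattern 0 and measure the retrieval overlap m.
noisy = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
m = retrieve(noisy) @ xi[0] / N
print(f"overlap with stored pattern: {m:.2f}")   # close to 1 when retrieval succeeds
```

At low storage load (P/N small) the stored patterns act as attractors of the dynamics, which is the retrieval scenario the statistical-mechanical analysis of Amit, Gutfreund and Sompolinsky quantifies.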
2014
NCTA2014: Neural computation theory & application
Neural computation, artificial neural networks
04 Publication in conference proceedings::04b Conference paper in volume
A walk in the statistical mechanical formulation of neural networks / Agliari, Elena; Barra, Adriano; Galluzzi, Andrea; Tantari, Daniele; Tavani, Flavia. - PRINT. - (2014), pp. 210-217. (Paper presented at the conference NCTA2014: Neural computation theory & application, held in Rome) [10.5220/0005077902100217].


Use this identifier to cite or link to this document: https://hdl.handle.net/11573/714864

Citations (Scopus): 5