Over the past few years there has been a resurgence of interest in Hopfield networks of associative memory. Dmitry Krotov discusses recent theoretical advances and their broader impact in the context of energy-based neural architectures.
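For readers new to the topic, the classical binary Hopfield model that this line of work builds on can be sketched in a few lines: patterns are stored with a Hebbian rule, and recall proceeds by asynchronous sign updates that descend the energy E(s) = -½ sᵀW s. This is an illustrative sketch of the standard textbook model, not code from the Comment; all names and parameters are chosen here for illustration.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian storage: W = (1/N) * sum_p x_p x_p^T, with zero diagonal.

    `patterns` is a (P, N) array of +/-1 values, one stored pattern per row.
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def energy(W, s):
    """Hopfield energy E(s) = -1/2 * s^T W s; updates never increase it."""
    return -0.5 * s @ W @ s

def recall(W, s, sweeps=50):
    """Asynchronous recall: repeatedly set s_i = sign(sum_j W_ij s_j)."""
    s = s.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s
```

Starting from a corrupted copy of a stored pattern, `recall` lowers the energy at each flip until the state settles into the nearby memory; the dense associative memories discussed in the Comment modify this energy function to raise the storage capacity far beyond the classical limit.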
Relevant articles
Open Access articles citing this article:
- A neural network-based model framework for cell-fate decisions and development. Communications Biology (open access, 14 March 2024)
- Photonic Stochastic Emergent Storage for deep classification by scattering-intrinsic patterns. Nature Communications (open access, 13 January 2024)
Acknowledgements
I thank B. Hoover and J. Hopfield for numerous discussions, over several years, of the ideas described in this Comment.
Ethics declarations
Competing interests
The author declares no competing interests.
About this article
Cite this article
Krotov, D. A new frontier for Hopfield networks. Nat Rev Phys 5, 366–367 (2023). https://doi.org/10.1038/s42254-023-00595-y