Tag | Ind1 | Ind2 | Subfields
---|---|---|---
000 | | | 01963 am a22002293u 4500
042 | | | _adc
100 | 1 | 0 | _aMillidge, Beren _eauthor _92821
700 | 1 | 0 | _aSalvatori, Tommaso _eauthor _92822
700 | 1 | 0 | _aSong, Yuhang _eauthor _92823
700 | 1 | 0 | _aLukasiewicz, Thomas _eauthor _92824
700 | 1 | 0 | _aBogacz, Rafal _eauthor _92825
245 | 0 | 0 | _aUniversal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models |
260 | | | _c2022-07.
500 | | | _a/pmc/articles/PMC7614148/
520 | | | _aA large number of neural network models of associative memory have been proposed in the literature. These include the classical Hopfield networks (HNs), sparse distributed memories (SDMs), and more recently the modern continuous Hopfield networks (MCHNs), which possess close links with self-attention in machine learning. In this paper, we propose a general framework for understanding the operation of such memory networks as a sequence of three operations: similarity, separation, and projection. We derive all these memory models as instances of our general framework with differing similarity and separation functions. We extend the mathematical framework of Krotov & Hopfield (2020) to express general associative memory models using neural network dynamics with local computation, and derive a general energy function that is a Lyapunov function of the dynamics. Finally, using our framework, we empirically investigate the capacity of using different similarity functions for these associative memory models, beyond the dot product similarity measure, and demonstrate empirically that Euclidean or Manhattan distance similarity metrics perform substantially better in practice on many tasks, enabling more robust retrieval and higher memory capacity than existing models.
540 | | | _a
546 | | | _aen
690 | | | _aArticle
655 | | 7 | _aText _2local
786 | 0 | | _nProc Mach Learn Res
856 | 4 | 1 | _u/pubmed/36751405 _zConnect to this object online.
999 | | | _c964 _d964
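The abstract in field 520 describes single-shot retrieval as three operations: similarity, separation, and projection. A minimal NumPy sketch of that pipeline follows; the function name, the negative-Euclidean-distance similarity, and the softmax separation are illustrative assumptions, not the paper's reference implementation:

```python
import numpy as np

def uhn_retrieve(q, M, P, beta=5.0):
    """One retrieval step in the universal Hopfield view:
    projection(separation(similarity(q, M))).

    q: query vector (d,); M: stored keys (N, d); P: stored values (N, d).
    """
    # Similarity: negative Euclidean distance to each stored key
    # (the abstract reports distance metrics outperforming dot products).
    sims = -np.linalg.norm(M - q, axis=1)
    # Separation: softmax sharpens the scores so the best match dominates.
    w = np.exp(beta * (sims - sims.max()))
    w /= w.sum()
    # Projection: weighted combination of the stored values.
    return P.T @ w

# Autoassociative use: keys and values are the same patterns,
# so a noisy query is mapped back toward the nearest stored pattern.
rng = np.random.default_rng(0)
patterns = rng.standard_normal((5, 16))
noisy = patterns[2] + 0.1 * rng.standard_normal(16)
restored = uhn_retrieve(noisy, patterns, patterns)
```

Swapping in dot-product similarity with softmax separation recovers the modern continuous Hopfield network update, which is the self-attention connection the abstract mentions.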