Yannic Kilcher explains the paper “Hopfield Networks is All You Need.”

Hopfield Networks are one of the classic models of biological memory networks. This paper generalizes modern Hopfield Networks to continuous states and shows that the corresponding update rule is equal to the attention mechanism used in modern Transformers. It further analyzes a pre-trained BERT model through the lens of Hopfield Networks and uses a Hopfield Attention Layer to perform Immune Repertoire Classification.
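The paper's central identity is that the continuous Hopfield update rule, xi_new = X softmax(beta X^T xi), has the same form as transformer attention (queries against stored keys, values returned as a convex combination). A minimal numpy sketch of this retrieval dynamic, with made-up dimensions and an illustrative inverse temperature `beta` (not values from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d, N = 16, 6                      # pattern dimension, number of stored patterns
X = rng.standard_normal((d, N))   # stored patterns as columns
beta = 1.0                        # inverse temperature (sharpness of retrieval)

# Start from a noisy version of stored pattern 0.
xi = X[:, 0] + 0.1 * rng.standard_normal(d)

# Continuous Hopfield update: xi_new = X softmax(beta * X^T xi).
# This is exactly attention with query xi and keys/values X.
for _ in range(3):
    xi = X @ softmax(beta * (X.T @ xi))

retrieved = int(np.argmax(X.T @ xi))
print(retrieved)  # index of the pattern the query converged to
```

For random, roughly orthogonal patterns the update typically converges in one step, which mirrors the paper's point that a single attention pass can act as one-shot associative retrieval.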

Content outline:

  • 0:00 – Intro & Overview
  • 1:35 – Binary Hopfield Networks
  • 5:55 – Continuous Hopfield Networks
  • 8:15 – Update Rules & Energy Functions
  • 13:30 – Connection to Transformers
  • 14:35 – Hopfield Attention Layers
  • 26:45 – Theoretical Analysis
  • 48:10 – Investigating BERT
  • 1:02:30 – Immune Repertoire Classification

Visual scenes are often composed of sets of independent objects. Yet current vision models make no assumptions about the nature of the images they process.

Yannic Kilcher explores a paper on object-centric learning.

By imposing an objectness prior, this paper presents a module that is able to recognize permutation-invariant sets of objects from pixels in both supervised and unsupervised settings. It does so via a Slot Attention module that combines an attention mechanism with dynamic routing.
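The core of Slot Attention is iterative routing: slots act as queries over the input features, the softmax is taken over the slot axis (so inputs are divided up among slots rather than each slot attending freely), and each slot is updated from a weighted mean of its assigned inputs. A simplified numpy sketch of one such routing loop, with made-up sizes and the paper's GRU/MLP update replaced by a plain weighted mean:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(1)
n_inputs, n_slots, dim = 10, 3, 8
inputs = rng.standard_normal((n_inputs, dim))  # per-location feature embeddings
slots = rng.standard_normal((n_slots, dim))    # randomly initialized slots

for _ in range(3):  # iterative routing
    # Softmax over the SLOT axis: slots compete for each input location.
    attn = softmax(slots @ inputs.T / np.sqrt(dim), axis=0)  # (n_slots, n_inputs)
    # Normalize per slot, then update each slot as a weighted mean of inputs.
    attn = attn / attn.sum(axis=1, keepdims=True)
    slots = attn @ inputs  # simplified; the paper applies a GRU + MLP here
```

Taking the softmax over slots (rather than over inputs, as in standard attention) is what makes the slots partition the scene: each input location distributes its attention mass across the competing slots.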

Content index:

  • 0:00 – Intro & Overview
  • 1:40 – Problem Formulation
  • 4:30 – Slot Attention Architecture
  • 13:30 – Slot Attention Algorithm
  • 21:30 – Iterative Routing Visualization
  • 29:15 – Experiments
  • 36:20 – Inference Time Flexibility
  • 38:35 – Broader Impact Statement
  • 42:05 – Conclusion & Comments