What is: Slot Attention?
Source | Object-Centric Learning with Slot Attention
Year | 2020
Data Source | CC BY-SA - https://paperswithcode.com
Slot Attention is an architectural component that interfaces with perceptual representations, such as the output of a convolutional neural network, and produces a set of task-dependent abstract representations which we call slots. These slots are exchangeable and can bind to any object in the input by specializing through a competitive procedure over multiple rounds of attention. Using this iterative attention mechanism, Slot Attention produces a set of output vectors with permutation symmetry.

Unlike the capsules used in Capsule Networks, the slots produced by Slot Attention do not specialize to one particular type or class of object, a restriction that could harm generalization. Instead, slots act akin to object files: they use a common representational format, and each slot can store (and bind to) any object in the input. This allows Slot Attention to generalize in a systematic way to unseen compositions, more objects, and more slots.
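To make the iterative, competitive attention procedure concrete, here is a minimal sketch of a Slot Attention module written in PyTorch. This is an illustrative reimplementation under stated assumptions, not the authors' reference code; names such as `num_slots`, `iters`, and `hidden_dim` are ours. Slots are sampled from a single learned Gaussian (so they are exchangeable), and the softmax over the slot axis is what makes slots compete for input features on each round.

```python
import torch
import torch.nn as nn


class SlotAttention(nn.Module):
    """Minimal Slot Attention sketch (hyper-parameter names are illustrative)."""

    def __init__(self, num_slots, dim, iters=3, hidden_dim=128, eps=1e-8):
        super().__init__()
        self.num_slots = num_slots
        self.iters = iters
        self.eps = eps
        self.scale = dim ** -0.5

        # Shared learned Gaussian for slot initialization: slots are exchangeable.
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        self.slots_log_sigma = nn.Parameter(torch.zeros(1, 1, dim))

        # Slots provide queries; the perceptual inputs provide keys and values.
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)

        self.gru = nn.GRUCell(dim, dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, dim)
        )

        self.norm_inputs = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)
        self.norm_mlp = nn.LayerNorm(dim)

    def forward(self, inputs):
        # inputs: (batch, num_inputs, dim), e.g. a flattened CNN feature map.
        b, n, d = inputs.shape
        inputs = self.norm_inputs(inputs)
        k, v = self.to_k(inputs), self.to_v(inputs)

        # Sample initial slots from the shared learned distribution.
        mu = self.slots_mu.expand(b, self.num_slots, -1)
        sigma = self.slots_log_sigma.exp().expand(b, self.num_slots, -1)
        slots = mu + sigma * torch.randn_like(mu)

        for _ in range(self.iters):
            slots_prev = slots
            q = self.to_q(self.norm_slots(slots))

            # Softmax over the slot axis: slots compete for each input feature.
            logits = torch.einsum('bnd,bsd->bns', k, q) * self.scale
            attn = logits.softmax(dim=-1) + self.eps

            # Weighted mean of the values per slot.
            attn = attn / attn.sum(dim=1, keepdim=True)
            updates = torch.einsum('bns,bnd->bsd', attn, v)

            # Recurrent slot update with weights shared across slots.
            slots = self.gru(updates.reshape(-1, d), slots_prev.reshape(-1, d))
            slots = slots.reshape(b, self.num_slots, d)
            slots = slots + self.mlp(self.norm_mlp(slots))

        return slots  # (batch, num_slots, dim); slot order carries no meaning
```

Because the initialization is shared and the update is symmetric across slots, the output is permutation-symmetric: any slot can end up bound to any object, and the module can be run with more slots at test time than were used during training.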