What is: Sinkhorn Transformer?
Source | Sparse Sinkhorn Attention |
Year | 2020 |
Data Source | CC BY-SA - https://paperswithcode.com |
The Sinkhorn Transformer is a type of transformer that uses Sparse Sinkhorn Attention as a building block. This component is a plug-in replacement for dense fully-connected attention (as well as local attention and sparse attention alternatives), reducing memory complexity while producing sparse attention outputs.
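The sketch below illustrates the general idea behind block-based Sinkhorn attention: the sequence is split into blocks, a Sinkhorn-normalized (approximately doubly stochastic) matrix softly re-sorts the key/value blocks, and each query block then attends only to its matched block. This is a minimal illustration, not the paper's reference implementation; names such as `sinkhorn_sort_attention` and `n_sinkhorn_iters` are illustrative, and the block-scoring step here uses a parameter-free similarity in place of the learned sorting network described in the paper.

```python
import torch
import torch.nn.functional as F

def sinkhorn_normalize(logits, n_iters=8):
    """Alternately normalize rows and columns in log space so the block
    assignment scores become approximately doubly stochastic."""
    for _ in range(n_iters):
        logits = logits - torch.logsumexp(logits, dim=-1, keepdim=True)  # rows
        logits = logits - torch.logsumexp(logits, dim=-2, keepdim=True)  # cols
    return torch.exp(logits)

def sinkhorn_sort_attention(q, k, v, block_size=64, n_sinkhorn_iters=8):
    """q, k, v: (batch, seq_len, dim), with seq_len divisible by block_size.
    Each query block attends to one softly re-sorted key/value block
    instead of the full sequence."""
    b, n, d = q.shape
    nb = n // block_size

    # Split the sequence into blocks: (batch, n_blocks, block_size, dim).
    q_b = q.view(b, nb, block_size, d)
    k_b = k.view(b, nb, block_size, d)
    v_b = v.view(b, nb, block_size, d)

    # Summarize each key block (mean pooling) and score block pairs.
    # Assumption: a simple dot-product similarity stands in for the
    # learned meta sorting network of the original method.
    k_summary = k_b.mean(dim=2)                                   # (b, nb, d)
    block_logits = torch.einsum('bid,bjd->bij', k_summary, k_summary)

    # Sinkhorn normalization yields a soft permutation over blocks.
    perm = sinkhorn_normalize(block_logits, n_sinkhorn_iters)     # (b, nb, nb)

    # Softly re-sort the key and value blocks with that permutation.
    k_sorted = torch.einsum('bij,bjld->bild', perm, k_b)
    v_sorted = torch.einsum('bij,bjld->bild', perm, v_b)

    # Local attention: each query block attends only to its sorted block,
    # so memory scales with block_size rather than the full sequence length.
    scores = torch.einsum('bnqd,bnkd->bnqk', q_b, k_sorted) / d ** 0.5
    attn = F.softmax(scores, dim=-1)
    out = torch.einsum('bnqk,bnkd->bnqd', attn, v_sorted)
    return out.reshape(b, n, d)

# Usage: attention cost per query block is block_size, not seq_len.
q = k = v = torch.randn(2, 256, 32)
out = sinkhorn_sort_attention(q, k, v, block_size=64)
print(out.shape)  # torch.Size([2, 256, 32])
```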