What is: Linformer?
Source | Linformer: Self-Attention with Linear Complexity |
Year | 2020 |
Data Source | CC BY-SA - https://paperswithcode.com |
Linformer is a linear Transformer that utilises a linear self-attention mechanism to tackle the quadratic self-attention bottleneck in Transformer models. The original scaled dot-product attention is decomposed into multiple smaller attentions through linear projections, such that the combination of these operations forms a low-rank factorization of the original attention. Concretely, the key and value matrices are projected along the sequence dimension from length n down to a fixed length k, so attention cost scales as O(nk) rather than O(n^2).
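A minimal NumPy sketch of the idea, assuming single-head, unbatched attention. The projection matrices `E` and `F` (which compress keys and values along the sequence axis) are random placeholders here; in the paper they are learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(Q, K, V, E, F):
    """Linear self-attention: project K and V along the sequence
    axis before the softmax, so the score matrix is (n, k)
    instead of (n, n)."""
    d = Q.shape[-1]
    K_proj = E @ K                        # (k, d): n keys compressed to k
    V_proj = F @ V                        # (k, d): n values compressed to k
    scores = Q @ K_proj.T / np.sqrt(d)    # (n, k) score matrix
    return softmax(scores) @ V_proj       # (n, d) output

rng = np.random.default_rng(0)
n, d, k = 128, 16, 32                     # sequence length, head dim, projected length
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E = rng.standard_normal((k, n)) / np.sqrt(n)  # hypothetical key projection
F = rng.standard_normal((k, n)) / np.sqrt(n)  # hypothetical value projection

out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (128, 16): same shape as standard attention output
```

The low-rank factorization is visible in the shapes: the softmax is applied to an (n, k) matrix with fixed k, so memory and time grow linearly in the sequence length n.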