What is: LeViT Attention Block?
Source | LeViT: a Vision Transformer in ConvNet's Clothing for Faster Inference |
Year | 2021 |
Data Source | CC BY-SA - https://paperswithcode.com |
LeViT Attention Block is the attention module used in the LeViT architecture. Its distinguishing feature is that it provides positional information within each attention block: relative position information is injected directly into the attention mechanism by adding a learned attention bias, indexed by the relative offset between query and key positions, to the attention maps.
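The mechanism can be sketched as follows. This is a minimal single-head NumPy illustration, not the LeViT implementation: it assumes tokens laid out on a small 2D grid and a learned bias table indexed by the (shifted) relative offset between query and key positions, added to the attention logits before the softmax. Function and variable names (`attention_with_bias`, `bias_table`) are illustrative.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_bias(x, Wq, Wk, Wv, bias_table, grid_h, grid_w):
    """Single-head attention with a relative-position attention bias.

    x          : (N, d) tokens on a grid_h x grid_w grid (N = grid_h * grid_w)
    bias_table : (2*grid_h - 1, 2*grid_w - 1) learned biases, one per
                 possible relative offset between two grid positions
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)  # (N, N) attention logits

    # (row, col) position of each token on the grid
    ys, xs = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")
    pos = np.stack([ys.ravel(), xs.ravel()], axis=1)  # (N, 2)

    # relative offsets, shifted so they index into bias_table
    dy = pos[:, None, 0] - pos[None, :, 0] + (grid_h - 1)
    dx = pos[:, None, 1] - pos[None, :, 1] + (grid_w - 1)
    bias = bias_table[dy, dx]  # (N, N) positional bias

    # inject position information by biasing the attention map
    attn = softmax(scores + bias, axis=-1)
    return attn @ v
```

Because the bias depends only on relative offsets, two query-key pairs at the same displacement share the same bias entry, which is what lets the block encode position without separate positional embeddings on the input tokens.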