What is: Locality Sensitive Hashing Attention?
Source | Reformer: The Efficient Transformer |
Year | 2020 |
Data Source | CC BY-SA - https://paperswithcode.com |
LSH Attention, or Locality Sensitive Hashing Attention, replaces dot-product attention with an attention mechanism based on locality-sensitive hashing, reducing its complexity from O(L²) to O(L log L), where L is the length of the sequence. LSH refers to a family of hash functions (known as an LSH family) that map data points into buckets so that points near each other land in the same bucket with high probability, while points far apart are likely to fall in different buckets. Because each query then only attends to keys in its own bucket, the full L × L attention matrix never needs to be computed. LSH Attention was proposed as part of the Reformer architecture.
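To make the bucketing idea concrete, here is a minimal NumPy sketch, not the Reformer implementation: it uses angular LSH (random projections followed by an argmax over the projections and their negations) to assign tokens to buckets, then computes softmax attention only within each bucket. The function names `lsh_hash` and `lsh_attention` and the parameters `n_buckets` and `seed` are illustrative assumptions, and the sketch omits the sorting, chunking, multi-round hashing, and causal masking the full Reformer uses.

```python
import numpy as np

def lsh_hash(vectors, n_buckets, rng):
    # Angular LSH: project onto random directions and take the argmax
    # over [xR ; -xR], assigning each vector to one of n_buckets buckets.
    # Nearby (high cosine similarity) vectors collide with high probability.
    d = vectors.shape[-1]
    R = rng.normal(size=(d, n_buckets // 2))
    projected = vectors @ R
    return np.argmax(np.concatenate([projected, -projected], axis=-1), axis=-1)

def lsh_attention(qk, v, n_buckets=8, seed=0):
    # Shared query/key vectors (qk), as in Reformer, so each token hashes
    # to the same bucket whether it acts as a query or a key.
    rng = np.random.default_rng(seed)
    buckets = lsh_hash(qk, n_buckets, rng)
    out = np.zeros_like(v)
    for b in np.unique(buckets):
        idx = np.where(buckets == b)[0]
        # Full softmax attention, restricted to tokens in bucket b.
        scores = qk[idx] @ qk[idx].T / np.sqrt(qk.shape[-1])
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        out[idx] = weights @ v[idx]
    return out

# Example: 128 tokens with 64-dimensional heads.
rng = np.random.default_rng(1)
qk = rng.normal(size=(128, 64))
v = rng.normal(size=(128, 64))
print(lsh_attention(qk, v).shape)  # (128, 64)
```

Each bucket holds roughly L / n_buckets tokens, so the per-bucket attention is quadratic only in the (small) bucket size rather than in the full sequence length, which is where the asymptotic saving comes from.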