
What is: Low-Rank Factorization-based Multi-Head Attention?

Source: Low Rank Factorization for Compact Multi-Head Self-Attention
Year: 2019
Data Source: CC BY-SA - https://paperswithcode.com

Low-Rank Factorization-based Multi-Head Attention Mechanism, or LAMA, is an attention module that uses low-rank factorization to reduce computational complexity. It applies low-rank bilinear pooling to construct a structured sentence representation that attends to multiple aspects of a sentence.
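
To make the idea concrete, here is a minimal PyTorch sketch of attention scoring via low-rank bilinear pooling. It is an illustration under stated assumptions, not the paper's exact formulation: the class name `LAMAAttention`, the `tanh` nonlinearity, and the `rank`/`context` parameters are assumptions chosen to show the general technique.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LAMAAttention(nn.Module):
    """Sketch of low-rank bilinear-pooling multi-head attention.

    Each head scores tokens with a low-rank bilinear form between the
    token's hidden state and a learned per-head context vector, so a
    full d_model x d_model bilinear matrix is never materialized.
    """

    def __init__(self, d_model: int, n_heads: int, rank: int):
        super().__init__()
        # Low-rank factors U and V stand in for a full bilinear matrix.
        self.U = nn.Linear(d_model, rank, bias=False)   # token-side factor
        self.V = nn.Linear(d_model, rank, bias=False)   # context-side factor
        # One learned context (query) vector per head / "aspect".
        self.context = nn.Parameter(torch.randn(n_heads, d_model))

    def forward(self, h: torch.Tensor, mask: torch.Tensor = None):
        # h: (batch, seq_len, d_model); mask: (batch, seq_len), True = keep.
        left = torch.tanh(self.U(h))              # (batch, seq_len, rank)
        right = torch.tanh(self.V(self.context))  # (n_heads, rank)
        # Hadamard-product bilinear pooling: multiply the factored
        # projections and sum over the rank dimension, yielding one
        # score per head per token.
        scores = torch.einsum("bsr,hr->bhs", left, right)
        if mask is not None:
            scores = scores.masked_fill(~mask.unsqueeze(1), float("-inf"))
        attn = F.softmax(scores, dim=-1)          # distribution over tokens
        # Structured sentence representation: one vector per head.
        sent = torch.einsum("bhs,bsd->bhd", attn, h)  # (batch, n_heads, d_model)
        return sent, attn
```

With rank r much smaller than d_model, scoring a token costs O(d·r) instead of the O(d²) of a full bilinear form, which is where the reduction in computational complexity comes from; stacking the per-head outputs gives the multi-aspect sentence representation described above.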