
What is: Primer?

Source: Primer: Searching for Efficient Transformers for Language Modeling
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

Primer is a Transformer-based architecture that improves on the original Transformer with two modifications found through neural architecture search: squared ReLU activations in the feedforward block, and depthwise convolutions added after each of the multi-head attention projections (queries, keys, and values), resulting in a new module called Multi-DConv-Head Attention (MDHA).
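
The two modifications are simple to express in code. Below is a minimal sketch (assuming PyTorch; not the authors' implementation, and the kernel size and shapes are illustrative): a squared ReLU activation for the feedforward block, and a depthwise 1D convolution over the sequence dimension that would be applied to each head's projected queries, keys, and values in MDHA.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def squared_relu(x: torch.Tensor) -> torch.Tensor:
    # Squared ReLU: relu(x) ** 2, used in place of ReLU/GELU in the feedforward block.
    return F.relu(x) ** 2


class DepthwiseConv1d(nn.Module):
    # Depthwise (groups == channels) 1D convolution over the sequence dimension.
    # In MDHA, one such convolution follows each per-head Q/K/V projection.
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        self.kernel_size = kernel_size
        self.conv = nn.Conv1d(
            channels, channels, kernel_size,
            padding=kernel_size - 1, groups=channels,
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels)
        x = x.transpose(1, 2)                   # (batch, channels, seq_len)
        x = self.conv(x)
        x = x[..., : -(self.kernel_size - 1)]   # trim right padding to keep it causal
        return x.transpose(1, 2)                # (batch, seq_len, channels)


# Quick check of shapes and the activation:
x = torch.randn(2, 16, 64)                      # (batch, seq_len, head_dim)
conv = DepthwiseConv1d(channels=64)
print(conv(x).shape)                            # torch.Size([2, 16, 64])
print(squared_relu(torch.tensor([-1.0, 2.0])))  # tensor([0., 4.])
```

Because the convolution is depthwise, it adds only a small number of parameters per head while mixing information across nearby positions before attention is computed.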