
What is: AutoTinyBERT?

Source: AutoTinyBERT: Automatic Hyper-parameter Optimization for Efficient Pre-trained Language Models
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

AutoTinyBERT is an efficient BERT variant found through neural architecture search. Specifically, one-shot learning is first used to obtain a large Super Pre-trained Language Model (SuperPLM), trained with either the pre-training objective or the task-agnostic BERT distillation objective. Then, given a specific latency constraint, an evolutionary algorithm is run on the SuperPLM to search for optimal architectures. Finally, the sub-models corresponding to the optimal architectures are extracted from the SuperPLM and further trained.
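The evolutionary search step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the search space values, the `estimated_latency` proxy, and the `fitness` function are all stand-ins (in AutoTinyBERT, fitness comes from evaluating sub-models extracted from the SuperPLM, and latency is measured for the target hardware).

```python
import random

# Hypothetical search space over Transformer hyper-parameters
# (layer number, hidden size, FFN size, head number); the concrete
# values here are illustrative assumptions, not the paper's.
SEARCH_SPACE = {
    "layers": [2, 4, 6, 8],
    "hidden": [128, 256, 384, 512],
    "ffn": [512, 1024, 2048],
    "heads": [2, 4, 8],
}

def sample_arch():
    """Sample a random architecture from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def estimated_latency(arch):
    # Toy latency proxy that grows with depth and width; AutoTinyBERT
    # uses real latency constraints, not this formula.
    return arch["layers"] * (arch["hidden"] + arch["ffn"]) * 1e-4

def fitness(arch):
    # Stand-in for evaluating the sub-model extracted from the SuperPLM.
    return arch["layers"] * arch["hidden"] ** 0.5

def mutate(arch):
    """Re-sample one hyper-parameter of a parent architecture."""
    child = dict(arch)
    key = random.choice(list(SEARCH_SPACE))
    child[key] = random.choice(SEARCH_SPACE[key])
    return child

def evolutionary_search(latency_budget, population=20, generations=10, seed=0):
    random.seed(seed)
    # Initial population: random architectures that satisfy the budget.
    pop = [a for a in (sample_arch() for _ in range(200))
           if estimated_latency(a) <= latency_budget][:population]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: max(2, population // 4)]  # keep top architectures
        children = []
        while len(children) < population - len(parents):
            child = mutate(random.choice(parents))
            if estimated_latency(child) <= latency_budget:
                children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolutionary_search(latency_budget=0.5)
print(best, estimated_latency(best))
```

The key design point mirrored here is that every candidate is filtered by the latency constraint before it enters the population, so the search only ever compares architectures that meet the deployment budget.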