What is: AutoTinyBERT?
Source | AutoTinyBERT: Automatic Hyper-parameter Optimization for Efficient Pre-trained Language Models |
Year | 2021 |
Data Source | CC BY-SA - https://paperswithcode.com |
AutoTinyBERT is an efficient BERT variant found through neural architecture search. Specifically, one-shot learning is used to obtain a large Super Pre-trained Language Model (SuperPLM), trained with either pre-training or task-agnostic BERT distillation objectives. Then, given a specific latency constraint, an evolutionary algorithm is run on the SuperPLM to search for optimal architectures. Finally, the corresponding sub-models are extracted based on the optimal architectures and further trained.
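The search step can be illustrated with a minimal sketch of latency-constrained evolutionary search over Transformer hyper-parameters. All names, the search space, the latency proxy, and the fitness function below are illustrative placeholders, not the paper's actual implementation (which evaluates sub-models extracted from the SuperPLM):

```python
import random

# Hypothetical search space of Transformer hyper-parameters
# (values are illustrative, not taken from the paper).
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "hidden_size": [128, 256, 384, 512],
    "ffn_size": [512, 1024, 1536, 2048],
    "num_heads": [2, 4, 8],
}

def sample_arch(rng):
    """Draw a random candidate architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def latency_proxy(arch):
    """Toy latency model: grows with depth and width (placeholder only)."""
    return arch["num_layers"] * (arch["hidden_size"] + arch["ffn_size"]) / 1e4

def fitness(arch, rng):
    """Stand-in for evaluating the sub-model extracted from the SuperPLM;
    here just a noisy score that favors larger architectures."""
    return arch["num_layers"] * arch["hidden_size"] + rng.random()

def mutate(arch, rng):
    """Resample one hyper-parameter of a parent architecture."""
    child = dict(arch)
    key = rng.choice(list(SEARCH_SPACE))
    child[key] = rng.choice(SEARCH_SPACE[key])
    return child

def evolutionary_search(latency_budget, generations=20, pop_size=16, seed=0):
    rng = random.Random(seed)
    # Initialize with random candidates that satisfy the latency constraint.
    pop = []
    while len(pop) < pop_size:
        arch = sample_arch(rng)
        if latency_proxy(arch) <= latency_budget:
            pop.append(arch)
    for _ in range(generations):
        # Keep the fitter half as parents, refill with mutated children
        # that still respect the latency budget.
        pop.sort(key=lambda a: fitness(a, rng), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            child = mutate(rng.choice(parents), rng)
            if latency_proxy(child) <= latency_budget:
                children.append(child)
        pop = parents + children
    return max(pop, key=lambda a: fitness(a, rng))

best = evolutionary_search(latency_budget=1.0)
```

In the actual method, the fitness of each candidate comes from evaluating the corresponding sub-model sliced out of the one-shot SuperPLM, which avoids training every candidate from scratch.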