What is: MobileBERT?
Source | MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices |
Year | 2020 |
Data Source | CC BY-SA - https://paperswithcode.com |
MobileBERT is an inverted-bottleneck variant of BERT that compresses and accelerates the popular BERT model. MobileBERT is a thin version of BERT_LARGE, equipped with bottleneck structures and a carefully designed balance between self-attention and feed-forward networks. To train MobileBERT, a specially designed teacher model, an inverted-bottleneck BERT_LARGE (IB-BERT), is trained first; knowledge is then transferred from this teacher to MobileBERT through layer-to-layer imitation of the teacher's behavior. Like the original BERT, MobileBERT is task-agnostic, that is, it can be generically applied to various downstream NLP tasks via simple fine-tuning.
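To make the bottleneck idea concrete, below is a minimal, illustrative PyTorch sketch of a bottleneck-style transformer block, not the official MobileBERT implementation. The class name, layer layout, and the exact placement of projections and normalizations are simplifications introduced here; the widths (512 block width, 128 intra-block width, 4 stacked feed-forward networks, 4 attention heads) follow the configuration reported in the paper.

```python
# Hypothetical sketch of a MobileBERT-style bottleneck transformer block.
# The input is projected down to a thin intra-block width, self-attention and
# several stacked FFNs run in that thin space, and the result is projected
# back up to the wide block width with a residual connection.
import torch
import torch.nn as nn


class BottleneckTransformerBlock(nn.Module):
    def __init__(self, hidden_size=512, intra_size=128, num_heads=4, num_ffn=4):
        super().__init__()
        # Bottleneck: linear down-projection into the thin intra-block space.
        self.down_proj = nn.Linear(hidden_size, intra_size)
        # Self-attention operates on the thin representation.
        self.attention = nn.MultiheadAttention(intra_size, num_heads, batch_first=True)
        self.attn_norm = nn.LayerNorm(intra_size)
        # Several stacked FFNs re-balance attention vs. feed-forward capacity.
        self.ffns = nn.ModuleList([
            nn.Sequential(
                nn.Linear(intra_size, intra_size * 4),
                nn.ReLU(),
                nn.Linear(intra_size * 4, intra_size),
            )
            for _ in range(num_ffn)
        ])
        self.ffn_norms = nn.ModuleList(
            [nn.LayerNorm(intra_size) for _ in range(num_ffn)]
        )
        # Inverted bottleneck: project back up to the wide block width.
        self.up_proj = nn.Linear(intra_size, hidden_size)
        self.out_norm = nn.LayerNorm(hidden_size)

    def forward(self, x):
        h = self.down_proj(x)                      # (batch, seq, intra_size)
        attn_out, _ = self.attention(h, h, h)
        h = self.attn_norm(h + attn_out)
        for ffn, norm in zip(self.ffns, self.ffn_norms):
            h = norm(h + ffn(h))
        return self.out_norm(x + self.up_proj(h))  # residual over the wide path


block = BottleneckTransformerBlock()
tokens = torch.randn(2, 16, 512)                   # (batch, seq_len, hidden_size)
print(block(tokens).shape)                         # torch.Size([2, 16, 512])
```

Because MobileBERT is task-agnostic, in practice one typically starts from a released pretrained checkpoint (for example, google/mobilebert-uncased in the Hugging Face Transformers library) and fine-tunes it on the downstream task rather than re-running the knowledge-transfer procedure.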