What is: GLM?
Source | GLM-130B: An Open Bilingual Pre-trained Model |
Year | 2022 |
Data Source | CC BY-SA - https://paperswithcode.com |
GLM is a bilingual (English and Chinese) pre-trained transformer-based language model that departs from the conventional decoder-only autoregressive (GPT-style) architecture. Instead, it leverages autoregressive blank infilling as its training objective: spans of the input text are blanked out, and the model regenerates them autoregressively while attending to the corrupted context.
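To make the objective concrete, the following is a minimal sketch of how a blank-infilling training example might be constructed. It is illustrative only: the function name, span-sampling rule, and special-token names (`[MASK]`, `[sop]`, `[eop]`) are assumptions for the sketch, not the exact GLM preprocessing pipeline.

```python
import random

MASK, SOP, EOP = "[MASK]", "[sop]", "[eop]"  # mask, start-of-piece, end-of-piece markers (assumed names)


def build_blank_infilling_example(tokens, num_spans=1, rng=random.Random(0)):
    """Corrupt `tokens` by blanking random spans (Part A) and collect the
    spans to be regenerated autoregressively (Part B)."""
    n = len(tokens)
    # Sample distinct, sorted start positions; spans are kept short for the sketch.
    starts = sorted(rng.sample(range(n), k=min(num_spans, n)))
    spans = [(s, s + rng.randint(1, max(1, min(3, n - s)))) for s in starts]

    # Part A: the original text with each sampled span replaced by a single [MASK].
    part_a, kept_spans, cursor = [], [], 0
    for s, e in spans:
        if s < cursor:          # skip overlapping spans in this simplified version
            continue
        part_a.extend(tokens[cursor:s])
        part_a.append(MASK)
        kept_spans.append(tokens[s:e])
        cursor = e
    part_a.extend(tokens[cursor:])

    # Part B: the blanked spans. During training the model sees Part A with
    # bidirectional attention and predicts each span in Part B token by token,
    # which is what makes the blank infilling "autoregressive".
    part_b = []
    for span in kept_spans:
        part_b.extend([SOP] + span + [EOP])

    return part_a, part_b


if __name__ == "__main__":
    text = "GLM is a bilingual pre-trained language model".split()
    part_a, part_b = build_blank_infilling_example(text, num_spans=2)
    print("Part A (corrupted input):     ", part_a)
    print("Part B (autoregressive target):", part_b)
```

In this setup the same model can behave like a bidirectional encoder over the unmasked context and like an autoregressive decoder when generating the blanked spans, which is the key difference from a standard GPT-style left-to-right objective.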