What is: ELMo?
Source | Deep contextualized word representations
Year | 2018
Data Source | CC BY-SA - https://paperswithcode.com
Embeddings from Language Models, or ELMo, is a type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus.
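Concretely, for the token at position $k$, the paper forms the ELMo vector as a task-specific weighted sum over the $L+1$ biLM layer states (the context-independent token embedding plus the $L$ biLSTM layers):

$$\mathrm{ELMo}_k^{task} = \gamma^{task} \sum_{j=0}^{L} s_j^{task}\, \mathbf{h}_{k,j}^{LM}$$

where $s^{task}$ are softmax-normalized weights over the layers and $\gamma^{task}$ is a scalar that scales the entire vector for the downstream task.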
A biLM combines both a forward and a backward LM. ELMo jointly maximizes the log likelihood of the forward and backward directions. To add ELMo to a supervised model, we freeze the weights of the biLM, concatenate the ELMo vector $\mathrm{ELMo}_k^{task}$ with $\mathbf{x}_k$, and pass the ELMo-enhanced representation $[\mathbf{x}_k; \mathrm{ELMo}_k^{task}]$ into the task RNN. Here $\mathbf{x}_k$ is a context-independent token representation for each token position $k$.
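Writing the biLM objective out, for a sequence of $N$ tokens the forward and backward log likelihoods are maximized jointly, with the token representation parameters $\Theta_x$ and the softmax parameters $\Theta_s$ shared across directions:

$$\sum_{k=1}^{N} \Big( \log p(t_k \mid t_1, \ldots, t_{k-1};\, \Theta_x, \overrightarrow{\Theta}_{LSTM}, \Theta_s) + \log p(t_k \mid t_{k+1}, \ldots, t_N;\, \Theta_x, \overleftarrow{\Theta}_{LSTM}, \Theta_s) \Big)$$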
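As a minimal sketch of the supervised setup described above, assuming a pre-trained `bilm` module that returns one hidden-state tensor per layer (the module name and its interface here are hypothetical, not the paper's released implementation):

```python
import torch
import torch.nn as nn

class ScalarMix(nn.Module):
    """Task-specific weighted sum over biLM layer states."""
    def __init__(self, num_layers: int):
        super().__init__()
        self.s = nn.Parameter(torch.zeros(num_layers))  # softmax-normalized layer weights s_j
        self.gamma = nn.Parameter(torch.ones(1))        # task-specific scale gamma

    def forward(self, layer_states):
        # layer_states: list of (batch, seq_len, dim) tensors, one per biLM layer
        w = torch.softmax(self.s, dim=0)
        return self.gamma * sum(w_j * h_j for w_j, h_j in zip(w, layer_states))

class ElmoTaskModel(nn.Module):
    def __init__(self, bilm: nn.Module, num_bilm_layers: int,
                 x_dim: int, elmo_dim: int, hidden: int):
        super().__init__()
        self.bilm = bilm
        for p in self.bilm.parameters():      # freeze the pre-trained biLM
            p.requires_grad_(False)
        self.mix = ScalarMix(num_bilm_layers)
        # the task RNN consumes the ELMo-enhanced representation [x_k; ELMo_k]
        self.task_rnn = nn.LSTM(x_dim + elmo_dim, hidden, batch_first=True)

    def forward(self, tokens, x):
        # x: context-independent token representations, (batch, seq_len, x_dim)
        with torch.no_grad():
            layer_states = self.bilm(tokens)  # assumed: per-layer hidden states
        elmo = self.mix(layer_states)         # (batch, seq_len, elmo_dim)
        enhanced = torch.cat([x, elmo], dim=-1)  # concatenate [x_k; ELMo_k^task]
        out, _ = self.task_rnn(enhanced)
        return out
```

Freezing the biLM keeps the pre-trained representations fixed, while the mixing weights, the scale, and the task RNN are trained end to end on the supervised task.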