What is: GloVe Embeddings?
Source | GloVe: Global Vectors for Word Representation |
Year | 2014 |
Data Source | CC BY-SA - https://paperswithcode.com |
GloVe Embeddings are a type of word embedding that encodes the ratio of co-occurrence probabilities between two words as vector differences. GloVe uses a weighted least squares objective $J$ that minimizes the squared difference between the dot product of the vectors of two words and the logarithm of their number of co-occurrences:

$$J = \sum_{i,j=1}^{V} f\left(X_{ij}\right) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2$$

where $w_i$ and $b_i$ are the word vector and bias respectively of word $i$, $\tilde{w}_j$ and $\tilde{b}_j$ are the context word vector and bias respectively of word $j$, $X_{ij}$ is the number of times word $i$ occurs in the context of word $j$, $V$ is the size of the vocabulary, and $f$ is a weighting function that assigns lower weights to rare and frequent co-occurrences.
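To make the objective concrete, here is a minimal NumPy sketch that evaluates $J$ on a toy co-occurrence matrix. The variable names (`W`, `Wc`, `b`, `bc`, `X`) and the random toy data are illustrative assumptions, not part of the original GloVe code; the weighting-function defaults $x_{\max} = 100$ and $\alpha = 3/4$ follow the values reported in the GloVe paper.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 5, 8                                          # vocabulary size, embedding dimension
X = rng.integers(0, 50, size=(V, V)).astype(float)   # toy co-occurrence counts X_ij

W  = rng.normal(scale=0.1, size=(V, d))  # word vectors w_i
Wc = rng.normal(scale=0.1, size=(V, d))  # context word vectors w~_j
b  = np.zeros(V)                         # word biases b_i
bc = np.zeros(V)                         # context biases b~_j

def f(x, x_max=100.0, alpha=0.75):
    """Weighting function: down-weights rare pairs, caps very frequent ones."""
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_loss(W, Wc, b, bc, X):
    """J = sum_ij f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2,
    taken over pairs with X_ij > 0, since log(0) is undefined."""
    loss = 0.0
    for i in range(len(X)):
        for j in range(len(X)):
            if X[i, j] > 0:
                diff = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
                loss += f(X[i, j]) * diff ** 2
    return loss

print(glove_loss(W, Wc, b, bc, X))
```

In practice the sum runs only over the nonzero entries of the (sparse) co-occurrence matrix, and the parameters are fitted by stochastic gradient descent on this loss; the double loop above is purely for readability.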