
What is: GPT?

Source: Improving Language Understanding by Generative Pre-Training
Year: 2018
Data Source: CC BY-SA - https://paperswithcode.com

GPT is a Transformer-based architecture and training procedure for natural language processing tasks. Training follows a two-stage procedure. First, a language-modeling objective is used on unlabeled data to learn the initial parameters of a neural network model. These parameters are then adapted to a target task using the corresponding supervised objective.
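The two-stage idea can be sketched in miniature. This is not the paper's actual model (no Transformer, no real corpus): it is a hypothetical toy in which a shared embedding table is first trained with a next-token language-modeling objective on unlabeled token ids, and the resulting parameters are then adapted with a supervised classification objective on a tiny labeled set. All names, sizes, and data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# --- Stage 1: unsupervised language-model pretraining ---
# Toy "corpus": predict the next token id from the current one.
corpus = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
vocab, dim = 4, 8

E = rng.normal(0, 0.1, (vocab, dim))   # token embeddings (shared parameters)
W = rng.normal(0, 0.1, (dim, vocab))   # LM output projection

for _ in range(200):                   # SGD on the next-token objective
    for x, y in zip(corpus[:-1], corpus[1:]):
        grad = softmax(E[x] @ W)
        grad[y] -= 1.0                 # d(cross-entropy)/d(logits)
        gW = np.outer(E[x], grad)
        gE = W @ grad
        W -= 0.1 * gW
        E[x] -= 0.1 * gE

# --- Stage 2: supervised adaptation to a target task ---
# A new head V is trained on top of the pretrained embeddings E,
# and E itself keeps being updated by the supervised objective.
labels = {0: 0, 1: 0, 2: 1, 3: 1}      # hypothetical binary labels
V = rng.normal(0, 0.1, (dim, 2))

def task_loss():
    return -sum(np.log(softmax(E[x] @ V)[y])
                for x, y in labels.items()) / len(labels)

loss_before = task_loss()
for _ in range(200):
    for x, y in labels.items():
        grad = softmax(E[x] @ V)
        grad[y] -= 1.0
        gV = np.outer(E[x], grad)
        gE = V @ grad
        V -= 0.1 * gV
        E[x] -= 0.1 * gE
loss_after = task_loss()
```

The key design point the sketch preserves is that the same parameters (`E`) participate in both objectives: pretraining gives them a useful initialization, and fine-tuning adapts them, rather than training the target task from scratch.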