Generative Pre-trained Transformer (GPT)

What is Generative Pre-trained Transformer (GPT)?

GPT refers to a family of large language models developed by OpenAI, built on the transformer architecture. They are 'generative' because they produce new text, 'pre-trained' because they are first trained on a vast corpus of unlabeled text, and 'transformers' because of the neural network architecture that underlies them.
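
To make the definition concrete, here is a minimal sketch of generation with a GPT-family model. It assumes the Hugging Face transformers library and the openly released GPT-2 checkpoint, neither of which is mentioned above; it simply illustrates the 'generative' behavior of a pre-trained transformer continuing a prompt.

```python
# Minimal sketch: text generation with GPT-2 via Hugging Face transformers.
# Assumes `pip install transformers torch` (not part of the original text).
from transformers import pipeline

# Load the openly released GPT-2 model, an early member of the GPT family.
generator = pipeline("text-generation", model="gpt2")

# The pre-trained model continues the prompt one token at a time, sampling
# each next token from the distribution it learned during pre-training.
result = generator(
    "The transformer architecture is",
    max_new_tokens=30,
    do_sample=True,
    temperature=0.8,
)
print(result[0]["generated_text"])
```

Because the model is pre-trained on unlabeled text rather than for any single task, the same call works for arbitrary prompts; sampling settings such as temperature control how varied the generated continuation is.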

Where did the term "Generative Pre-trained Transformer (GPT)" come from?

OpenAI introduced the term in 2018 with the paper 'Improving Language Understanding by Generative Pre-Training', which presented the first GPT model.

How is "Generative Pre-trained Transformer (GPT)" used today?

GPT-3 (released in 2020) and subsequently ChatGPT (released in 2022) popularized the term globally, making 'GPT' synonymous with advanced AI for the general public.

Related Terms