A Generative Pre-trained Transformer (GPT) is a deep-learning language model that generates text that reads as if it were written by humans. Users 'feed' GPT a prompt, such as a sentence, and the model continues it with coherent text based on patterns learned from large, publicly available data sets. Besides ordinary text, the technology can also handle guitar tablature and computer code, for example. GPT is developed by OpenAI. GPT-2 comprised 1.5 billion parameters; for GPT-4, figures of up to 100 trillion parameters have been speculated, but OpenAI has not disclosed the actual number.
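As a rough illustration of the prompt-and-completion workflow described above, the sketch below uses the open-source Hugging Face transformers library with the publicly released GPT-2 checkpoint; the model name, prompt, and generation settings are illustrative assumptions, not details taken from this entry.

```python
# Minimal sketch of the prompt-and-completion workflow, assuming the
# Hugging Face "transformers" library and the public GPT-2 checkpoint.
from transformers import pipeline

# Load a text-generation pipeline backed by the small GPT-2 model.
generator = pipeline("text-generation", model="gpt2")

# The user "feeds" the model a sentence (the prompt) ...
prompt = "A Generative Pre-trained Transformer is"

# ... and the model continues it with plausible, human-like text.
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])
```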
- - - - -
Generative Pre-trained Transformer (GPT)
- Synonyms: GPT, Generative Pre-trained Transformer