
GPT

Overview

Generative Pre-trained Transformer — a family of autoregressive language models that generate text one token at a time, each step predicting the next token from all the tokens produced so far.
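The autoregressive loop can be sketched without any model internals. The toy bigram table below is a hypothetical stand-in for a trained Transformer's next-token scores; the generation loop itself (score candidates given the context, append the best one, repeat) is the same shape GPT-style decoding takes.

```python
# Toy next-token scores, a stand-in for a trained model's output distribution.
BIGRAM_SCORES = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 0.9, "up": 0.1},
}

def generate(prompt, max_new_tokens=3):
    """Greedy autoregressive decoding: append the highest-scoring token each step."""
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        scores = BIGRAM_SCORES.get(tokens[-1])
        if scores is None:  # no known continuation: stop early
            break
        tokens.append(max(scores, key=scores.get))
    return " ".join(tokens)

print(generate("the"))  # -> "the cat sat down"
```

A real GPT replaces the lookup table with a Transformer that conditions on the entire context, and often samples from the distribution rather than taking the greedy maximum.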

Cross-References

Deep Learning
Blockchain & DLT
