Concept

GPT-3

GPT-3 is a generative pre-trained transformer model introduced by OpenAI in 2020 (Brown et al., "Language Models are Few-Shot Learners"). Its largest version, GPT-3 175B, is a decoder-only transformer with 175 billion parameters, pre-trained on a dataset of roughly 0.5 trillion tokens drawn from filtered web crawl data, books, and Wikipedia.
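The 175-billion figure can be sanity-checked from the architecture hyperparameters reported in the GPT-3 paper (96 decoder layers, model width 12288, a 50257-token BPE vocabulary, context length 2048). The Python sketch below uses the standard 12·L·d² approximation for the transformer-block weights plus the two embedding matrices; it is a back-of-envelope estimate that ignores biases and LayerNorm parameters, which are negligible at this scale.

```python
# Back-of-envelope parameter count for GPT-3 175B, using the
# hyperparameters reported in Brown et al. (2020).
n_layer = 96      # decoder blocks
d_model = 12288   # hidden width
n_vocab = 50257   # BPE vocabulary size
n_ctx = 2048      # maximum context length

# Each block holds ~12 * d_model^2 weights:
#   attention: 4 * d_model^2 (Q, K, V, and output projections)
#   MLP:       8 * d_model^2 (two linear layers with 4x expansion)
block_params = 12 * n_layer * d_model**2

# Token embeddings plus learned position embeddings.
embed_params = n_vocab * d_model + n_ctx * d_model

total = block_params + embed_params
print(f"blocks:     {block_params / 1e9:.1f} B")   # ~173.9 B
print(f"embeddings: {embed_params / 1e9:.2f} B")   # ~0.64 B
print(f"total:      {total / 1e9:.1f} B")          # ~174.6 B, i.e. "175B"
```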

Updated 2026-04-21

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences