
Example of Causal Language Modeling

In causal language modeling, tokens are generated sequentially in their natural text order, without any initial source-side context. For example, to generate a sequence autonomously, the model produces the output tokens one by one: The^1 kitten^2 is^3 chasing^4 the^5 ball^6 .^7. The superscripts indicate the strict left-to-right autoregressive generation order on the target side.
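The generation loop above can be sketched in a few lines of Python. This is a minimal illustration, not a real language model: the `next_token` lookup table is a hypothetical stand-in for a model that would score the entire vocabulary given the full left context. The `<bos>`/`<eos>` markers and function names are assumptions made for this sketch.

```python
def next_token(context):
    # Hypothetical predictor: maps the most recent token to the next one.
    # A real causal LM would condition on the *entire* context and return
    # a probability distribution over the whole vocabulary.
    table = {
        "<bos>": "The", "The": "kitten", "kitten": "is",
        "is": "chasing", "chasing": "the", "the": "ball",
        "ball": ".", ".": "<eos>",
    }
    return table[context[-1]]

def generate(max_len=10):
    tokens = ["<bos>"]
    # Strict left-to-right autoregression: each step conditions only on
    # the tokens generated so far, with no source-side input.
    for _ in range(max_len):
        tok = next_token(tokens)
        if tok == "<eos>":
            break
        tokens.append(tok)
    return tokens[1:]  # drop the <bos> marker

print(" ".join(generate()))  # The kitten is chasing the ball .
```

Each call to `next_token` plays the role of one autoregressive step, matching the superscript order in the example sentence.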

Updated 2026-04-17
