Comparison of Prefix and Causal Language Modeling

Prefix Language Modeling (PrefixLM) and Causal Language Modeling (CLM) differ in how they process and generate text sequences. In CLM, the entire sequence is generated autoregressively, with each token predicted from all preceding tokens, starting from the very first one. In contrast, PrefixLM processes an initial prefix sequence bidirectionally, all at once, so every prefix token can attend to every other prefix token, creating a rich contextual representation. The remainder of the sequence is then generated autoregressively, conditioned on this encoded prefix.
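The difference can be made concrete with attention masks. A minimal sketch (function names are illustrative, not from any particular library): a CLM mask is strictly lower-triangular, while a PrefixLM mask additionally allows full bidirectional attention among the first `prefix_len` positions.

```python
import numpy as np

def causal_mask(seq_len):
    # CLM: each position attends only to itself and earlier positions.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def prefix_lm_mask(seq_len, prefix_len):
    # PrefixLM: start from a causal mask, then allow full
    # bidirectional attention within the prefix block.
    mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
    mask[:prefix_len, :prefix_len] = True
    return mask

# For a length-4 sequence with a length-2 prefix:
print(causal_mask(4).astype(int))
print(prefix_lm_mask(4, prefix_len=2).astype(int))
```

Reading the masks row by row (rows are query positions, columns are key positions), position 0 in the PrefixLM mask can attend to position 1, which the causal mask forbids; outside the prefix, both masks behave identically.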

Updated 2026-04-16

Tags

Ch.1 Pre-training - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences