Concept

Convergence Phase of LLM Scaling (Irreducible Error)

After a period of rapid improvement, the rate of error reduction slows as the model enters the convergence phase. The performance curve flattens and approaches a lower bound known as the 'irreducible error.' This floor on performance may arise from factors such as inherent noise in the dataset, fundamental ambiguity in the language task itself, or limitations of the model's architecture.
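The flattening curve can be sketched with the common power-law-plus-floor form of a scaling law, L(N) = E + A / N^α, where E is the irreducible error. This is a minimal illustration only: the symbols and the numeric values of E, A, and α below are assumptions chosen for demonstration, not measurements from any real model family.

```python
# Hypothetical scaling-law constants (illustrative, not measured):
E = 1.69      # assumed irreducible error (loss floor)
A = 400.0     # assumed scale coefficient
alpha = 0.34  # assumed scaling exponent

def loss(n_params: float) -> float:
    """Predicted loss L(N) = E + A / N**alpha for a model with N parameters."""
    return E + A / n_params**alpha

# As model size N grows, the reducible term A / N**alpha shrinks toward 0,
# so the loss flattens toward the floor E -- the convergence phase.
for n in [1e6, 1e8, 1e10, 1e12]:
    print(f"N = {n:.0e}  predicted loss = {loss(n):.3f}")
```

Running the sketch shows the gap between the predicted loss and E shrinking by orders of magnitude while the loss itself never drops below E, which is exactly the flattening described above.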

Updated 2026-05-02


Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences