
Curse of Dimensionality in Traditional Language Models

The 'curse of dimensionality' in the context of traditional language models, such as n-gram models, refers to the challenge posed by representing words as discrete, individual units. This representation leads to an extremely high-dimensional and sparse feature space: the number of possible word sequences grows exponentially with vocabulary size and context length, so even very large corpora contain only a tiny fraction of the valid n-grams. This sparsity makes it difficult for such models to generalize from the training data and to reliably estimate probabilities for n-grams never seen during training.
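The exponential blow-up and the resulting sparsity can be illustrated with a short sketch. The toy corpus below is purely illustrative; with a vocabulary of size V, there are V^n possible n-grams, but only a handful are ever observed:

```python
# Hypothetical tiny corpus used only to illustrate n-gram sparsity.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
V = len(vocab)  # vocabulary size

def observed_ngrams(tokens, n):
    """Return the set of distinct n-grams that actually occur in the tokens."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

for n in (1, 2, 3):
    possible = V ** n                      # grows exponentially with n
    seen = len(observed_ngrams(corpus, n))
    print(f"n={n}: {seen} of {possible} possible n-grams observed "
          f"({seen / possible:.1%} coverage)")
```

With a realistic vocabulary of tens of thousands of words, coverage collapses far faster: at V = 50,000 there are 1.25 × 10^14 possible trigrams, so almost any test-set trigram is unseen and would receive zero probability without smoothing.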


Updated 2026-04-18


Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences