Word embedding

Rather than representing words as discrete symbols, word embeddings map words into low-dimensional real-valued vectors. This continuous representation space makes it possible to compute with the meanings of words and word n-grams, for example by measuring distances between vectors. As a result of this distributed representation, language models are no longer burdened by the curse of dimensionality: a compact, dense neural model can represent exponentially many n-grams.
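
As a rough illustration of the idea, the sketch below implements an embedding table as a plain lookup from word IDs to dense vectors and compares word meanings with cosine similarity. The toy vocabulary, the dimension d = 3, and the random initialization are illustrative assumptions; a trained model would learn these vectors from a corpus so that related words end up close together.

```python
import numpy as np

# Toy vocabulary (an assumption for illustration); in practice it is built from a corpus.
vocab = ["king", "queen", "man", "woman"]
word_to_id = {w: i for i, w in enumerate(vocab)}

# Embedding table: one low-dimensional real-valued vector per word.
# d = 3 and random values are placeholders; training would learn these entries.
rng = np.random.default_rng(0)
d = 3
embeddings = rng.normal(size=(len(vocab), d))

def embed(word: str) -> np.ndarray:
    """Discrete word -> continuous vector: just a table lookup."""
    return embeddings[word_to_id[word]]

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Comparing meanings becomes a computation on vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def embed_ngram(words) -> np.ndarray:
    """A crude n-gram representation: the average of its word vectors (illustrative)."""
    return np.mean([embed(w) for w in words], axis=0)

# With trained embeddings, similarity("king", "queen") would typically exceed
# similarity("king", "woman"); random vectors only demonstrate the mechanics.
print(cosine_similarity(embed("king"), embed("queen")))
print(cosine_similarity(embed_ngram(["king", "man"]), embed("queen")))
```

Because any n-gram maps to a fixed-size vector rather than a distinct table entry, the model's size grows with the vocabulary and the dimension d, not with the number of possible n-grams.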
