Role of Positional Embeddings in Order-Insensitive Models

In models like the Transformer, token embeddings such as $x_i$ are position-independent: they carry no information about where the token occurs in a sequence, so the model's processing is inherently order-insensitive. To supply this positional context, positional embeddings $PE(i)$ are added to the corresponding token embeddings, allowing the model to distinguish between identical tokens at different positions.
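A minimal sketch of this idea, assuming the sinusoidal encodings from the original Transformer paper; the dimensions, vocabulary size, and token IDs below are illustrative, not from the source:

```python
import numpy as np

def sinusoidal_positional_embeddings(max_len: int, d_model: int) -> np.ndarray:
    """PE(i) for positions 0..max_len-1, following Vaswani et al. (2017)."""
    positions = np.arange(max_len)[:, None]        # shape (max_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]       # shape (1, d_model/2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even dimensions
    pe[:, 1::2] = np.cos(angles)                   # odd dimensions
    return pe

# Toy example: the same token appearing at two different positions.
d_model = 8
token_ids = [5, 7, 5]                              # token 5 occurs twice
embedding_table = np.random.randn(100, d_model)    # hypothetical vocab of 100

x = embedding_table[token_ids]                     # position-independent x_i
pe = sinusoidal_positional_embeddings(len(token_ids), d_model)
h = x + pe                                         # PE(i) added to x_i

# Without PE the two occurrences of token 5 are indistinguishable;
# after adding PE, their representations differ.
print(np.allclose(x[0], x[2]))  # True
print(np.allclose(h[0], h[2]))  # False
```

The final check makes the point concrete: the raw embeddings of the repeated token are identical, but once $PE(i)$ is added, the model receives distinct vectors for each occurrence.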

