Example of RoPE Capturing Relative Positional Information

Rotary Positional Embeddings (RoPE) are designed to capture the relative positions of tokens, a concept illustrated by the word pair 'cat' and 'sleeping' appearing in different sentences. In the first sentence, 'The₁ cat₂ is₃ sleeping₄ peacefully₅...', 'cat' is at position 2 and 'sleeping' is at position 4. In the second sentence, '...the₈ cat₉ is₁₀ sleeping₁₁ on₁₂...', the same words appear at positions 9 and 11. Although their absolute positions have changed, the relative distance between them remains 2. Because RoPE rotates each query and key vector by an angle proportional to its token's position, the angle between the rotated embeddings for 'cat' and 'sleeping', and hence their attention score, depends only on this constant relative distance. The model can therefore generalize the relationship between the two words irrespective of where the pair occurs in a sequence.
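To make this concrete, here is a minimal NumPy sketch (the `rope_rotate` helper and the 8-dimensional toy vectors are illustrative assumptions, not code from any particular library). It rotates each consecutive pair of vector dimensions by an angle proportional to the token's position, then checks that the dot product between the rotated 'cat' and 'sleeping' vectors is the same at positions (2, 4) as at positions (9, 11):

```python
import numpy as np

def rope_rotate(x: np.ndarray, pos: int, base: float = 10000.0) -> np.ndarray:
    """Rotate each consecutive pair of dimensions of x by pos * theta_i,
    using the RoPE frequency schedule theta_i = base^(-2i/d)."""
    d = x.shape[-1]
    freqs = base ** (-np.arange(0, d, 2) / d)  # one frequency per dimension pair
    angles = pos * freqs
    cos, sin = np.cos(angles), np.sin(angles)
    out = np.empty_like(x)
    out[0::2] = x[0::2] * cos - x[1::2] * sin
    out[1::2] = x[0::2] * sin + x[1::2] * cos
    return out

rng = np.random.default_rng(0)
q = rng.normal(size=8)  # toy query vector for 'cat'
k = rng.normal(size=8)  # toy key vector for 'sleeping'

# Sentence 1: 'cat' at position 2, 'sleeping' at position 4 (distance 2).
score_1 = rope_rotate(q, 2) @ rope_rotate(k, 4)
# Sentence 2: 'cat' at position 9, 'sleeping' at position 11 (distance 2).
score_2 = rope_rotate(q, 9) @ rope_rotate(k, 11)

print(np.isclose(score_1, score_2))  # True: the score depends only on the offset
```

The two scores match because composing a rotation by angle m·θ with one by n·θ leaves only the relative rotation (n - m)·θ inside the dot product, so any pair of positions separated by 2 produces the same attention score.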
