Concept

Inner Product of RoPE-Encoded Token Representations

To analyze how Rotary Positional Embeddings (RoPE) encode relative positional information, we compute the inner product between the rotated representations of tokens at positions t and s. This computation is fundamental to demonstrating that the attention score between two RoPE-encoded tokens depends only on their relative position t - s, not on their absolute positions.
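The property above can be checked numerically: rotating a query at position t and a key at position s, then shifting both positions by the same offset, should leave their inner product unchanged. Below is a minimal sketch in NumPy; the half-split pairing of dimensions and the base of 10000 follow the standard RoPE formulation, and the function name `rope` is our own.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Apply a RoPE rotation to vector x at position pos.

    Dimensions are paired as (x[i], x[i + d/2]) and each pair is
    rotated by an angle pos * base^(-2i/d), one frequency per pair.
    """
    d = x.shape[0]
    half = d // 2
    freqs = base ** (-np.arange(half) * 2.0 / d)  # per-pair frequencies
    angles = pos * freqs
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:half], x[half:]
    # standard 2-D rotation applied to each pair
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos])

rng = np.random.default_rng(0)
q, k = rng.normal(size=8), rng.normal(size=8)

t, s, shift = 5, 2, 7
a = rope(q, t) @ rope(k, s)
b = rope(q, t + shift) @ rope(k, s + shift)  # same relative offset t - s
print(np.allclose(a, b))  # True: the score depends only on t - s
```

The check works because each pair of dimensions is rotated by an angle proportional to the position, and the inner product of two rotated 2-D vectors depends only on the difference of their rotation angles, hence only on t - s.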


Updated 2026-04-29

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences