Concept

Semantic Preservation in Rotary Embeddings

In Rotary Positional Embeddings (RoPE), positional information is injected by rotating the query and key vectors in each attention head by an angle proportional to the token's position in the sequence. Because a rotation changes only a vector's direction and preserves its magnitude (length), the token's semantic content is retained. Position is encoded entirely in the rotation angle, so the dot product between a rotated query and a rotated key depends only on the relative distance between the two tokens, and the representation carries both the token's identity and its sequence position simultaneously.
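As a rough illustration of this norm-preserving rotation, below is a minimal NumPy sketch. The pairwise rotation of dimensions and the base of 10000 follow the common RoPE formulation; the function name and the example vector are illustrative assumptions, not part of the original text.

```python
import numpy as np

def rotate_pairs(x, position, base=10000.0):
    """Apply a RoPE-style rotation to a 1-D vector.

    Consecutive dimension pairs (2i, 2i+1) are rotated by an angle
    position * theta_i, where theta_i = base ** (-2i / d).
    """
    d = x.shape[0]
    assert d % 2 == 0, "embedding dimension must be even"
    # One rotation frequency per 2-D pair of dimensions.
    theta = base ** (-np.arange(0, d, 2) / d)
    angles = position * theta
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[0::2], x[1::2]
    rotated = np.empty_like(x)
    rotated[0::2] = x_even * cos - x_odd * sin
    rotated[1::2] = x_even * sin + x_odd * cos
    return rotated

# The rotation changes the vector's direction (encoding position)
# but leaves its length, and hence its semantic content, unchanged.
x = np.random.randn(8)
x_rot = rotate_pairs(x, position=5)
print(np.linalg.norm(x), np.linalg.norm(x_rot))  # equal up to float error
```

Running the snippet shows the two norms match to floating-point precision, while the rotated vector points in a position-dependent direction.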


Tags

Foundations of Large Language Models (Ch.2 Generative Models)

Computing Sciences