Learn Before
RoPE 2D Vector Rotation Formula
The Rotary Positional Embedding (RoPE) for a 2-dimensional vector $\mathbf{x} = (x_1, x_2)^\top$ is defined as a rotation by an angle $\theta$. This is achieved by multiplying the vector with a 2x2 rotation matrix $R(\theta)$. The operation and its result are given by:

$$R(\theta)\,\mathbf{x} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_1\cos\theta - x_2\sin\theta \\ x_1\sin\theta + x_2\cos\theta \end{pmatrix}$$
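As a quick sketch of this operation, the 2x2 rotation can be written in a few lines of NumPy. This is illustrative only; the function name `rope_rotate_2d` is ours, not from any library:

```python
import numpy as np

def rope_rotate_2d(x, theta):
    """Rotate a 2D vector x by angle theta, as RoPE does for one
    2-dimensional sub-block of a token embedding.
    (Illustrative sketch; the name is ours, not a library API.)"""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])  # the 2x2 rotation matrix R(theta)
    return R @ x

# Rotating the unit vector (1, 0) by 90 degrees gives approximately (0, 1).
print(rope_rotate_2d(np.array([1.0, 0.0]), np.pi / 2))
```

In a full RoPE implementation this rotation is applied independently to each consecutive pair of embedding dimensions, with a different angle per pair.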

Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Application of RoPE Rotation to a 2D Vector
RoPE Frequency Parameters
Definition of the 2x2 RoPE Rotation Matrix Block
RoPE Parameter Vector Definition
Definition of RoPE Parameter Vector (θ)
A language model encodes token positions by applying a unique, position-dependent rotational transformation to each token's initial embedding. The final, position-aware embedding for a token is the result of this transformation. If the exact same token (e.g., 'model') appears at position 4 and later at position 12 in a sequence, which statement best describes the relationship between their two final embeddings?
RoPE 2D Vector Rotation Formula
Formula for RoPE-Encoded Token Embedding
Uniqueness of RoPE-based Embeddings
Debugging a RoPE Implementation
Learn After
A 2-dimensional vector $\mathbf{x} = (x_1, x_2)^\top$ is given. Calculate the resulting vector after applying a rotational transformation with an angle $\theta$ using the following formula:

$$\text{Result} = R(\theta)\,\mathbf{x} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$$
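A worked example of this kind of calculation in NumPy, using hypothetical values $\mathbf{x} = (1, 2)$ and $\theta = \pi/4$ (the exercise's actual numbers are not reproduced here):

```python
import numpy as np

# Hypothetical example values for illustration only.
x = np.array([1.0, 2.0])
theta = np.pi / 4

# Apply the rotation formula component by component:
# result_1 = x1*cos(theta) - x2*sin(theta)
# result_2 = x1*sin(theta) + x2*cos(theta)
c, s = np.cos(theta), np.sin(theta)
result = np.array([c * x[0] - s * x[1],
                   s * x[0] + c * x[1]])
print(result)  # approx [-0.7071, 2.1213], i.e. (-sqrt(2)/2, 3*sqrt(2)/2)
```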
Analysis of a Vector Transformation Property
Consider the transformation applied to a 2-dimensional vector $\mathbf{x} = (x_1, x_2)^\top$ by an angle $\theta$, defined by the formula:

$$R(\theta)\,\mathbf{x} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$$

This transformation alters the magnitude (length) of the vector $\mathbf{x}$.
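The claim about the vector's magnitude can be checked numerically. A minimal sketch with NumPy (the angle and random seed are chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=2)   # an arbitrary 2D vector
theta = 0.37             # an arbitrary rotation angle

# Build the rotation matrix and apply it.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rotated = R @ x

# Rotation matrices are orthogonal (R^T R = I), so the Euclidean
# norm of the vector is preserved under the transformation.
print(np.linalg.norm(x), np.linalg.norm(rotated))
```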