Learn Before
Rotary Positional Embeddings
Application of RoPE to Token Embeddings
The final embedding for a token at position $i$, denoted as $\mathbf{e}_i$, is obtained by applying the Rotary Positional Embedding (RoPE) transformation to the token's original embedding $\mathbf{x}_i$. This is represented by the function $\mathrm{RoPE}(\mathbf{x}_i, i, \boldsymbol{\theta})$, where $i$ is the position and $\boldsymbol{\theta}$ represents the rotational frequency parameters. The formula is:

$$\mathbf{e}_i = \mathrm{RoPE}(\mathbf{x}_i, i, \boldsymbol{\theta})$$
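As a concrete illustration, the transformation can be sketched in NumPy. This is a minimal sketch, not the card's official implementation: it assumes the standard RoPE convention in which consecutive dimension pairs $(2k, 2k+1)$ of the embedding are rotated by angle $i \cdot \theta_k$, with frequencies $\theta_k = \text{base}^{-2k/d}$ and the conventional base of 10000; the function name `rope` is our own.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Apply a RoPE-style rotation to a d-dimensional vector x at
    integer position pos. Each dimension pair (2k, 2k+1) is rotated
    by angle pos * theta_k, where theta_k = base**(-2k/d).
    (Assumed standard convention; names here are illustrative.)"""
    d = x.shape[-1]
    assert d % 2 == 0, "embedding dimension must be even"
    k = np.arange(d // 2)
    theta = base ** (-2.0 * k / d)          # rotational frequency parameters
    angles = pos * theta                     # rotation angle per pair
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x_even * cos - x_odd * sin   # 2x2 rotation of each pair
    out[1::2] = x_even * sin + x_odd * cos
    return out

# Applying the transform: rotation changes the vector but preserves its norm,
# and position 0 leaves the embedding unchanged (all angles are zero).
x = np.random.default_rng(0).standard_normal(8)
e = rope(x, pos=5)
```

Because each dimension pair undergoes a pure rotation, the embedding's norm is unchanged; only the relative orientation between embeddings at different positions is affected.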
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Ch.3 Prompting - Foundations of Large Language Models
Related
Comparison of Rotary and Sinusoidal Embeddings
Conceptual Illustration of RoPE's Rotational Mechanism
Example of RoPE Capturing Relative Positional Information
Application of RoPE to d-dimensional Embeddings
Application of RoPE to Token Embeddings
RoPE as a Linear Combination of Periodic Functions
Learn After
Application of RoPE Rotation to a 2D Vector
RoPE Frequency Parameters
Definition of the 2x2 RoPE Rotation Matrix Block
RoPE Parameter Vector Definition