
Formula for RoPE-Encoded Token Embedding

The final Rotary Positional Embedding (RoPE) for a token embedding $\mathbf{x}_i$ at position $i$ is denoted $\mathbf{e}_i$. It is calculated by applying the RoPE transformation $\mathrm{Ro}$ to the original embedding $\mathbf{x}_i$, using the position $i$ and the set of frequency parameters $\theta$:

$$\mathbf{e}_i = \mathrm{Ro}(\mathbf{x}_i, i\theta)$$
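The transformation Ro can be sketched in a few lines of NumPy. This is a minimal illustration, not the reference implementation: it assumes the common RoPE convention in which consecutive pairs of embedding dimensions are rotated, with frequency parameters θ_k = 10000^(−2k/d).

```python
import numpy as np

def rope(x, i, base=10000.0):
    """Compute e_i = Ro(x_i, i*theta) for a single token embedding.

    x: 1-D embedding of even dimension d. Each pair (x[2k], x[2k+1]) is
    rotated by the angle i * theta_k, where theta_k = base**(-2k/d)
    (the base 10000 and the pairing scheme are assumptions here).
    """
    d = x.shape[0]
    assert d % 2 == 0, "embedding dimension must be even"
    k = np.arange(d // 2)
    theta = base ** (-2.0 * k / d)   # frequency parameters theta_k
    angles = i * theta               # rotation angle for each pair
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x_even * cos - x_odd * sin   # 2-D rotation of each pair
    out[1::2] = x_even * sin + x_odd * cos
    return out
```

Note that at position i = 0 every rotation angle is zero, so the embedding is returned unchanged, and because each step is a pure rotation, the norm of the embedding is preserved at every position.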


Updated 2025-10-08


Tags

Ch.3 Prompting - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences