Formula

Derivation of the Dot Product for RoPE-Encoded Vectors

The dot product of two RoPE-encoded vectors in 2D Euclidean space, one at position $t$ and another at position $s$, simplifies to a form that depends only on the relative position $(t-s)$. The derivation uses the matrix representation of the rotation and properties of the transpose operation:

$$\mathrm{Ro}(\mathbf{x}, t\theta)\,[\mathrm{Ro}(\mathbf{y}, s\theta)]^\mathrm{T} = \mathbf{x}R_{t\theta}\,[\mathbf{y}R_{s\theta}]^\mathrm{T} = \mathbf{x}R_{t\theta}R_{s\theta}^\mathrm{T}\,\mathbf{y}^\mathrm{T} = \mathbf{x}R_{(t-s)\theta}\,\mathbf{y}^\mathrm{T}$$

The last step uses the fact that rotation matrices are orthogonal, so $R_{s\theta}^\mathrm{T} = R_{-s\theta}$, and rotations compose additively: $R_{t\theta}R_{-s\theta} = R_{(t-s)\theta}$. This result shows that the dot product is equivalent to rotating the vector $\mathbf{x}$ by the relative angle $(t-s)\theta$ and then taking its dot product with the vector $\mathbf{y}$.
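The identity above can be checked numerically. The sketch below (the function names, the base angle `theta = 0.1`, and the test vectors are illustrative choices, not from the text) encodes two 2D row vectors at positions $t$ and $s$ and confirms that their dot product matches the single rotation by the relative angle $(t-s)\theta$:

```python
import numpy as np

def rot(angle):
    # 2D rotation matrix R_angle
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

def rope(v, pos, theta):
    # RoPE encoding of a 2D row vector: rotate it by pos * theta
    return v @ rot(pos * theta)

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
t, s, theta = 7, 3, 0.1

# Dot product of the two encoded vectors ...
lhs = rope(x, t, theta) @ rope(y, s, theta)
# ... equals x rotated by the relative angle, dotted with y
rhs = (x @ rot((t - s) * theta)) @ y
print(np.isclose(lhs, rhs))  # True
```

Because only the difference $t - s$ enters, shifting both positions by the same offset leaves the dot product unchanged, which is the relative-position property RoPE is designed to provide.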

Updated 2026-04-29


Ch.2 Generative Models - Foundations of Large Language Models
