A key property of certain positional embeddings is that the dot product between two encoded vectors depends only on their relative position. The derivation of this property involves several steps. Arrange the following mathematical expressions into the correct logical sequence for simplifying the dot product of two 2D vectors, x and y, which have been rotated by angles tθ and sθ, respectively. The rotated vectors are given by xRtθ and yRsθ.
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A student is simplifying the dot product between two 2D vectors, x and y, which have been encoded with rotational positional information at positions t and s, respectively. The encoded vectors are given by x' = xRtθ and y' = yRsθ, where Rα is a rotation matrix for angle α. The student's derivation for the dot product x' ⋅ y' is shown below. Identify the step that contains a mathematical error.
Step 1: x' ⋅ y' = (xRtθ)(yRsθ)ᵀ
Step 2: = xRtθ(Rsθ)ᵀyᵀ
Step 3: = xRtθRsθyᵀ
Step 4: = xR(t+s)θyᵀ
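The claimed steps can be checked numerically. The sketch below (an illustration, not part of the original card) builds 2D rotation matrices with NumPy and confirms that the dot product of the rotated row vectors equals xR((t−s)θ)yᵀ, not the xR((t+s)θ)yᵀ of Step 4 — exposing Step 3, where (Rsθ)ᵀ was treated as Rsθ instead of R(−sθ):

```python
import numpy as np

def R(alpha):
    # Standard 2D rotation matrix for angle alpha.
    return np.array([[np.cos(alpha), -np.sin(alpha)],
                     [np.sin(alpha),  np.cos(alpha)]])

theta = 0.1
t, s = 7, 3
x = np.array([[1.0, 2.0]])   # row vectors, as in the derivation
y = np.array([[3.0, -1.0]])

# Left-hand side: dot product of the two rotated vectors.
lhs = (x @ R(t * theta)) @ (y @ R(s * theta)).T

# Correct simplification: R^T is the inverse rotation, so the
# angles subtract and only the relative position t - s matters.
correct = x @ R((t - s) * theta) @ y.T

# The student's Step 4 result, with the angles added instead.
wrong = x @ R((t + s) * theta) @ y.T

assert np.allclose(lhs, correct)      # holds: depends on t - s
assert not np.allclose(lhs, wrong)    # fails: Step 3 dropped the transpose
```

The key fact the check relies on is that a rotation matrix is orthogonal, so (Rsθ)ᵀ = (Rsθ)⁻¹ = R(−sθ).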
Justification of RoPE Dot Product Simplification
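The simplification the title refers to can be written out as a short derivation (a sketch using the standard 2D rotation matrix R_α, whose transpose equals its inverse R_{−α}):

```latex
\begin{aligned}
x' \cdot y' &= (x R_{t\theta})(y R_{s\theta})^\top \\
            &= x R_{t\theta} R_{s\theta}^\top y^\top
               && \text{since } (AB)^\top = B^\top A^\top \\
            &= x R_{t\theta} R_{-s\theta} y^\top
               && \text{since } R_\alpha^\top = R_\alpha^{-1} = R_{-\alpha} \\
            &= x R_{(t-s)\theta}\, y^\top
               && \text{since } R_\alpha R_\beta = R_{\alpha+\beta}
\end{aligned}
```

so the dot product depends only on the relative position t − s, which is the property RoPE is designed to provide.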