Learn Before
  • Derivation of the Dot Product for RoPE-Encoded Vectors

A key property of certain positional embeddings is that the dot product between two encoded vectors depends on their relative position. The derivation for this property involves several steps. Arrange the following mathematical expressions to show the correct logical sequence for simplifying the dot product of two 2D vectors, x and y, which have been rotated by angles tθ and sθ, respectively. The rotated vectors are given by x' = xR_tθ and y' = yR_sθ.
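One standard sequence for this simplification (a sketch reconstructed from the usual RoPE derivation, with x rotated by tθ and y by sθ in the row-vector convention) is:

```latex
x' \cdot y' = (x R_{t\theta})(y R_{s\theta})^{\top}
            = x R_{t\theta} R_{s\theta}^{\top} y^{\top}
            = x R_{t\theta} R_{-s\theta} y^{\top}
            = x R_{(t-s)\theta} y^{\top}
```

The final expression depends only on the offset t − s, which is exactly the relative-position property the card refers to.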


Tags
  • Ch.2 Generative Models - Foundations of Large Language Models
  • Foundations of Large Language Models
  • Foundations of Large Language Models Course
  • Computing Sciences
  • Analysis in Bloom's Taxonomy
  • Cognitive Psychology
  • Psychology
  • Social Science
  • Empirical Science
  • Science

Related
  • A student is simplifying the dot product between two 2D vectors, x and y, which have been encoded with rotational positional information at positions t and s, respectively. The encoded vectors are given by x' = xR_tθ and y' = yR_sθ, where R_α is a rotation matrix for angle α. The student's derivation for the dot product x'·y' is shown below. Identify the step that contains a mathematical error.

    Step 1: x'·y' = (xR_tθ)(yR_sθ)^T
    Step 2: = xR_tθ(R_sθ)^T y^T
    Step 3: = xR_tθ R_−sθ y^T
    Step 4: = xR_(t+s)θ y^T

  • A key property of certain positional embeddings is that the dot product between two encoded vectors depends on their relative position. The derivation for this property involves several steps. Arrange the following mathematical expressions to show the correct logical sequence for simplifying the dot product of two 2D vectors, x and y, which have been rotated by angles tθ and sθ, respectively. The rotated vectors are given by x' = xR_tθ and y' = yR_sθ.

  • Justification of RoPE Dot Product Simplification
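The relative-position property behind these cards can be checked numerically. Below is a minimal sketch; the rotation convention, base angle θ, positions t and s, and sample vectors are assumptions for illustration, not from the cards:

```python
import math

# Hypothetical helpers for 2D row-vector rotation (illustrative, not from the card).
def rot(a):
    """2x2 rotation matrix for angle a; rot(a) is orthogonal, so rot(a)^T = rot(-a)."""
    return ((math.cos(a), -math.sin(a)),
            (math.sin(a),  math.cos(a)))

def vecmat(v, M):
    """Row vector v times 2x2 matrix M."""
    return (v[0] * M[0][0] + v[1] * M[1][0],
            v[0] * M[0][1] + v[1] * M[1][1])

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

theta, t, s = 0.3, 5, 2          # assumed base angle and positions
x, y = (1.0, 2.0), (0.5, -1.0)   # arbitrary sample vectors

# Dot product of the two encoded vectors x R_tθ and y R_sθ ...
lhs = dot(vecmat(x, rot(t * theta)), vecmat(y, rot(s * theta)))
# ... equals x R_(t-s)θ y^T, which depends only on the offset t - s ...
rhs = dot(vecmat(x, rot((t - s) * theta)), y)
# ... while combining the angles as (t+s)θ does not match.
bad = dot(vecmat(x, rot((t + s) * theta)), y)

assert math.isclose(lhs, rhs)
assert not math.isclose(lhs, bad)
```

Any choice of θ, t, s, x, and y exhibits the same behavior, since rot(a) @ rot(b)^T = rot(a − b) holds for all angles.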