The application of a rotary positional embedding to a two-dimensional vector $\mathbf{x} = (x_1, x_2)$ at position $m$ with frequency $\theta$ results in a new vector $\mathbf{x}' = (x_1', x_2')$, where the components are calculated as:

$x_1' = x_1 \cos(m\theta) - x_2 \sin(m\theta)$
$x_2' = x_1 \sin(m\theta) + x_2 \cos(m\theta)$
Based on this structure, which statement best analyzes how the output vector's components are formed?
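A minimal sketch of this transformation, assuming the standard 2D RoPE formulation above (the function name `rope_2d` and its parameters are illustrative, not from the source):

```python
import math

def rope_2d(x1, x2, m, theta):
    """Rotate the 2D vector (x1, x2) by the position-dependent angle m*theta."""
    c, s = math.cos(m * theta), math.sin(m * theta)
    # Each output component mixes both inputs, weighted by cos/sin of m*theta.
    return (x1 * c - x2 * s, x1 * s + x2 * c)

# Position 0 gives a rotation by angle 0, leaving the vector unchanged.
print(rope_2d(1.0, 2.0, 0, 0.5))  # (1.0, 2.0)
```

Because the map is a rotation, it preserves the vector's norm; only the direction changes with position.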
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Definition of the RoPE Cosine Vector
Definition of the RoPE Sine Vector
Periodicity of RoPE's Sine and Cosine Components
A positional encoding method transforms a two-dimensional vector $(x_1, x_2)$ at position $m$ into a new vector $(x_1', x_2')$ using the equations:

$x_1' = x_1 \cos(m\theta) - x_2 \sin(m\theta)$
$x_2' = x_1 \sin(m\theta) + x_2 \cos(m\theta)$

This transformation is linear with respect to the input vector (it is a rotation by the angle $m\theta$); the trigonometric functions act on the position $m$, not on the input components.
Matrix Representation of a 2D Rotary Transformation
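The matrix representation and the linearity of the rotation can be checked with a short sketch (the helper names `rope_matrix` and `apply` and the test vectors are illustrative assumptions):

```python
import math

def rope_matrix(m, theta):
    """2x2 rotation matrix for position m and frequency theta."""
    c, s = math.cos(m * theta), math.sin(m * theta)
    return [[c, -s], [s, c]]

def apply(R, v):
    """Multiply a 2x2 matrix by a 2D vector."""
    return [R[0][0] * v[0] + R[0][1] * v[1],
            R[1][0] * v[0] + R[1][1] * v[1]]

# Linearity in the input: R(a*x + b*y) == a*R(x) + b*R(y).
R = rope_matrix(3, 0.2)
x, y, a, b = [1.0, -2.0], [0.5, 4.0], 2.0, 3.0
lhs = apply(R, [a * x[0] + b * y[0], a * x[1] + b * y[1]])
rhs = [a * apply(R, x)[i] + b * apply(R, y)[i] for i in range(2)]
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
```

The check passes because matrix multiplication is linear in the vector; the trigonometric dependence lives entirely in the matrix entries, which are functions of position.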