Short Answer

Invariance in Rotational Position Encodings

A language model uses a rotational scheme to encode token positions. An analyst observes that the inner product between the encoded representations of the word 'apple' at position 4 and 'banana' at position 9 is identical to the inner product between the encoded representations of 'apple' at position 21 and 'banana' at position 26. Explain the mathematical reason for this observation.
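The key fact is that a rotation applied at position m followed by the transpose of a rotation at position n composes into a single rotation by the relative offset n − m, so the inner product depends only on that offset (here 9 − 4 = 26 − 21 = 5). The following is a minimal NumPy sketch of a rotary position encoding in the spirit of RoPE; the vector dimension, frequency base, and the `rope` helper are illustrative, not the model's actual implementation.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Rotate each (x1_i, x2_i) pair of dimensions of x by the angle
    pos * theta_i, where theta_i is a fixed per-pair frequency."""
    half = x.shape[0] // 2
    theta = base ** (-np.arange(half) / half)   # per-pair frequencies
    angles = pos * theta
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:half], x[half:]
    # 2-D rotation applied independently in each frequency subspace
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos])

rng = np.random.default_rng(0)
apple = rng.normal(size=8)    # stand-in embedding for 'apple'
banana = rng.normal(size=8)   # stand-in embedding for 'banana'

# Both pairs have relative offset 5, so the inner products coincide.
ip_a = np.dot(rope(apple, 4), rope(banana, 9))
ip_b = np.dot(rope(apple, 21), rope(banana, 26))
assert np.isclose(ip_a, ip_b)
```

Because each per-pair rotation matrix R(pos) is orthogonal and R(m)ᵀR(n) = R(n − m), the product ⟨R(m)x, R(n)y⟩ = xᵀR(n − m)y is a function of n − m alone, which is exactly the invariance the analyst observed.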


Updated 2025-10-02


Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science