True/False

In a system that encodes token positions by rotating their vector representations, the dot product between the encoded vector for a token at position t and another at position s depends only on their relative displacement (t - s). Based on this property, the dot product for a pair of tokens at positions 5 and 8 would be identical to the dot product for the same pair of tokens located at positions 15 and 18.

False

True
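The property in the question can be checked numerically. The sketch below is a minimal illustration, not an actual RoPE implementation: it rotates a 2-D query and key by angles proportional to their positions (the angle scale `theta` is an arbitrary assumed constant) and compares the dot products for the two position pairs, which share the same displacement of 3.

```python
import math

THETA = 0.5  # assumed per-position rotation angle (illustrative only)

def rotate(vec, pos):
    """Rotate a 2-D vector by pos * THETA, mimicking rotary position encoding."""
    angle = pos * THETA
    c, s = math.cos(angle), math.sin(angle)
    x, y = vec
    return (c * x - s * y, s * x + c * y)

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

q = (1.0, 2.0)   # example query vector
k = (3.0, -1.0)  # example key vector

# Both pairs have relative displacement 8 - 5 = 18 - 15 = 3.
d1 = dot(rotate(q, 5), rotate(k, 8))
d2 = dot(rotate(q, 15), rotate(k, 18))
print(abs(d1 - d2) < 1e-9)  # True: the dot product depends only on t - s
```

The equality follows because the product of two rotations is a rotation by the angle difference, so the dot product reduces to a function of (t - s) alone.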

Updated 2025-10-02

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Application in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science