Learn Before
  • Positional Representations of Transformers

Rotary Positional Embeddings

Like sinusoidal embeddings, rotary positional embeddings (RoPE) use fixed, hard-coded values to represent positions. However, instead of adding positional vectors to token embeddings, RoPE encodes positional context by rotating the token embeddings in a complex vector space. Positional information is therefore integrated multiplicatively, distinguishing RoPE from the additive approach common in other methods. Because rotations compose, the dot product between two rotated embeddings depends only on the difference of their positions, which is how RoPE captures relative positional information.
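
Below is a minimal NumPy sketch of this rotation, assuming the standard pair-wise (2-D block) formulation and the conventional base of 10000; the function name rope_rotate and the small relative-offset check are illustrative, not taken from this page.

```python
# Minimal sketch of RoPE's rotation step (NumPy). The pair-wise rotation and the
# base of 10000 follow the common RoPE convention; rope_rotate is an illustrative
# name, not an API from any particular library.
import numpy as np

def rope_rotate(x: np.ndarray, position: int, base: float = 10000.0) -> np.ndarray:
    """Rotate consecutive (even, odd) dimension pairs of x by position-dependent angles."""
    d = x.shape[-1]
    assert d % 2 == 0, "RoPE pairs up dimensions, so d must be even"
    # One fixed frequency per dimension pair, exactly as in sinusoidal embeddings.
    freqs = base ** (-np.arange(0, d, 2) / d)
    angles = position * freqs                    # angle for pair i: position * base^(-2i/d)
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    rotated = np.empty_like(x)
    # 2-D rotation applied independently to each pair (multiplicative, not additive).
    rotated[..., 0::2] = x_even * cos - x_odd * sin
    rotated[..., 1::2] = x_even * sin + x_odd * cos
    return rotated

# The relative-position property: dot products of rotated vectors depend only on
# the offset between their positions (here both offsets equal 2).
rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
score_a = rope_rotate(q, position=3) @ rope_rotate(k, position=1)
score_b = rope_rotate(q, position=7) @ rope_rotate(k, position=5)
print(np.isclose(score_a, score_b))  # True
```

The final check illustrates the property described above: both query/key pairs are two positions apart, so their scores match even though the absolute positions differ.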

Tags
  • Ch.2 Generative Models - Foundations of Large Language Models

  • Foundations of Large Language Models

  • Foundations of Large Language Models Course

  • Computing Sciences

Related
  • Relative Positional Representations

  • Implicit Positional Representations

  • Other representations of positional information in transformers

  • Learnable Absolute Positional Embeddings

  • Rotary Positional Embeddings

Learn After
  • Comparison of Rotary and Sinusoidal Embeddings

  • Conceptual Illustration of RoPE's Rotational Mechanism

  • Example of RoPE Capturing Relative Positional Information

  • Application of RoPE to d-dimensional Embeddings

  • Application of RoPE to Token Embeddings

  • RoPE as a Linear Combination of Periodic Functions