Positional Representations of Transformers

Positional representations are mechanisms that inject information about token order into Transformers. They are necessary because self-attention is permutation-invariant: without positional information, the model would treat a sequence and any reordering of its tokens identically, even though word order clearly matters when modeling text.
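
As a concrete example of the absolute family, the original Transformer adds fixed sinusoidal encodings to the token embeddings. A minimal NumPy sketch (the function name and shapes are illustrative, and it assumes an even model dimension):

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal encodings from "Attention Is All You Need".

    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    assert d_model % 2 == 0, "illustrative sketch assumes an even d_model"
    positions = np.arange(max_len)[:, None]          # (max_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]         # (1, d_model / 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                     # even dimensions
    pe[:, 1::2] = np.cos(angles)                     # odd dimensions
    return pe

# Usage: add to the token embeddings before the first attention layer.
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```

Because each dimension is a sinusoid of a different wavelength, nearby positions receive similar vectors, which is why this fixed scheme works without any learned parameters.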

Types of Positional Representations:

  • absolute positional representations
  • relative position representations (see the sketch after this list)
  • implicit representations
  • other representations
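
Relative position representations, by contrast, encode the offset j − i between query position i and key position j rather than absolute indices, typically by adding a learned term to the attention logits (as in Shaw et al., 2018, or the bucketed bias in T5). A toy sketch of a clipped relative bias, with a random table standing in for learned parameters:

```python
import numpy as np

def relative_position_bias(seq_len: int, max_distance: int = 8) -> np.ndarray:
    """Toy relative-position bias added to attention logits.

    One scalar per clipped relative distance in [-max_distance, max_distance];
    the table is random here purely for illustration (it would be learned).
    """
    rng = np.random.default_rng(0)
    table = rng.normal(size=2 * max_distance + 1)    # stand-in for learned weights
    rel = np.arange(seq_len)[None, :] - np.arange(seq_len)[:, None]  # offsets j - i
    rel = np.clip(rel, -max_distance, max_distance) + max_distance   # shift to [0, 2k]
    return table[rel]                                # (seq_len, seq_len) bias matrix

# Usage inside an attention head (d is the per-head dimension):
# logits = q @ k.T / np.sqrt(d) + relative_position_bias(seq_len)
```

Because the bias depends only on the offset, the same parameters are shared across positions and, with clipping, can generalize to sequences longer than those seen during training.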

Updated 2025-09-01

Tags

Data Science