Concept

Modeling Arbitrarily Long Sequences with ALiBi

The Attention with Linear Biases (ALiBi) mechanism works by adding a fixed scalar penalty to the query-key product $\mathbf{q}_i \mathbf{k}_{j}^{\mathrm{T}}$ for each step the key position $j$ moves away from the query position $i$. Because it relies on this consistent, distance-proportional penalty rather than a predetermined length limit, the model is not tied to a restricted range of sequence lengths and can be seamlessly applied to arbitrarily long sequences.
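Concretely, for the $i$-th query the biased attention is $\mathrm{softmax}\left(\mathbf{q}_i \mathbf{K}^{\mathrm{T}} + m \cdot \left[-(i-1), \ldots, -2, -1, 0\right]\right)$, where $\mathbf{K}$ stacks the keys for positions $1, \ldots, i$ and $m$ is a head-specific slope fixed before training (the ALiBi paper of Press et al. suggests a geometric sequence such as $2^{-8h/n}$ for head $h$ of $n$). The PyTorch sketch below builds such a bias tensor under those assumptions; alibi_bias is an illustrative name, not code from the paper or any library.

import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    # Head-specific slopes: a geometric sequence 2^(-8*1/n), ..., 2^(-8)
    # for n heads, following the schedule suggested in the ALiBi paper
    # when n is a power of two (an assumption of this sketch).
    slopes = torch.tensor([2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)])
    # Distance (i - j) from each query position i back to each key position j;
    # future positions (j > i) are clamped to 0 since the causal mask hides them.
    positions = torch.arange(seq_len)
    distance = (positions.view(-1, 1) - positions.view(1, -1)).clamp(min=0)
    # Linear penalty: 0 on the diagonal, one extra slope unit per step of distance.
    return -slopes.view(-1, 1, 1) * distance  # shape: (num_heads, seq_len, seq_len)

# Usage: add the bias to the raw scores before the causal mask and softmax, e.g.
#   scores = q @ k.transpose(-2, -1) / d_head ** 0.5 + alibi_bias(num_heads, seq_len)

Because the penalty depends only on relative distance, the same function yields a valid bias for any seq_len, which is what allows a model trained on short sequences to run on longer ones.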

Tags

Foundations of Large Language Models
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences