Short Answer

The Role of Global Tokens in Mitigating 'Spiky' Attention

An AI researcher observes that a model processing very long input sequences exhibits highly 'spiky' attention weight distributions, in which a handful of tokens receive almost all of the attention mass, leading to unstable training. Explain precisely how introducing global tokens would address this issue by altering the output of the function used to compute these attention weights.
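
For orientation, the function in question is the softmax: attention weights for a query are softmax(z)_i = exp(z_i) / sum_j exp(z_j) over its query-key scores, so a few large logits can capture nearly all of the probability mass. Trainable global tokens (as in, e.g., Longformer or BigBird) are attendable by every query, which adds a competitive term to every softmax denominator and gives each query an always-available place to route attention, flattening the peaks. The sketch below illustrates this numerically; the logit values and the global token's logit are assumptions chosen for illustration, not values from the source.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

def entropy(p):
    # Shannon entropy in nats; higher means a flatter, less "spiky" distribution.
    return float(-(p * np.log(p + 1e-12)).sum())

# Hypothetical query-key logits for one query over 8 local tokens.
# One logit dominates, so the softmax output is extremely peaked.
local_logits = np.array([9.0, 1.0, 0.5, 0.2, 0.1, 0.0, -0.3, -0.5])

spiky = softmax(local_logits)

# Append the logit of a global token (value assumed for illustration).
# Every query can attend to this token, so it contributes a competitive
# term to the softmax denominator and absorbs excess attention mass.
global_logit = 8.0
smoothed = softmax(np.append(local_logits, global_logit))

print(f"peak weight without global token: {spiky.max():.4f}")     # ~0.999
print(f"peak weight with global token:    {smoothed.max():.4f}")  # ~0.73
print(f"entropy without: {entropy(spiky):.4f}")
print(f"entropy with:    {entropy(smoothed):.4f}")
```

In a trained model the global token's scores are learned rather than fixed, so it can act as an attention sink that soaks up mass the local tokens do not need; a plausible training benefit, consistent with the question's framing, is that the less saturated softmax yields better-conditioned gradients.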

Updated 2025-10-07

Tags: Ch.2 Generative Models - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Analysis in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science