Short Answer

Derivation of Quadratic Complexity in Autoregressive Attention

When an autoregressive model generates a sequence of text, the computational cost of the self-attention mechanism at each decoding step grows linearly with the number of tokens already generated. The total computational cost of generating the entire sequence, however, grows quadratically with the final sequence length. Explain precisely why this is the case, and how the number of layers in the model affects the total cost.
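For concreteness, the claim can be checked with a small counting sketch (not part of the original question; the function attention_cost and the parameter names seq_len and num_layers are hypothetical, chosen for illustration). With a key-value cache, the new query at step t attends over all t tokens so far, so the total work over n steps is 1 + 2 + ... + n = n(n+1)/2, which is Θ(n²); a model with L layers repeats this at every layer, giving L · n(n+1)/2.

```python
# Minimal sketch: count attention "unit costs" during autoregressive
# decoding with a KV cache. At step t the new query attends to all t
# cached tokens, so each step is linear in t, while the sum over all
# n steps is n(n+1)/2, i.e. quadratic in n. The layer count is a
# simple multiplicative factor on every step.

def attention_cost(seq_len: int, num_layers: int = 1) -> int:
    """Total attention cost units to generate seq_len tokens."""
    total = 0
    for t in range(1, seq_len + 1):   # decoding step t
        per_step = t                  # query attends to t tokens (linear in t)
        total += num_layers * per_step
    return total

for n in (128, 256, 512):
    c = attention_cost(n)
    # The ratio total / n^2 approaches 0.5, the constant hidden by O(n^2).
    print(f"n={n:4d}  total={c:8d}  total/n^2={c / n**2:.3f}")
```

Doubling seq_len roughly quadruples the total, and doubling num_layers exactly doubles it, matching the L · n(n+1)/2 count above.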


Tags: Ch.5 Inference - Foundations of Large Language Models, Foundations of Large Language Models, Foundations of Large Language Models Course, Computing Sciences, Analysis in Bloom's Taxonomy, Cognitive Psychology, Psychology, Social Science, Empirical Science, Science