Short Answer

Debugging a Causal Attention Calculation

A developer is implementing an autoregressive model and finds a bug. For a sequence of 4 tokens (indexed 0, 1, 2, 3), the attention output for the token at position 2 is being calculated using the following weighted sum:

Output_at_pos_2 = (0.1 * v_0) + (0.2 * v_1) + (0.6 * v_2) + (0.1 * v_3)

where v_j is the value vector for the token at position j. Identify the specific term in this expression that violates the core principle of causal (masked) attention and explain why its inclusion is incorrect for an autoregressive task.
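To make the principle concrete, below is a minimal NumPy sketch of causal (masked) self-attention for a 4-token sequence. The random Q, K, V matrices and variable names are illustrative assumptions, not part of the question. The key point: because position i may only attend to positions j <= i, the mask drives the attention weight on every future position to exactly zero, so a correct implementation can never place nonzero weight on v_3 when computing the output at position 2.

```python
# Minimal sketch of causal self-attention for a 4-token sequence.
# Q, K, V are toy random matrices (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 4, 8
Q = rng.normal(size=(seq_len, d))
K = rng.normal(size=(seq_len, d))
V = rng.normal(size=(seq_len, d))

# Raw scaled dot-product scores.
scores = Q @ K.T / np.sqrt(d)

# Causal mask: position i may attend only to positions j <= i.
# The strictly upper triangle (j > i) marks future positions.
mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
scores[mask] = -np.inf  # future positions get zero weight after softmax

# Row-wise softmax (numerically stabilized).
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

output = weights @ V

# For position 2, the weight on v_3 is exactly 0.0:
print(np.round(weights[2], 3))
```

In practice, frameworks implement this by adding a large negative value (or -inf) to the masked logits before the softmax, which is exactly what the sketch does; any nonzero coefficient on a future value vector, like the 0.1 on v_3 in the question, means this mask was not applied.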

Tags: Ch.2 Generative Models (Foundations of Large Language Models Course); Computing Sciences; Analysis in Bloom's Taxonomy
