Multiple Choice

A Transformer-based language model is given the prompt 'The quick brown fox' and begins generating a continuation. It has already produced the tokens 'jumps' and 'over', and is now at the step of generating the next token after 'over'. During the self-attention calculation at this specific step, which set of tokens supplies the keys and values that the current position's query attends to?
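For intuition, the sketch below walks through this exact decoding step. It is a minimal single-head NumPy illustration with random weights; the dimension `d`, the stand-in embeddings, and the cache variable names are assumptions chosen for the example, not any particular model's internals.

```python
# Minimal sketch (single attention head, random weights) of the attention
# computation at one decoding step, using a KV cache as decoders typically do.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                   # head dimension (assumed for the example)
tokens = ["The", "quick", "brown", "fox", "jumps", "over"]

W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))
embed = {t: rng.standard_normal(d) for t in tokens}  # stand-in embeddings

# KV cache: keys and values for every token processed so far,
# prompt tokens and already-generated tokens alike.
K_cache = np.stack([embed[t] @ W_k for t in tokens])  # shape (6, d)
V_cache = np.stack([embed[t] @ W_v for t in tokens])  # shape (6, d)

# The query is computed only from the newest position ("over"); it attends
# to the cached keys/values of all six tokens, itself included.
q = embed["over"] @ W_q
scores = K_cache @ q / np.sqrt(d)       # one score per cached position
weights = np.exp(scores - scores.max())
weights /= weights.sum()                # softmax over the 6 positions
context = weights @ V_cache             # attention output for this step

print({t: round(float(w), 3) for t, w in zip(tokens, weights)})
```

Under causal self-attention, this is the general pattern: at each generation step the new position's query scores against keys from the entire sequence so far, so the cache only ever grows by one key/value pair per step.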


Updated 2025-09-28


Tags

Ch.5 Inference - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science