Set of Sequential Key-Value Pairs
This represents the collection of key-value vector pairs for all positions up to and including index i within a sequence. The notation {(k_1, v_1), ..., (k_i, v_i)}, with 1 ≤ i ≤ \tau, illustrates this set, where \tau is the total length of the sequence. This structure is fundamental in attention mechanisms, particularly in autoregressive decoding, where it is used to cache past key-value states so that subsequent steps can be computed efficiently.
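The growth of this set during decoding can be sketched in a few lines; a minimal illustration with hypothetical dimensions and random hidden states standing in for a real model's activations:

```python
import numpy as np

d = 4          # head dimension (chosen for illustration)
tau = 5        # total sequence length
rng = np.random.default_rng(0)

# Hypothetical projection matrices; in a real model these are learned.
W_k = rng.standard_normal((d, d))
W_v = rng.standard_normal((d, d))

cache = []  # the set {(k_1, v_1), ..., (k_i, v_i)}, grown one step at a time
for i in range(1, tau + 1):
    h_i = rng.standard_normal(d)   # hidden state of position i
    k_i = h_i @ W_k                # key vector for position i
    v_i = h_i @ W_v                # value vector for position i
    cache.append((k_i, v_i))       # extend the cache through index i
    assert len(cache) == i         # at step i the cache holds exactly i pairs
```

At any step i ≤ \tau, the cache is exactly the set the notation describes, so later steps can read past keys and values instead of recomputing them.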

Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Query (Attention)
Key (Attention)
Value (Attention)
State Function from Previous Outputs
Value Weight Matrix Formula
Set of Sequential Key-Value Pairs
Query Vector
Key Vector
Value Vector
Implicit Relative Position Modeling in Self-Attention with RoPE
Value Weight Matrix Definition ()
Imagine a system translating the sentence 'The quick brown fox jumps'. When the system is generating the output word corresponding to 'jumps', it needs to determine which words in the input sentence are most relevant. To do this, a vector representing the current translation context (i.e., 'what information do I need to produce the next word?') is compared against a set of searchable 'label' vectors, one for each word in the input sentence. This comparison generates a relevance score for each input word. Finally, a new vector is created by taking a weighted average of the 'content' vectors of the input words, using the relevance scores as weights. How do the three main vector types in this process correspond to their roles?
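The comparison-and-weighted-average process described above is scaled dot-product attention for a single query. A minimal sketch, assuming a hypothetical dimension of 8 and random vectors in place of learned representations:

```python
import numpy as np

def attention(q, K, V):
    """Single-query scaled dot-product attention.

    q: (d,)   query vector  -- "what information do I need next?"
    K: (n, d) key vectors   -- searchable labels, one per input word
    V: (n, d) value vectors -- content to be mixed into the output
    """
    scores = K @ q / np.sqrt(q.shape[0])   # relevance score per input word
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over positions
    return weights @ V                     # weighted average of the values

rng = np.random.default_rng(1)
q = rng.standard_normal(8)
K = rng.standard_normal((5, 8))  # one key per word, e.g. "The quick brown fox jumps"
V = rng.standard_normal((5, 8))  # one value per word
out = attention(q, K, V)         # context-aware output vector, shape (8,)
```

The relevance scores are the query-key dot products, and the output is the value vectors averaged under the softmax of those scores, matching the roles in the question.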
In a system designed to answer questions based on a provided document, the model first creates a representation of the user's question. It then compares this representation against a set of searchable representations, one for each sentence in the document, to determine relevance scores. Finally, it constructs an answer by creating a weighted combination of the informational content from each sentence, using the relevance scores as weights. Which option correctly assigns the roles of Query, Key, and Value vectors in this scenario?
Context Window of Key Vectors Notation
Key-Value Cache
In a computational mechanism designed to selectively focus on different parts of an input sequence, information is represented by three distinct types of vectors that interact to produce a context-aware output. Match each vector type to its specific role in this process.
Masked QKV Attention Formula
Set of Sequential Key-Value Pairs
Let a sequence of vectors be constructed where the first element is and the second element is . The third element has multiple potential versions, and the 5th version is given as . According to the notational definition , what is the specific sequence represented by when using the 5th version of the 3rd element?
Key Matrix for Causal Attention (K_≤i)
Deconstructing Vector Prefix Notation
Key-Value Cache
Consider a sequence of vectors represented as v_1, v_2, ..., v_\tau. The prefix notation v_≤2 represents the subsequence containing only the first two vectors, (v_1, v_2).
Learn After
Set of Indexed Key-Value Pairs
Set of Superscript-Indexed Vectors
Set of Key-Value Pairs
Function of a Sequence of Overlined Variables
Function of a Sequence of Averaged Vectors
Vector Slice Notation for a Sequence Window ()
Set of Sequential Vectors Notation
Vector Sequence Window Notation
Consider an autoregressive model generating a sequence of tokens one by one. At each step i, the model calculates attention using the query from the current token and the keys and values from all tokens generated so far (from position 1 to i). To optimize this process, the model maintains a growing set of all previously computed key and value vectors. What is the primary computational advantage of this strategy?
State of an Autoregressive Cache
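The advantage can be made concrete by counting projection operations with and without the cache; a small sketch under assumed dimensions (d = 8, 6 decoding steps):

```python
import numpy as np

d, steps = 8, 6
rng = np.random.default_rng(2)
W_k = rng.standard_normal((d, d))
W_v = rng.standard_normal((d, d))

hidden = [rng.standard_normal(d) for _ in range(steps)]

# Without a cache: step i recomputes keys and values for all i positions,
# so the projection count grows as 2 * (1 + 2 + ... + steps).
naive_projections = sum(2 * i for i in range(1, steps + 1))

# With a cache: step i projects only the newest position and reuses the rest.
K_cache, V_cache = [], []
cached_projections = 0
for h in hidden:
    K_cache.append(h @ W_k)
    V_cache.append(h @ W_v)
    cached_projections += 2   # one key + one value projection per step

print(naive_projections, cached_projections)  # quadratic vs linear growth
```

Caching turns the per-step key/value projection cost from linear in the sequence length into constant, so total projection work over a sequence drops from quadratic to linear.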
An autoregressive language model with τ parallel computational units (e.g., attention heads) is generating a sequence of tokens. After computing the output for the 3rd token, the model stores the key and value vectors from all tokens processed so far to use in subsequent steps. Which of the following notations correctly represents the complete set of these stored key-value pairs at this specific moment?