Learn Before
Context Window of Key Vectors Notation
The notation { \mathbf{k}_{i-n_c+1}, \dots, \mathbf{k}_i } represents a set of key vectors, denoted by the bold letter \mathbf{k}. This set comprises all key vectors within a specific context window, starting from the index i - n_c + 1 and ending at the index i. The size of this context window is determined by n_c.
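A minimal sketch of this notation in code, assuming the keys are stored as rows of an array and that i >= n_c - 1 (the function name and the demo values are illustrative, not from the source card):

```python
import numpy as np

def key_context_window(keys: np.ndarray, i: int, n_c: int) -> np.ndarray:
    """Return the set {k_{i-n_c+1}, ..., k_i}: the last n_c key vectors
    up to and including the current position i."""
    start = i - n_c + 1          # first index in the window
    return keys[start : i + 1]   # rows start, start+1, ..., i

# Hypothetical example: 12 key vectors of dimension 4,
# current position i = 6, window size n_c = 3 -> indices 4, 5, 6.
keys = np.random.randn(12, 4)
window = key_context_window(keys, i=6, n_c=3)
print(window.shape)  # (3, 4)
```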

Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Computing Sciences
Related
Query (Attention)
Key (Attention)
Value (Attention)
State Function from Previous Outputs
Value Weight Matrix Formula
Set of Sequential Key-Value Pairs
Query Vector
Key Vector
Value Vector
Implicit Relative Position Modeling in Self-Attention with RoPE
Value Weight Matrix Definition
Imagine a system translating the sentence 'The quick brown fox jumps'. When the system is generating the output word corresponding to 'jumps', it needs to determine which words in the input sentence are most relevant. To do this, a vector representing the current translation context (i.e., 'what information do I need to produce the next word?') is compared against a set of searchable 'label' vectors, one for each word in the input sentence. This comparison generates a relevance score for each input word. Finally, a new vector is created by taking a weighted average of the 'content' vectors of the input words, using the relevance scores as weights. How do the three main vector types in this process correspond to their roles?
In a system designed to answer questions based on a provided document, the model first creates a representation of the user's question. It then compares this representation against a set of searchable representations, one for each sentence in the document, to determine relevance scores. Finally, it constructs an answer by creating a weighted combination of the informational content from each sentence, using the relevance scores as weights. Which option correctly assigns the roles of Query, Key, and Value vectors in this scenario?
Context Window of Key Vectors Notation
Key-Value Cache
In a computational mechanism designed to selectively focus on different parts of an input sequence, information is represented by three distinct types of vectors that interact to produce a context-aware output. Match each vector type to its specific role in this process.
Masked QKV Attention Formula
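The related prompts above all exercise the same query/key/value roles: the query ("what do I need next?") is scored against each key (the searchable "label"), and the output is a score-weighted average of the values (the "content"). A minimal sketch of that process, with illustrative names and dimensions that are not from the source cards:

```python
import numpy as np

def attention(query: np.ndarray, keys: np.ndarray, values: np.ndarray) -> np.ndarray:
    """Scaled dot-product attention for a single query vector."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)       # one relevance score per position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over the scores
    return weights @ values                  # weighted average of the value vectors

# Hypothetical example: 5 input positions, model dimension 8.
q = np.random.randn(8)           # query: the current translation/answer context
K = np.random.randn(5, 8)        # keys: one searchable "label" per input position
V = np.random.randn(5, 8)        # values: the informational content of each position
print(attention(q, K, V).shape)  # (8,)
```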
Learn After
An attention mechanism is processing a sequence of inputs. At the current position, indexed as i=8, the model needs to attend to a context window of key vectors from the last n_c=5 positions (including the current position). Which of the following notations correctly represents this specific set of key vectors?
A model is processing a sequence and is currently at position i=10. It is using a context window of size n_c=7. According to the notation { \mathbf{k}_{i-n_c+1}, \dots, \mathbf{k}_i }, the first key vector in this specific context window would be at index ____.
Determining Context Window Parameters