Calculating Attention Output
An attention mechanism is processing a sequence of three items. After comparing a query to the keys of these items, the resulting attention scores (weights) are [0.2, 0.5, 0.3]. The corresponding information-carrying vectors for the three items are V1 = [10, 2], V2 = [4, 8], and V3 = [0, 20]. Calculate the final output vector produced by this attention layer.
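The output of such a layer is just the attention-weighted sum of the value vectors. A minimal sketch in plain Python, using the scores and vectors given in the question:

```python
# Attention output as a weighted sum of value vectors.
scores = [0.2, 0.5, 0.3]                    # attention weights from the question
values = [[10, 2], [4, 8], [0, 20]]         # V1, V2, V3

# For each dimension, sum weight * component across the three items.
output = [sum(w * v[i] for w, v in zip(scores, values)) for i in range(2)]
print([round(x, 2) for x in output])        # [4.0, 10.4]
```

Checking one component by hand: the first dimension is 0.2·10 + 0.5·4 + 0.3·0 = 4.0, matching the code.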
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
In a simplified attention mechanism processing an input sequence, the attention scores for a particular output are calculated as [0.1, 0.8, 0.1] for the three input items respectively. If the information-carrying vector for the second input item (the one with the 0.8 score) were replaced with a zero vector (a vector containing only zeros), what would be the most direct consequence for the output of the attention layer?
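One way to see the consequence concretely: since the output is a weighted sum, zeroing the dominant item's value vector removes its contribution entirely, leaving only the two low-weight items. A small sketch, using hypothetical value vectors (the card specifies only the scores):

```python
# Hypothetical value vectors for illustration; the question gives only the scores.
scores = [0.1, 0.8, 0.1]
values = [[10, 2], [0, 0], [0, 20]]  # second item's vector zeroed out

# Weighted sum: the 0.8-weighted zero vector contributes nothing.
output = [sum(w * v[i] for w, v in zip(scores, values)) for i in range(2)]
print([round(x, 2) for x in output])  # only the two 0.1-weighted items remain
```

The resulting output is small in magnitude because 80% of the attention mass now multiplies a vector of zeros.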
Analyzing Unexpected Attention Output
Calculating Attention Output