Learn Before
Appropriate Application of an Attention Mechanism
A machine learning engineer is designing a system that uses a specific type of attention mechanism. In this mechanism, the calculation for any given word in a sentence can only incorporate information from that word and all the words that came before it; it is explicitly prevented from using information from any subsequent words. The engineer is considering using this system for two distinct tasks:
Task A: A real-time translation service that begins translating a sentence as the user is typing it.
Task B: A sentiment analysis tool that classifies a completed movie review as positive or negative after the entire review has been submitted.
Analyze the suitability of this specific attention mechanism for each task. For which task is it well suited, and for which is it poorly suited? Justify your reasoning in both cases.
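The mechanism described above is conventionally called causal (masked) self-attention. As a minimal sketch of the constraint it imposes (the function name and shapes are illustrative, not part of the question), each query position is blocked from attending to any later key position:

```python
import numpy as np

def causal_attention_weights(scores):
    """Mask out future positions, then softmax each row.

    scores: (seq_len, seq_len) raw attention scores,
    rows = query positions, columns = key positions.
    """
    seq_len = scores.shape[0]
    # Entries strictly above the diagonal are "future" keys.
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    masked = np.where(future, -np.inf, scores)
    # Row-wise softmax; the -inf entries get exactly zero weight.
    exp = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)
```

Because each row's weights over future columns are forced to zero, the representation of a token can be computed incrementally as tokens arrive, which is the property at issue in the two tasks.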
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
In a generative language model, an attention mechanism processes a sequence of 4 tokens. To ensure that the prediction for each token only depends on the preceding tokens and itself, a mask is applied to the raw attention score matrix before the final weighting step. Given the initial score matrix below, where rows represent the 'query' token and columns represent the 'key' token, which of the following matrices correctly shows the result of applying this causal mask? (Note: '-inf' represents a very large negative number that effectively nullifies the score.)
Initial Matrix: [[ 0.8, 1.2, 0.5, 2.1 ], [ 1.5, 0.6, 1.9, 0.3 ], [ 0.9, 2.2, 1.1, 0.7 ], [ 1.3, 0.4, 1.6, 0.2 ]]
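A causal mask of the kind described is conventionally applied by overwriting every entry above the main diagonal (a query attending to a later key) with -inf, leaving the diagonal and everything below it unchanged. A minimal NumPy sketch with the matrix above:

```python
import numpy as np

scores = np.array([[0.8, 1.2, 0.5, 2.1],
                   [1.5, 0.6, 1.9, 0.3],
                   [0.9, 2.2, 1.1, 0.7],
                   [1.3, 0.4, 1.6, 0.2]])

# Entries where the key index (column) exceeds the query index (row)
# correspond to future tokens and are replaced with -inf.
future = np.triu(np.ones_like(scores, dtype=bool), k=1)
masked = np.where(future, -np.inf, scores)
```

After the subsequent softmax, those -inf entries receive zero attention weight, so no information flows from future tokens.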
Consequences of Misconfigured Attention in Generative Models
Appropriate Application of an Attention Mechanism