True/False

In a self-attention mechanism, the output for a given token is computed as a weighted sum of the 'value' vectors of all tokens in the sequence. If the attention weight connecting a query token to a particular value token is exactly zero, that value token contributes nothing to the query token's final output.

False

True
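
Because the output is linear in the value vectors (output_i = Σ_j α_ij · v_j), any value vector whose weight α_ij is exactly zero drops out of the sum entirely. A minimal NumPy sketch (all numbers illustrative, not from the source) demonstrating this:

```python
import numpy as np

# Toy example: one query attending over a 3-token sequence.
weights = np.array([0.7, 0.3, 0.0])   # attention weights; token 2's weight is exactly zero
V = np.array([[1.0, 2.0],             # value vector of token 0
              [3.0, 4.0],             # value vector of token 1
              [100.0, -100.0]])       # value vector of token 2 (large, but weighted by zero)

# The output for the query is the weighted sum of the value vectors:
# output = sum_j weights[j] * V[j]
output = weights @ V
print(output)  # [1.6 2.6]

# Replacing token 2's value vector entirely changes nothing,
# because its contribution is multiplied by a weight of zero.
V_alt = V.copy()
V_alt[2] = [9999.0, 9999.0]
assert np.allclose(output, weights @ V_alt)
```

The assertion passes for any replacement of token 2's value vector, which is exactly what the statement asserts.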

Updated 2025-10-08

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Analysis in Bloom's Taxonomy