True/False

In a simple self-attention mechanism where similarity is measured by dot product and weights are normalized by a softmax function, if a current input vector x_i is perfectly orthogonal to a preceding input vector x_j, then x_j will have zero influence on the final output vector y_i.
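To reason about this question, it helps to compute the weights concretely. The sketch below (illustrative names, assuming plain dot-product similarity followed by a softmax, as the question states) shows how an attention weight is derived from a raw score of zero:

```python
import numpy as np

def attention_output(x_i, context):
    """Compute y_i as the softmax-weighted sum of the context vectors,
    where the raw score for each x_j is the dot product x_i . x_j.
    (Minimal sketch of the mechanism described in the question.)"""
    scores = np.array([np.dot(x_i, x_j) for x_j in context])
    # Numerically stable softmax normalization of the scores
    weights = np.exp(scores - scores.max())
    weights = weights / weights.sum()
    y_i = weights @ np.stack(context)
    return weights, y_i

# x_j is orthogonal to x_i (dot product 0); x_k and x_i itself are not.
x_i = np.array([1.0, 0.0])
x_j = np.array([0.0, 1.0])
x_k = np.array([1.0, 1.0])
weights, y_i = attention_output(x_i, [x_j, x_k, x_i])
print(weights)  # weight on x_j is exp(0)/Z, where Z is the softmax normalizer
```

Note that the softmax maps a raw score of 0 to exp(0) = 1 before normalization, so a zero dot product does not by itself produce a zero attention weight.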


Updated 2025-10-02


Tags: Data Science, Ch.2 Generative Models - Foundations of Large Language Models, Foundations of Large Language Models, Foundations of Large Language Models Course, Computing Sciences, Analysis in Bloom's Taxonomy, Cognitive Psychology, Psychology, Social Science, Empirical Science, Science