Multiple Choice

An engineer modifies a large language model by doubling the number of attention heads per layer while simultaneously halving the dimensionality of each head's key/value vectors. Assuming all other parameters (like the number of layers and sequence length) remain constant, how does this architectural change affect the multi-dimensional structure of the model's key-value (KV) cache?
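To reason about this, it helps to write out the KV cache's shape explicitly. A minimal sketch, assuming a conventional cache layout of `(num_layers, 2, batch, num_heads, seq_len, head_dim)` (the `2` holding keys and values) and illustrative sizes chosen for the example:

```python
# Hypothetical sketch of KV-cache shapes; all sizes and the layout
# (num_layers, 2, batch, num_heads, seq_len, head_dim) are assumptions.

def kv_cache_shape(num_layers, batch, num_heads, seq_len, head_dim):
    # One key tensor and one value tensor per layer, hence the "2" axis.
    return (num_layers, 2, batch, num_heads, seq_len, head_dim)

def num_elements(shape):
    # Total element count = product of all axis lengths.
    n = 1
    for d in shape:
        n *= d
    return n

# Original configuration (illustrative numbers).
before = kv_cache_shape(num_layers=32, batch=1, num_heads=16,
                        seq_len=2048, head_dim=128)
# After the change: heads doubled, per-head key/value dimension halved.
after = kv_cache_shape(num_layers=32, batch=1, num_heads=32,
                       seq_len=2048, head_dim=64)

print("before:", before, num_elements(before))
print("after: ", after, num_elements(after))
```

The heads axis doubles while the head-dimension axis halves, so the cache's shape changes along those two axes, but the total number of cached elements (and hence memory footprint) is unchanged.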


Updated 2025-09-28


Tags: Ch.2 Generative Models - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Analysis in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science