Learn Before
Multiple Choice

Two language models are given the context: 'The chef carefully added the final, crucial ingredient to the simmering stew: a pinch of...'. Each model must predict the next word. Below are the conditional probabilities, Pr(next_word | context), that each model assigns to four candidate next words.

Next Word   Model A Probability   Model B Probability
salt        0.65                  0.20
concrete    0.02                  0.45
laughter    0.03                  0.15
thyme       0.30                  0.20

Based on this data, which of the following statements is the most accurate analysis of the models' understanding of the context?
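To make the comparison concrete, here is a minimal Python sketch of one way to reason about the table; the set of "plausible" continuations is an assumption chosen for illustration, not part of the question. It totals the probability mass each model places on contextually sensible words and reports the surprisal, -log2(p), of 'salt':

```python
import math

# Conditional probabilities Pr(next_word | context) from the table above
model_a = {"salt": 0.65, "concrete": 0.02, "laughter": 0.03, "thyme": 0.30}
model_b = {"salt": 0.20, "concrete": 0.45, "laughter": 0.15, "thyme": 0.20}

# Continuations a human cook would consider plausible for the stew context
# (an illustrative assumption, not given in the question)
plausible = {"salt", "thyme"}

for name, dist in [("Model A", model_a), ("Model B", model_b)]:
    # Total probability mass placed on contextually plausible words
    mass = sum(p for w, p in dist.items() if w in plausible)
    # Surprisal -log2(p) of the top plausible word; lower means the model
    # finds that continuation less surprising given the context
    surprisal = -math.log2(dist["salt"])
    print(f"{name}: plausible mass = {mass:.2f}, surprisal('salt') = {surprisal:.2f} bits")
```

Under this illustrative grouping, Model A places 0.65 + 0.30 = 0.95 of its mass on 'salt' and 'thyme', while Model B places only 0.20 + 0.20 = 0.40 there and 0.45 on 'concrete'.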


Updated 2025-09-28


Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Ch.5 Inference - Foundations of Large Language Models

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science