Multiple Choice

A language model generates text by calculating the probability of the next word given all the preceding words. Consider the following two contexts:

Context A: 'The chef carefully seasoned the soup. He tasted it and decided it needed more'

Context B: 'The comedian carefully timed the joke. He tested it and decided it needed more'

Which statement best analyzes the likely probability the model would assign to the word 'salt' as the very next word?
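The contrast the question points at can be sketched with a toy softmax over hypothetical next-word scores. The vocabulary and score values below are invented purely for illustration (they do not come from any real model); the point is only that a cooking context should score 'salt' far higher than a comedy context, so the softmax assigns it a much larger probability after Context A:

```python
import math

def softmax(scores):
    # Convert raw scores into a probability distribution over next words.
    m = max(scores.values())  # subtract the max for numerical stability
    exps = {w: math.exp(s - m) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

# Hypothetical scores: 'salt' fits the cooking context (A) but not the comedy context (B).
context_a_scores = {"salt": 5.0, "pepper": 4.0, "timing": 0.5}
context_b_scores = {"salt": 0.5, "pepper": 0.5, "timing": 5.0}

p_salt_a = softmax(context_a_scores)["salt"]
p_salt_b = softmax(context_b_scores)["salt"]
assert p_salt_a > p_salt_b  # 'salt' is far more probable after Context A
```

Under these made-up scores, P('salt' | Context A) is high while P('salt' | Context B) is near zero, which is the asymmetry a well-trained language model would be expected to show.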


Updated 2025-10-08


Tags: Ch.4 Alignment - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Analysis in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science