Short Answer

Contextual Influence on Token Probability

An autoregressive language model is processing the two partial sentences below:

A: 'The chef carefully seasoned the soup with a pinch of...'

B: 'The astronomer carefully adjusted the telescope with a turn of...'

For which sentence, A or B, would the model assign a higher conditional probability to the next token being 'salt'? Explain your reasoning by describing how the preceding tokens influence this calculation.
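The intuition can be made concrete with a toy sketch. An autoregressive model scores every candidate next token conditioned on the preceding tokens, then normalizes those scores with a softmax. The logit values below are purely illustrative placeholders (not output from any real model), chosen only to show how a cooking-related context can shift probability mass toward 'salt' while a telescope-related context shifts it away:

```python
import math

def softmax(logits):
    # Convert raw scores into a probability distribution over candidate tokens.
    m = max(logits.values())  # subtract max for numerical stability
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    z = sum(exps.values())
    return {tok: e / z for tok, e in exps.items()}

# Hypothetical next-token logits given each context (illustrative values only).
logits_A = {"salt": 4.0, "pepper": 3.5, "screw": -2.0, "the": 0.5}   # chef/soup context
logits_B = {"salt": -1.0, "pepper": -0.5, "screw": 2.0, "the": 1.5}  # astronomer/telescope context

p_A = softmax(logits_A)
p_B = softmax(logits_B)
print(f"P('salt' | A) = {p_A['salt']:.3f}")
print(f"P('salt' | B) = {p_B['salt']:.3f}")
```

Under these assumed logits, the soup context concentrates probability on 'salt', while the telescope context does not; in a trained model the same effect arises because attention over the preceding tokens ('chef', 'seasoned', 'soup', 'pinch') produces a hidden state whose dot product with the 'salt' embedding is large.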


Updated 2025-10-04


Tags: Ch.1 Pre-training - Foundations of Large Language Models; Ch.2 Generative Models - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Analysis in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science