Essay

Analyzing Contextual Influence on Next-Token Probability

Consider an autoregressive language model that predicts the next token in a sequence. You are given two different preceding sequences (contexts):

Context A: "The chef carefully seasoned the soup. He reached for the final ingredient, a pinch of"

Context B: "The mountain climber checked his gear. He reached for the final piece of equipment, a length of"

For the potential next token 'rope', analyze which context (A or B) would cause the model to assign a higher conditional probability to this token. Justify your reasoning by explaining how the information in the preceding tokens of each context informs the model's prediction.
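The comparison the question asks for can be made concrete with a toy sketch. The logit values below are invented purely for illustration (a real model would compute them from the full context via its learned parameters), but they show the mechanism: each context produces a distribution over next tokens via a softmax, and the climbing context places far more mass on 'rope' than the cooking context does.

```python
import math

def softmax(logits):
    # Convert raw scores into a probability distribution
    # (subtracting the max for numerical stability).
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    z = sum(exps.values())
    return {tok: e / z for tok, e in exps.items()}

# Hypothetical next-token logits a model might produce for each context.
# These numbers are made up for illustration only.
logits_a = {"salt": 4.0, "saffron": 2.5, "pepper": 2.0, "rope": -3.0}   # cooking context
logits_b = {"rope": 4.5, "cord": 2.0, "webbing": 1.0, "salt": -4.0}     # climbing context

p_a = softmax(logits_a)["rope"]
p_b = softmax(logits_b)["rope"]
print(f"P('rope' | Context A) = {p_a:.4f}")
print(f"P('rope' | Context B) = {p_b:.4f}")
```

Under these illustrative scores, P('rope' | Context B) dwarfs P('rope' | Context A), matching the intuition that "climber", "gear", "equipment", and "a length of" all raise the model's expectation of 'rope', while the cooking context suppresses it.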


Updated 2025-10-08


Tags: Ch.1 Pre-training - Foundations of Large Language Models; Ch.2 Generative Models - Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Analysis in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science