Learn Before
Analyzing Temperature's Impact on Token Probabilities
A language model is predicting the next token and has calculated the following output scores (logits) for three candidate tokens: 'run': 3.0, 'walk': 2.0, 'jog': 1.0. Explain how setting the temperature parameter (β) to a low value (e.g., 0.5) versus a high value (e.g., 2.0) would affect the final probability distribution for these three tokens. Specifically, which token becomes overwhelmingly probable at the low value, and how do the probabilities of the three tokens compare to each other at the high value?
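The effect asked about here can be checked numerically. Below is a minimal Python sketch (not part of the original card) of the temperature-scaled softmax, assuming the standard formulation p_i ∝ exp(z_i / β), applied to the three logits given above:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide each logit by the temperature before exponentiating.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = {"run": 3.0, "walk": 2.0, "jog": 1.0}
for beta in (0.5, 1.0, 2.0):
    probs = softmax_with_temperature(list(logits.values()), beta)
    print(beta, {t: round(p, 3) for t, p in zip(logits, probs)})
```

At β = 0.5 the distribution sharpens and 'run' takes roughly 87% of the probability mass; at β = 2.0 it flattens, with the three tokens at roughly 0.51, 0.31, and 0.19.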
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A language model is tasked with completing the sentence: "The old sea captain stared at the stormy sky and said, 'It's going to be a...'" The model's internal scores (logits) for the next token are highest for 'rough', followed by 'long', 'dark', and then 'whale'. The model generates two different completions using different settings:
- Completion A: '...rough night.'
- Completion B: '...whale of a tale.'
Based on the probability formula p_i = exp(z_i / β) / Σ_j exp(z_j / β), the temperature-scaled softmax over the logits z, which statement most accurately analyzes the relationship between the temperature parameter (β) and the generated completions?
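To make the contrast between the two completions concrete, the sketch below uses hypothetical logit values (assumed here; the question gives only the ordering rough > long > dark > whale) and compares the probability of 'whale' at a low versus a high temperature:

```python
import math

def temperature_probs(logits, temperature):
    # Temperature-scaled softmax: p_i proportional to exp(z_i / beta).
    scaled = [z / temperature for z in logits.values()]
    m = max(scaled)  # stabilize the exponentials
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return dict(zip(logits, (e / total for e in exps)))

# Hypothetical logits consistent with the ordering stated in the question.
logits = {"rough": 4.0, "long": 3.0, "dark": 2.5, "whale": 0.5}

low = temperature_probs(logits, 0.5)   # sharpened: 'rough' dominates
high = temperature_probs(logits, 2.0)  # flattened: 'whale' becomes plausible
print(f"P(whale | beta=0.5) = {low['whale']:.4f}")
print(f"P(whale | beta=2.0) = {high['whale']:.4f}")
```

Completion A ('rough') is what low-temperature or greedy decoding produces; Completion B ('whale') only becomes a realistic draw once a higher temperature flattens the distribution enough for low-logit tokens to receive non-negligible probability.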
Effect of Temperature on Token Generation