Learn Before
Effect of Temperature on Token Generation
A language model is trying to complete the sentence 'The cat sat on the ___.' It has calculated the following output scores for potential next words: {'mat': 4.0, 'rug': 3.5, 'throne': 1.0, 'car': -2.0}. The model's output probabilities are determined by the formula $P(w_i) = \frac{\exp(z_i / T)}{\sum_j \exp(z_j / T)}$, where $z_i$ is the output score for word $w_i$ and $T$ is a temperature parameter. Consider two scenarios for generating the next word: Scenario A with a low temperature ($T < 1$) and Scenario B with a high temperature ($T > 1$). In which scenario is the model more likely to generate the word 'throne'? Justify your answer by explaining the role of the temperature parameter in shaping the final probability distribution.
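A minimal sketch of how these probabilities can be computed, using only the Python standard library; the specific temperatures 0.5 and 2.0 are illustrative assumptions standing in for the low and high settings of Scenarios A and B:

```python
import math

# Output scores (logits) from the question.
logits = {'mat': 4.0, 'rug': 3.5, 'throne': 1.0, 'car': -2.0}

def softmax_with_temperature(scores, T):
    """Apply P(w_i) = exp(z_i / T) / sum_j exp(z_j / T) to a dict of logits."""
    scaled = {w: math.exp(z / T) for w, z in scores.items()}
    total = sum(scaled.values())
    return {w: v / total for w, v in scaled.items()}

# Illustrative temperatures: 0.5 for Scenario A (low), 2.0 for Scenario B (high).
for T in (0.5, 2.0):
    probs = softmax_with_temperature(logits, T)
    print(f"T = {T}: P('throne') = {probs['throne']:.4f}")
```

With these assumed values, P('throne') rises from about 0.002 at T = 0.5 to about 0.11 at T = 2.0: dividing the logits by a large T shrinks the gaps between them before exponentiation, flattening the distribution and shifting probability mass toward low-scoring words.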
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A language model is tasked with completing the sentence: 'The old sea captain stared at the stormy sky and said, "It's going to be a..."' The model's internal scores (logits) for the next token are highest for 'rough', followed by 'long', 'dark', and then 'whale'. The model generates two different completions using different settings:
- Completion A: '...rough night.'
- Completion B: '...whale of a tale.'
Based on the probability formula $P(w_i) = \frac{\exp(z_i / T)}{\sum_j \exp(z_j / T)}$, which statement most accurately analyzes the relationship between the temperature parameter ($T$) and the generated completions?
Effect of Temperature on Token Generation
Analyzing Temperature's Impact on Token Probabilities
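The related question above turns on the same mechanism. Below is a sketch under assumed logit values (hypothetical numbers chosen only to respect the stated ordering rough > long > dark > whale), showing how the distribution sharpens toward the top-scoring token as T shrinks and flattens toward uniform as T grows:

```python
import math

# Hypothetical logits consistent only with the ordering given in the
# question: 'rough' highest, then 'long', 'dark', and 'whale' lowest.
logits = {'rough': 3.0, 'long': 2.0, 'dark': 1.5, 'whale': 0.0}

def softmax_with_temperature(scores, T):
    scaled = {w: math.exp(z / T) for w, z in scores.items()}
    total = sum(scaled.values())
    return {w: v / total for w, v in scaled.items()}

# Sweep from a very low to a very high temperature.
for T in (0.2, 1.0, 5.0):
    probs = softmax_with_temperature(logits, T)
    print(f"T = {T}: " + ", ".join(f"{w} = {p:.3f}" for w, p in probs.items()))
```

Under these assumptions, a very low T makes 'rough' nearly certain (Completion A is what low-temperature sampling produces), while a high T brings 'whale' into play, so a surprising completion like B points to sampling at a higher temperature.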