Learn Before
Improving LLM Summarization Quality
Given the following scenario, identify the most direct adjustment to the decoding process to solve the described problem and explain your reasoning.
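To make concrete what "adjustment to the decoding process" can mean, here is a minimal sketch of the usual inference-time knobs, assuming the Hugging Face `transformers` generate() API; the model name, prompt, and parameter values are illustrative assumptions, not part of the original card.

```python
# A minimal sketch of decoding-time adjustments, assuming the Hugging Face
# `transformers` generate() API. Model name and parameter values are
# illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical choice; any causal LM works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Summarize: The quick brown fox jumps over the lazy dog."
inputs = tokenizer(prompt, return_tensors="pt")

# Each keyword below is a decoding-process knob, applied at inference time
# without retraining the model:
outputs = model.generate(
    **inputs,
    max_new_tokens=60,        # cap on generation length
    do_sample=True,           # sample instead of greedy decoding
    temperature=0.7,          # lower values make output more focused
    top_p=0.9,                # nucleus sampling over high-probability tokens
    repetition_penalty=1.2,   # discourage repeated phrases
    no_repeat_ngram_size=3,   # block exact n-gram repetition
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Which of these knobs counts as the "most direct adjustment" depends on the scenario the card describes; the sketch only shows where such adjustments live in the decoding call.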
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Critique of Generation Length Strategy
Improving LLM Summarization Quality
A team is refining a language model's story-generation capabilities. Their primary strategy is to increase the maximum number of tokens the model can produce in a single output, aiming for more comprehensive and detailed narratives. What is the most significant potential downside the team should anticipate as a direct result of extending the generation length alone?
Generating and Verifying Thinking Paths