The Role of Input Context in Model Prediction Quality
A large language model is tasked with summarizing a complex legal document. Analyze why providing the entire, unmodified document as input might not produce an optimal summary. Then, explain the general principle by which the input could be augmented with additional, carefully selected information to improve the quality of the generated summary, without altering the model's internal parameters.
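The principle the question points at — augmenting the input rather than the model's weights — can be illustrated with a minimal sketch. The function below (a hypothetical helper, not part of any library) assembles a prompt that places carefully selected background snippets before the document and the summarization instruction, which is the general pattern behind retrieval-augmented and few-shot prompting:

```python
def build_augmented_prompt(document, context_snippets,
                           instruction="Summarize the document above."):
    """Prepend selected supporting information to the model input.

    The model's parameters are untouched; only the context it
    conditions on is extended with curated prior information.
    """
    context_block = "\n".join(f"- {s}" for s in context_snippets)
    return (
        "Relevant background (carefully selected, not exhaustive):\n"
        f"{context_block}\n\n"
        f"Document:\n{document}\n\n"
        f"{instruction}"
    )

# Example: a legal summary improves when key definitions are supplied up front.
prompt = build_augmented_prompt(
    document="The Lessee shall indemnify the Lessor against all claims...",
    context_snippets=[
        "'Lessee' refers to the tenant; 'Lessor' refers to the landlord.",
        "'Indemnify' means to compensate for harm or loss.",
    ],
)
print(prompt)
```

The augmented prompt would then be sent to the model in place of the raw document; the snippets act as in-context knowledge the model can draw on at inference time.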
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Few-Shot Learning in Prompting
Chain-of-Thought (CoT) Prompting
Strategic Information Management in Context Scaling
A developer is using a large language model to classify customer feedback. The model is struggling with ambiguous statements. For the input 'The setup process was a bit of a journey,' the model inconsistently provides different classifications. Which of the following revised inputs best demonstrates the principle of improving performance by extending the model's context with helpful prior information?
Optimizing a Creative Writing Assistant
The Role of Input Context in Model Prediction Quality
Context Scaling via Dynamic External Knowledge