Learn Before
Case Study

Addressing Repetitive Model Outputs

A developer is using a language model with a search-based decoding strategy to generate multiple creative story ideas from a single prompt. They observe that the generated outputs, while grammatically correct, are often minor variations of the same core concept, lacking true creative divergence. For example, for the prompt 'An astronaut makes a startling discovery on a new planet,' the model consistently produces outputs centered around finding simple alien life forms. How could the developer modify the decoding process to address this issue and produce a more varied set of story ideas? Explain the reasoning behind your proposed solution.
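One standard answer is to move from deterministic search (greedy or beam search, which always follows the highest-probability continuations and so converges on the same "safe" concept) to stochastic decoding such as temperature-scaled nucleus (top-p) sampling, which draws from the renormalized head of the distribution and yields different outputs on each run. The sketch below is illustrative only, not a reference answer: it uses a toy hand-made distribution over story "beats" (the probabilities, the `sample_next` helper, and the temperature/top-p values are all invented for demonstration) to contrast argmax decoding with sampling.

```python
import math
import random


def sample_next(token_probs, temperature=1.0, top_p=0.9, rng=random):
    """Toy temperature + nucleus (top-p) sampler over a dict of token: prob."""
    # Temperature rescales the distribution: >1 flattens it (more diverse),
    # <1 sharpens it (closer to greedy).
    scaled = {t: math.exp(math.log(p) / temperature) for t, p in token_probs.items()}
    total = sum(scaled.values())
    scaled = {t: p / total for t, p in scaled.items()}
    # Nucleus filter: keep the smallest set of top tokens whose cumulative
    # probability reaches top_p, discarding the unreliable low-probability tail.
    ranked = sorted(scaled.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, cum = [], 0.0
    for tok, p in ranked:
        nucleus.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    # Draw one token from the renormalized nucleus.
    r = rng.random() * sum(p for _, p in nucleus)
    for tok, p in nucleus:
        r -= p
        if r <= 0:
            return tok
    return nucleus[-1][0]


# Invented distribution over possible "startling discovery" story beats.
beats = {"alien life": 0.55, "ancient ruins": 0.20,
         "mirror Earth": 0.15, "sentient planet": 0.10}

# Deterministic (greedy-style) decoding: the same top beat every time.
greedy = [max(beats, key=beats.get) for _ in range(5)]

# Stochastic decoding: varied beats across runs (seeded for reproducibility).
rng = random.Random(0)
sampled = [sample_next(beats, temperature=1.3, top_p=0.95, rng=rng)
           for _ in range(5)]
```

With beam search, all five "ideas" collapse onto `"alien life"`; with temperature 1.3 and top-p sampling, several distinct beats appear. In a real library (e.g. Hugging Face `generate` with `do_sample=True`, `temperature`, `top_p`, `num_return_sequences`), the same reasoning applies token by token over the model's full vocabulary.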

Updated 2025-10-03

Tags: Ch.5 Inference - Foundations of Large Language Models · Foundations of Large Language Models · Foundations of Large Language Models Course · Computing Sciences · Application in Bloom's Taxonomy · Cognitive Psychology · Psychology · Social Science · Empirical Science · Science