Learn Before
A team developing a language model for creative storytelling finds that its generated text is often repetitive and predictable, frequently getting stuck in loops (e.g., 'I am I am I am...'). Which of the following decoding strategies would be most effective at addressing this issue by introducing more variety into the generated text?
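The strategies named below (top-k and top-p/nucleus sampling) both introduce variety by sampling from a truncated next-token distribution rather than always taking the argmax. A minimal sketch of both filters, using a made-up 5-token distribution (the vocabulary and probabilities are illustrative only):

```python
import numpy as np

def top_k_filter(probs, k):
    """Zero out all but the k highest-probability tokens, then renormalize."""
    filtered = np.zeros_like(probs)
    top_idx = np.argsort(probs)[-k:]
    filtered[top_idx] = probs[top_idx]
    return filtered / filtered.sum()

def top_p_filter(probs, p):
    """Keep the smallest set of tokens whose cumulative probability reaches p."""
    order = np.argsort(probs)[::-1]              # descending probability
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, p) + 1  # size of the nucleus
    filtered = np.zeros_like(probs)
    filtered[order[:cutoff]] = probs[order[:cutoff]]
    return filtered / filtered.sum()

def sample_token(probs, rng):
    """Stochastically draw one token id from a (filtered) distribution."""
    return int(rng.choice(len(probs), p=probs))

# toy next-token distribution over a 5-token vocabulary
probs = np.array([0.5, 0.2, 0.15, 0.1, 0.05])
rng = np.random.default_rng(0)
tk = top_k_filter(probs, k=3)   # only the 3 most likely tokens survive
tp = top_p_filter(probs, p=0.8) # smallest set covering >= 80% of mass
```

Because the final token is drawn at random from the surviving candidates, repeated runs diverge instead of collapsing into the same loop that greedy decoding produces.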
Tags
Data Science
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Top-k Sampling
Top-p (Nucleus) Sampling
Analyzing Text Generation Outputs
Comparing Text Generation Strategies
When using a stochastic decoding method for text generation, the model is guaranteed to select the single token with the highest probability at each step.
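The statement above is false: stochastic decoding samples from the distribution, so it is not guaranteed to pick the single most likely token (that guarantee describes greedy decoding). A minimal sketch of the contrast, with a made-up 3-token distribution:

```python
import numpy as np

probs = np.array([0.6, 0.3, 0.1])  # toy next-token distribution

# Greedy decoding: deterministically take the argmax every step.
greedy = int(np.argmax(probs))     # always token 0 for this distribution

# Stochastic decoding: any token with nonzero probability can be chosen.
rng = np.random.default_rng(1)
samples = [int(rng.choice(3, p=probs)) for _ in range(1000)]
```

Over many draws, the sampled sequence visits lower-probability tokens as well, which is exactly what makes stochastic methods less repetitive than greedy decoding.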