Learn Before
Stochastic decoding methods in TGM
Selecting the next token by sampling at random from the model's probability distribution over the vocabulary, rather than always picking the single most likely token. There are two main types:
- top-k sampling
- top-p (or nucleus) sampling
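The two methods above differ only in how they truncate the distribution before sampling: top-k keeps a fixed number of the most probable tokens, while top-p keeps the smallest set whose cumulative probability reaches a threshold p. A minimal sketch, using a hypothetical next-token distribution over plain Python dicts (real models work on logits over a full vocabulary):

```python
import random

def top_k_filter(probs, k):
    """Keep the k most probable tokens, drop the rest, renormalize."""
    top = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k]
    total = sum(p for _, p in top)
    return {tok: p / total for tok, p in top}

def top_p_filter(probs, p):
    """Keep the smallest set of tokens whose cumulative probability >= p."""
    kept, cum = {}, 0.0
    for tok, prob in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept[tok] = prob
        cum += prob
        if cum >= p:
            break
    total = sum(kept.values())
    return {tok: q / total for tok, q in kept.items()}

# Hypothetical next-token distribution (illustrative values only)
probs = {"roads": 0.5, "aqueducts": 0.3, "art": 0.15, "xyzzy": 0.05}

k_probs = top_k_filter(probs, k=2)    # keeps "roads" and "aqueducts"
p_probs = top_p_filter(probs, p=0.9)  # keeps "roads", "aqueducts", "art"

# Sample the next token at random from the filtered, renormalized distribution
token = random.choices(list(k_probs), weights=list(k_probs.values()))[0]
```

Either way, the final step is a weighted random draw from the surviving tokens, which is what makes these methods stochastic rather than greedy.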
Tags
Data Science
Foundations of Large Language Models Course
Computing Sciences
Related
Examples of text generation
Decoding Methods to Generate Continuations in TGM
Stochastic decoding methods in TGM
Simultaneous Processing of Input Context Tokens
Building the Encoded Representation of Input
A user gives a language model the input: "Ancient Rome was a civilization known for its". The model then produces the following output: "engineering marvels, such as aqueducts and roads." Based on the two-stage process of text generation, which statement best analyzes this interaction?
Arrange the following stages into the correct sequence that describes how a language model generates text based on an initial input.
Analyzing a Code Generation Scenario
Learn After
Top-k Sampling
Top-p (Nucleus) Sampling
A team developing a language model for creative storytelling finds that its generated text is often repetitive and predictable, frequently getting stuck in loops (e.g., 'I am I am I am...'). Which of the following decoding strategies would be most effective at addressing this issue by introducing more variety into the generated text?
Analyzing Text Generation Outputs
Comparing Text Generation Strategies
When using a stochastic decoding method for text generation, the model is guaranteed to select the single token with the highest probability at each step.