Learn Before
Candidate Sequence in a Word-by-Word Generation
A text generation model operates by selecting the single most probable word at each step to extend its current output. If the model has generated the partial sentence 'The sun shines brightly in the', how many distinct, partially-completed sentences will the model consider as candidates to build upon in the very next step? Justify your answer.
1

Greedy decoding keeps only the single highest-probability continuation at each step, so exactly one partially-completed sentence is carried forward as the candidate to build upon in the next step.
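The single-candidate behavior can be sketched in a few lines of Python. This is a minimal illustration, not a real model: `next_word_probs` is a hypothetical stand-in that returns a toy distribution for the example prefix.

```python
def next_word_probs(prefix):
    # Hypothetical fixed distribution for illustration; a real model
    # would score every word in its vocabulary given the prefix.
    table = {
        "The sun shines brightly in the": {"sky": 0.8, "morning": 0.15, "park": 0.05},
    }
    return table.get(prefix, {"<eos>": 1.0})

def greedy_step(prefix):
    probs = next_word_probs(prefix)
    best = max(probs, key=probs.get)       # pick the single most probable word
    return [prefix + " " + best]            # candidate set holds exactly one sequence

cands = greedy_step("The sun shines brightly in the")
print(len(cands))    # 1
print(cands[0])      # The sun shines brightly in the sky
```

However many words receive nonzero probability, only the argmax survives, so the candidate list never grows beyond one sequence.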
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Formula for the Candidate Set in Greedy Search
A text generation model using a greedy decoding approach has produced the sequence 'The dog chased'. At the current step, it calculates the probabilities for the next token as follows: 'the' (0.7), 'a' (0.2), 'its' (0.08), and 'his' (0.02). Based on this information, what constitutes the set of candidate sequences that will be considered for the next step of the generation process?
When generating text by iteratively appending the single most probable token at each step, the collection of partial sequences under consideration for the next step always contains exactly one sequence: the current output extended by that single token.
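The related greedy-search question above can be worked through directly, using the stated probabilities for the token following 'The dog chased':

```python
# Next-token probabilities given the prefix 'The dog chased', as stated
# in the question above.
probs = {"the": 0.7, "a": 0.2, "its": 0.08, "his": 0.02}

best = max(probs, key=probs.get)            # 'the', the single most probable token
candidate_set = {"The dog chased " + best}  # greedy search keeps only this one sequence
print(candidate_set)                        # {'The dog chased the'}
```

The lower-probability tokens 'a', 'its', and 'his' are discarded outright; under greedy decoding the candidate set for the next step is the singleton {'The dog chased the'}.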