Learn Before
In an autoregressive text generation process, the sequence generated up to a certain point is The dog chased the. At the current step, the model generates and selects the token ball. What is the new, extended sequence that will be used as the basis for generating the subsequent token?
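The step the question describes can be sketched in a few lines of Python (a minimal illustration; the function name `extend` is hypothetical, not from the source):

```python
# Minimal sketch of autoregressive sequence extension: the selected token
# is appended to the context, and the extended sequence becomes the input
# for generating the next token.

def extend(context, token):
    """Return the new sequence after a token is selected."""
    return context + [token]

context = ["The", "dog", "chased", "the"]  # sequence generated so far
new_context = extend(context, "ball")      # model selects "ball"
print(" ".join(new_context))               # The dog chased the ball
```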
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Candidate Set in Sampling-Based Decoding
An autoregressive model is generating a sequence. It begins with the single token y_1 = 'The'. In the next step, it samples the token ȳ_2 = 'cat'. Following that, it samples the token ȳ_3 = 'sat'. What is the resulting sequence that is formed after these two sampling steps?
Formal Representation of Sequence Extension
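The extension step in these questions can also be written formally (the notation below is an assumption for illustration, not taken from the source):

$$
\mathbf{y}_{\le t} = (\mathbf{y}_{\le t-1},\, \bar{y}_t)
$$

That is, after each sampling step the newly selected token is concatenated onto the existing sequence. Starting from y_1 = 'The', sampling ȳ_2 = 'cat' gives ('The', 'cat'), and sampling ȳ_3 = 'sat' then gives ('The', 'cat', 'sat'), which is the context for the next step.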