Learn Before
During the very first step of generating the next word for a given text, a language model produces a candidate list that includes every single token from its entire vocabulary.
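This can be sketched with the toy example used later on this card: a five-word vocabulary and the context 'The ocean is'. The logit values below are invented purely for illustration; the point is that the softmax step produces a probability for every vocabulary token, so the initial candidate list always covers the full vocabulary.

```python
import math

# Toy vocabulary from the example; real models use tens of thousands of tokens.
vocab = ["deep", "blue", "cold", "vast", "empty"]

# Hypothetical raw scores (logits) a model might assign after 'The ocean is'.
# These numbers are made up for illustration only.
logits = {"deep": 2.1, "blue": 3.0, "cold": 1.2, "vast": 1.8, "empty": 0.4}

# Softmax turns the scores into a probability distribution over the FULL
# vocabulary: every token appears as a candidate, even the unlikely ones.
total = sum(math.exp(s) for s in logits.values())
probs = {tok: math.exp(s) / total for tok, s in logits.items()}

# The candidate list includes every single vocabulary token.
assert set(probs) == set(vocab)
# The probabilities form a valid distribution.
assert abs(sum(probs.values()) - 1.0) < 1e-9
```

Filtering strategies such as top-k or nucleus sampling only prune this list afterwards; the first step itself scores all tokens.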
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Comprehension in Revised Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A language model is in the process of generating the next word after the context 'The ocean is'. The model's entire vocabulary is limited to these five words: ['deep', 'blue', 'cold', 'vast', 'empty']. In the very first step of its decision-making process, the model must generate a list of all possible candidates for the next word. What does this initial list of candidates look like?
A developer is troubleshooting a text generation system. They provide the input context 'The sun is shining and the sky is'. In the very first step of generating the next word, the system produces a candidate list containing only {'blue', 'clear', 'bright'}. The system's full vocabulary, however, contains over 10,000 words. Based on this observation, which fundamental principle of this initial generation stage has been incorrectly implemented?