Learn Before
Deconstructing a Model's Generated Text
A language model is given an incomplete sentence and generates a completion. Based on the formal structure of a model's output, how would you deconstruct the generated text into its fundamental components? Describe the overall structure and identify what the first few components would likely be.
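The deconstruction the question asks for can be sketched in code: a model's generated text is, at bottom, an ordered sequence of tokens y = (y_1, ..., y_n), and the first components are simply the first tokens generated. The sketch below is a minimal illustration; it assumes a whitespace split as a stand-in tokenizer, whereas real LLMs use learned subword tokenizers.

```python
# Sketch: a model's generated text viewed as an ordered token sequence.
# Assumption: whitespace splitting stands in for a real subword tokenizer.

def deconstruct(text: str) -> list[str]:
    """Split generated text into its ordered components y_1, ..., y_n."""
    return text.split()

output = "AI is transforming our world."
tokens = deconstruct(output)

# The overall structure is the ordered sequence y = (y_1, ..., y_n);
# the first few components are the first tokens the model generated.
print(tokens[:3])  # the first three components of the sequence
```

Indexing into the sequence (here `tokens[:3]`) recovers the "first few components" directly, because the sequence preserves the order in which the model produced them.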
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Formal Definition of LLM Inference
Notation for Preceding Output Subsequence
Representing Model Output as a Token Sequence
A Large Language Model generates the sentence: 'AI is transforming our world.' How is this output fundamentally structured by the model before being presented to the user?
Separating Input and Output Variables in LLM Formulation