An auto-regressive language model using a greedy decision rule has generated the sequence 'The cat sat' with a cumulative probability of 0.2. At the next step, the model calculates the following conditional probabilities for the next token: P('on' | 'The cat sat') = 0.6, P('by' | 'The cat sat') = 0.3, and P('under' | 'The cat sat') = 0.1. What will be the newly generated sequence and its updated cumulative probability?
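The question can be checked with a short sketch of a single greedy decoding step. The `greedy_step` helper and the probability table are illustrative assumptions, not any particular model's API; the probabilities are the ones given in the question.

```python
# One greedy decoding step: pick the highest-probability next token and
# multiply its conditional probability into the running sequence probability.

def greedy_step(prefix, next_token_probs, cumulative_prob):
    """Return the extended sequence and its updated cumulative probability."""
    token = max(next_token_probs, key=next_token_probs.get)  # greedy arg-max
    return prefix + [token], cumulative_prob * next_token_probs[token]

# Conditional probabilities from the question, given 'The cat sat'.
probs = {"on": 0.6, "by": 0.3, "under": 0.1}
sequence, p = greedy_step(["The", "cat", "sat"], probs, 0.2)
print(" ".join(sequence), p)  # 'The cat sat on', probability 0.2 * 0.6 = 0.12
```

The greedy rule selects 'on' (the arg-max), so the new sequence is 'The cat sat on' with cumulative probability 0.2 × 0.6 = 0.12.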
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Application in Bloom's Taxonomy
Related
A language model is tasked with generating the three-token sequence 'The quick brown' using a greedy, auto-regressive approach. Arrange the following actions in the correct chronological order that the model would take.
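The chronological order asked about can be sketched as a loop: at each step the model obtains a conditional distribution over next tokens given everything generated so far, applies the greedy arg-max rule, and feeds the extended sequence into the next step. The per-step distributions below are made-up illustrations; a real model would compute them from its parameters.

```python
# Greedy auto-regressive generation of the three-token sequence
# 'The quick brown'. Hypothetical per-step conditional distributions:
step_probs = [
    {"The": 0.5, "A": 0.3, "Some": 0.2},         # step 1: P(w1)
    {"quick": 0.4, "slow": 0.35, "lazy": 0.25},  # step 2: P(w2 | w1)
    {"brown": 0.6, "red": 0.3, "fox": 0.1},      # step 3: P(w3 | w1, w2)
]

sequence = []
for probs in step_probs:
    # 1) obtain the conditional distribution given the tokens so far
    #    (looked up from the table here, computed by the model in reality)
    # 2) apply the greedy rule: select the arg-max token
    token = max(probs, key=probs.get)
    # 3) append it, so the next step conditions on the longer prefix
    sequence.append(token)

print(" ".join(sequence))  # 'The quick brown'
```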
Calculating Conditional Probability in a Generation Step