Learn Before
Conditional Probability Formula for Sequence Generation
This formula, expressed as \(\Pr(y_t \mid x, y_1, \ldots, y_{t-1})\), calculates the probability of a particular element occurring next in a sequence, given the preceding context. This context includes the initial input \(x\) and any elements \(y_1, \ldots, y_{t-1}\) that have already been generated.
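A minimal sketch of how this conditional probability is typically produced in practice: a model assigns a score (logit) to every candidate next element given the context, and a softmax converts those scores into a probability distribution. The vocabulary and logit values below are hypothetical, chosen only to illustrate the mechanics.

```python
import math

def softmax(logits):
    # Convert raw scores into probabilities that sum to 1.
    # Subtracting the max is a standard trick for numerical stability.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical candidate next words and the logits a model might
# assign to them given some preceding context x, y_1, ..., y_{t-1}.
vocab = ["salt", "pepper", "timing", "work"]
logits = [3.2, 1.5, 0.4, 0.1]

probs = softmax(logits)
next_probs = dict(zip(vocab, probs))

# Greedy decoding: select the element with the highest
# conditional probability as the next item in the sequence.
best = max(next_probs, key=next_probs.get)
```

Because the logits depend on the context, the same word receives a different conditional probability under different preceding text, which is exactly what the formula captures.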
Tags
Ch.4 Alignment - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Examples of Requirement-Based Text Generation
Example of Constrained Text Generation: Chinese Poem
A user provides the following instructions to a text generation model: 'Write a three-sentence summary of a company's quarterly earnings report. The tone must be formal and objective, and it must include the exact phrase "year-over-year growth".' Which of the following outputs best fulfills all the user's requirements?
Evaluating Generated Text Against User Requirements
Conditional Probability Formula for Sequence Generation
Formulating Effective Text Generation Instructions
Learn After
A text generation model is tasked with completing the sentence: 'The mountain climber reached the summit and felt a sense of'. The model calculates the probability of several potential next words based on the preceding text. Given the following calculated probabilities, which word will the model select to continue the sequence?
Analyzing Repetitive Model Output
A language model generates text by calculating the probability of the next word given all the preceding words. Consider the following two contexts:
Context A: 'The chef carefully seasoned the soup. He tasted it and decided it needed more'
Context B: 'The comedian carefully timed the joke. He tested it and decided it needed more'
Which statement best analyzes the likely probability the model would assign to the word 'salt' as the very next word?