Learn Before
Autoregressive Conditional Probability
In sequential modeling, autoregressive conditional probability refers to the likelihood of a specific element x_i occurring in a sequence, given all the elements that appeared before it, from x_0 to x_{i-1}. This concept is formally expressed as the conditional probability Pr(x_i | x_0, ..., x_{i-1}).
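A minimal Python sketch of this idea: estimate Pr(x_i | x_0, ..., x_{i-1}) by counting how often each full prefix is followed by each token. The three-sequence toy corpus, variable names, and counting scheme below are illustrative assumptions, not part of the definition above.

```python
from collections import defaultdict

# Toy corpus of token sequences (illustrative assumption).
corpus = [
    ["the", "cat", "sat"],
    ["the", "cat", "ran"],
    ["the", "dog", "sat"],
]

# Count each full prefix and each (prefix, next-token) continuation.
prefix_counts = defaultdict(int)
continuation_counts = defaultdict(int)
for seq in corpus:
    for i, token in enumerate(seq):
        prefix = tuple(seq[:i])  # (x_0, ..., x_{i-1})
        prefix_counts[prefix] += 1
        continuation_counts[(prefix, token)] += 1

def autoregressive_prob(prefix, token):
    """Estimate Pr(x_i = token | x_0, ..., x_{i-1} = prefix) by counting."""
    prefix = tuple(prefix)
    if prefix_counts[prefix] == 0:
        return 0.0
    return continuation_counts[(prefix, token)] / prefix_counts[prefix]

print(autoregressive_prob([], "the"))              # 1.0 (every sequence starts with 'the')
print(autoregressive_prob(["the"], "cat"))         # 2/3
print(autoregressive_prob(["the", "cat"], "sat"))  # 0.5
```

Note that the conditioning context is the entire prefix, not just the previous token, which is exactly what distinguishes autoregressive conditioning from simpler Markov assumptions.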

Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Chain Rule
Autoregressive Conditional Probability
General Notation for Conditional Probability Models
Prediction via Optimization
A language model is analyzing a text corpus of 10,000 two-word phrases. The analysis reveals the following counts:
- The word 'deep' is the first word in 400 phrases.
- The word 'learning' is the second word in 250 phrases.
- The specific phrase 'deep learning' occurs 80 times.
Based on this data, what is the probability that the second word of a phrase is 'learning', given that the first word is 'deep'?
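For a question like this, the conditional probability is the count of the full phrase divided by the count of the conditioning (first) word; the other counts are not needed. A quick Python sketch using the counts stated above:

```python
# Counts from the corpus described above.
total_phrases = 10_000
count_first_deep = 400        # phrases whose first word is 'deep'
count_second_learning = 250   # phrases whose second word is 'learning'
count_deep_learning = 80      # occurrences of the exact phrase 'deep learning'

# Pr(second = 'learning' | first = 'deep')
#   = Pr('deep learning') / Pr(first = 'deep')
#   = (80 / 10000) / (400 / 10000)
#   = 80 / 400
p_learning_given_deep = count_deep_learning / count_first_deep
print(p_learning_given_deep)  # 0.2
```

The totals for 'learning' as a second word and for the whole corpus cancel out of the calculation; only the counts involving the conditioning event remain.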
Predictive Text Model Comparison
Interpreting Conditional Probabilities in Text
Learn After
Chain Rule for Sequence Probability
Conditional Probability of the Next Token
A model is generating a sequence of words. It has already produced the words 'The', 'quick', 'brown'. According to the principle of autoregressive conditional probability, which expression correctly represents the likelihood that the next word will be 'fox', given the preceding words?
Defining Probability for a Token in a Sequence
A model is generating a sequence of elements (x₀, x₁, x₂, x₃, ...). To calculate the probability of the fourth element (x₃), the model's calculation must be conditioned on the entire preceding subsequence (x₀, x₁, x₂). A simplified model that conditions the probability of x₃ only on the immediately preceding element (x₂) would still be correctly applying the principle of autoregressive conditional probability.
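To see why conditioning x₃ only on x₂ is a different (first-order Markov) assumption rather than an application of the autoregressive principle, here is a hedged Python sketch on a made-up two-sequence corpus chosen so the two conditionings give different answers:

```python
from collections import defaultdict

# Toy corpus chosen so full-prefix and last-token conditioning disagree
# (the sequences themselves are an illustrative assumption).
corpus = [
    ("a", "b", "c", "d"),
    ("e", "f", "c", "g"),
]

def conditional_prob(context_fn, token):
    """Estimate Pr(x_3 = token | context_fn(x_0, x_1, x_2)) by counting,
    evaluated at the context of the first sequence."""
    ctx_counts, joint_counts = defaultdict(int), defaultdict(int)
    for seq in corpus:
        ctx = context_fn(seq[:3])
        ctx_counts[ctx] += 1
        joint_counts[(ctx, seq[3])] += 1
    query_ctx = context_fn(corpus[0][:3])
    return joint_counts[(query_ctx, token)] / ctx_counts[query_ctx]

full = lambda prefix: tuple(prefix)    # Pr(x_3 | x_0, x_1, x_2): autoregressive
markov = lambda prefix: (prefix[-1],)  # Pr(x_3 | x_2): first-order Markov simplification

print(conditional_prob(full, "d"))    # 1.0: the full prefix ('a','b','c') always precedes 'd'
print(conditional_prob(markov, "d"))  # 0.5: x_2 = 'c' precedes 'd' once and 'g' once
```

Because the two context functions yield different probabilities for the same token, dropping x₀ and x₁ from the conditioning set changes the model, which is why the simplified model in the statement is not applying the autoregressive principle.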