Model Comparison Using Joint Sequence Probability
A researcher is comparing two language models, Model A and Model B, by evaluating the joint probability they assign to complete sentences. The input sequence x is 'The old library held...'. The researcher considers two possible output sequences: y1 = '...a secret.' and y2 = '...many books.'. The models produce the following joint probabilities for the concatenated sequences [x, y]:
(Table: joint probabilities Pr([x, y]) assigned by Model A and Model B to [x, y1] and [x, y2]; the values did not survive export.)
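To make the comparison concrete, here is a minimal sketch of how a joint sequence probability is computed and compared across models. The per-token conditional probabilities and the specific values below are hypothetical stand-ins (the real values would come from each model's softmax at every step); the chain rule gives log Pr([x, y]) as the sum of per-token conditional log-probabilities.

```python
import math

def joint_log_prob(token_probs, tokens):
    """Sum log Pr(token_t | tokens_<t) over the sequence = log Pr of the whole sequence."""
    total = 0.0
    for t in range(len(tokens)):
        context = tuple(tokens[:t])
        total += math.log(token_probs[(context, tokens[t])])
    return total

# Hypothetical conditional probabilities for two toy "models".
# Each maps (context, next_chunk) -> Pr(next_chunk | context).
model_a = {
    ((), "The old library held"): 0.10,
    (("The old library held",), "a secret."): 0.20,
    (("The old library held",), "many books."): 0.60,
}
model_b = {
    ((), "The old library held"): 0.10,
    (("The old library held",), "a secret."): 0.50,
    (("The old library held",), "many books."): 0.30,
}

seq1 = ["The old library held", "a secret."]    # [x, y1]
seq2 = ["The old library held", "many books."]  # [x, y2]

# Higher joint log-probability means the model finds that full sentence more likely.
print(joint_log_prob(model_a, seq2) > joint_log_prob(model_a, seq1))  # Model A prefers y2
print(joint_log_prob(model_b, seq1) > joint_log_prob(model_b, seq2))  # Model B prefers y1
```

With these made-up numbers, the two models rank the completions differently, which is exactly the kind of disagreement the researcher's comparison is designed to surface.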
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Conditional vs. Joint Probability Objectives in Language Modeling
Relationship Between Joint, Conditional, and Marginal Log-Probabilities of Sequences
General Language Modeling Objective based on Joint Log-Probability
A language model is being used to determine the likelihood of a specific sentence. Let the input sequence x be 'The sun is' and the output sequence y be 'shining brightly'. The notation Pr([x, y]) represents the probability of the model generating the full, combined sequence. Which statement best analyzes what this probability value signifies?
Analysis of Sequence Order on Joint Probability
Conditional Log-Probability via Joint and Marginal Log-Probabilities
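The identity linking the joint, conditional, and marginal log-probabilities named in the related items above can be written out as follows; it shows why comparing models on Pr([x, y]) is not the same as comparing them on Pr(y | x):

```latex
\log \Pr([x, y]) = \log \Pr(x) + \log \Pr(y \mid x)
```

For a fixed input x, the joint log-probability differs from the conditional log-probability by the model-specific term log Pr(x), so two models can rank completions identically under Pr(y | x) yet differ under Pr([x, y]) if they assign different probabilities to x itself.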