Learn Before
Predicting Next-Word Likelihood
A language model is given the following context: 'The children were excited to go to the park and play on the...' The model needs to predict the next word. Consider the four words below. Rank them from most likely to least likely to be generated by a well-trained language model. Briefly explain your reasoning for the ranking, referencing the concept of the conditional probability of a word given its context.
Words: swing, moon, algorithm, house
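Ranking candidates by conditional probability can be sketched as follows. This is a minimal illustration, not real model output: the logit values below are hypothetical scores a well-trained model might assign to each candidate given the context, and the softmax converts them into the conditional distribution Pr(word | context).

```python
import math

# Hypothetical logits for each candidate word given the context
# "...play on the..." (illustrative values, not actual model output).
logits = {"swing": 6.0, "house": 2.5, "moon": 0.5, "algorithm": -2.0}

def softmax(scores):
    """Convert raw scores into conditional probabilities Pr(word | context)."""
    m = max(scores.values())  # subtract max for numerical stability
    exps = {w: math.exp(s - m) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

probs = softmax(logits)
ranking = sorted(probs, key=probs.get, reverse=True)
print(ranking)  # most likely to least likely
```

The ranking follows directly from the logits: "swing" fits the park context best, so it receives the highest conditional probability, while "algorithm" is contextually implausible and receives the lowest.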
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Ch.5 Inference - Foundations of Large Language Models
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Fundamental LLM Training Objective
LLM Policy as a Probability Distribution
A language model is given the context: 'The chef carefully added the final, crucial ingredient to the simmering stew: a pinch of...'. The model must predict the next word. Below are the conditional probabilities, Pr(next_word | context), calculated by two different models for four possible next words.

Next Word   Model A Probability   Model B Probability
salt        0.65                  0.20
concrete    0.02                  0.45
laughter    0.03                  0.15
thyme       0.30                  0.20

Based on this data, which of the following statements is the most accurate analysis of the models' understanding of the context?
Mathematical Notation for Text Generation Probability
Evaluating Language Model Suitability
Predicting Next-Word Likelihood
Loss Function for Language Modeling
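The Model A vs. Model B comparison in the related question above can be sketched directly from its probability table. This is a minimal sketch: it simply finds each model's most likely next word and shows how the two distributions differ in their treatment of contextually plausible words.

```python
# Conditional probabilities Pr(next_word | context) from the related question.
model_a = {"salt": 0.65, "concrete": 0.02, "laughter": 0.03, "thyme": 0.30}
model_b = {"salt": 0.20, "concrete": 0.45, "laughter": 0.15, "thyme": 0.20}

def top_word(dist):
    """Return the word with the highest conditional probability."""
    return max(dist, key=dist.get)

print(top_word(model_a))  # 'salt'
print(top_word(model_b))  # 'concrete'

# Model A places almost all its mass (0.95) on the plausible ingredients
# salt and thyme, while Model B's top choice is an implausible word,
# suggesting Model A has a better grasp of the context.
plausible_mass_a = model_a["salt"] + model_a["thyme"]
plausible_mass_b = model_b["salt"] + model_b["thyme"]
```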