Learn Before
Predictive Inference in Large Language Models
In the context of a Large Language Model (LLM), predictive inference involves selecting an output by optimizing the model's probability distribution. This is formally expressed by combining the general prediction formula, which starts as ŷ = argmax_y, with the specific probability function of the model, denoted as Pr^s(y|x; θ). In this notation, θ represents the model's parameters, and the superscript s may refer to a specific scoring method.
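The argmax selection above can be sketched in a few lines of Python. This is a toy illustration, not an actual LLM: the lookup table `model_probs` is a hypothetical stand-in for the model's learned conditional distribution Pr^s(y|x; θ), and all names here are illustrative assumptions.

```python
# Toy sketch of predictive inference: y_hat = argmax over y of Pr(y | x; theta).
# model_probs is a hypothetical stand-in for an LLM's conditional distribution.

def predict(model_probs, x):
    """Return the output y with the highest conditional probability Pr(y | x)."""
    dist = model_probs[x]           # the distribution Pr(. | x; theta) for context x
    return max(dist, key=dist.get)  # argmax over the space of possible outputs y

# Hypothetical conditional distribution for a single context x.
model_probs = {
    "The capital of France is": {"Paris": 0.92, "Lyon": 0.05, "Nice": 0.03},
}

y_hat = predict(model_probs, "The capital of France is")
print(y_hat)  # the single most probable output for this context
```

In a real LLM the output space is far too large to enumerate exhaustively, so the argmax is approximated with search procedures such as greedy or beam decoding; the formula itself is unchanged.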

Tags
Ch.4 Alignment - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Predictive Inference by Maximizing Conditional Probability
Predictive Inference in Large Language Models
A predictive model is designed to generate an output, ŷ, that minimizes a given loss function, Loss(y, ŷ), which measures the error of the prediction. Which of the following expressions correctly represents the formal definition of the model's predicted value, ŷ?
Formal Prediction Expressions for Different Objectives
A predictive model's output, ŷ, is formally defined as the argument that optimizes an objective function. Match each modeling objective to the correct formal expression that represents it.
Learn After
Applying Predictive Inference
A large language model, with a fixed set of parameters θ and using a specific scoring method s, is performing an inference task. Its goal is to select the single most probable output ŷ from the entire space of possible outputs y. Which mathematical expression correctly represents this process?
Deconstructing the Predictive Inference Formula