Learn Before
Concept

State in the Context of LLMs

In language modeling, a state at a specific time step, denoted $s$, is defined as the sequence of tokens observed up to that point. This sequence serves as the context the model uses to predict the next token. For instance, when predicting the next token at time step $t$, the state can be written as $(\mathbf{x}, \mathbf{y}_{<t})$, where $\mathbf{x}$ is the initial input and $\mathbf{y}_{<t}$ is the sequence of tokens generated so far.
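As a minimal sketch of this definition (all names here are hypothetical, for illustration only), the state at step $t$ is just the concatenation of the prompt tokens $\mathbf{x}$ with the generated tokens $\mathbf{y}_{<t}$:

```python
def state_at(prompt_tokens, generated_tokens, t):
    """Return the state (x, y_{<t}): the prompt plus all tokens
    generated before time step t."""
    return tuple(prompt_tokens) + tuple(generated_tokens[:t])

x = ["The", "sky", "is"]         # initial input x
y = ["blue", "and", "clear"]     # tokens generated so far, y

# State when predicting the token at step t = 2:
print(state_at(x, y, 2))  # ('The', 'sky', 'is', 'blue', 'and')
```

Each generation step appends the newly sampled token, so the state grows by one token per step; this is what makes autoregressive decoding a sequential process.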

Updated 2026-05-01

Tags

Ch.4 Alignment - Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences