Token-Level Representation of Input and Output Sequences for a Forward Pass

A forward pass in a language model processes a single concatenated token sequence containing both the input and the output generated so far. For instance, an input sequence of tokens x0 x1 x2 x3 is extended with generated output tokens such as y1 and y2: the model first processes x0 x1 x2 x3 to predict y1, then processes the combined sequence x0 x1 x2 x3 y1 to predict the next token, y2.
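The concatenation step above can be sketched in a minimal way. This is not a real model: `toy_next_token` is a hypothetical stand-in for a language model's forward pass, chosen only to show how each prediction is made from the input tokens plus every token generated so far.

```python
def toy_next_token(tokens):
    # Hypothetical stand-in for a model forward pass. A real model would
    # run the whole token sequence through the network; here we just
    # derive the next output index from the sequence length, assuming
    # four input tokens x0..x3.
    return f"y{len(tokens) - 3}"

def generate(input_tokens, n_steps):
    sequence = list(input_tokens)  # start from the input x0 x1 x2 x3
    outputs = []
    for _ in range(n_steps):
        # Each forward pass sees the input concatenated with all
        # previously generated output tokens.
        next_tok = toy_next_token(sequence)
        outputs.append(next_tok)
        sequence.append(next_tok)  # append prediction for the next pass
    return outputs

print(generate(["x0", "x1", "x2", "x3"], 2))  # ['y1', 'y2']
```

The first call sees x0 x1 x2 x3 and yields y1; the second call sees x0 x1 x2 x3 y1 and yields y2, mirroring the two forward passes described above.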

Updated 2025-10-09

Tags

Ch.4 Alignment - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences