Context Shifting in Auto-Regressive Generation

In auto-regressive language modeling, text generation is an iterative process. At every step, after the model generates an output token, this new token is appended to the existing input sequence. This updated, longer sequence then serves as the context for predicting the subsequent token, effectively shifting the model's focus one position forward.
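This token-by-token loop can be sketched in a few lines of Python. The `next_token` function below is a hypothetical stand-in for a real model's prediction step (an actual LLM would score the entire vocabulary given the context); the point is the loop structure: predict, append, repeat with the longer context.

```python
def next_token(context):
    # Toy "model": deterministically maps the current context to the
    # next token. A real language model would compute a probability
    # distribution over the vocabulary here and sample or take argmax.
    return context[-1] + 1

def generate(prompt, num_steps):
    tokens = list(prompt)
    for _ in range(num_steps):
        tok = next_token(tokens)  # predict from the full context so far
        tokens.append(tok)        # append: the context grows by one token
    return tokens

print(generate([1, 2, 3], 4))  # → [1, 2, 3, 4, 5, 6, 7]
```

Each iteration re-runs the model on the sequence produced so far, which is why generation cost grows with output length unless intermediate state (e.g., a key-value cache) is reused.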

Updated 2025-10-10


Ch.2 Generative Models - Foundations of Large Language Models