Concept

Text Generation Based on Long Context

This category of long sequence modeling involves tasks where a language model produces text based on an input context that is an extensive sequence. In terms of the text generation probability notation $\Pr(\mathbf{y}|\mathbf{x})$, this corresponds to scenarios where the context $\mathbf{x}$ is a long sequence, while the generated text $\mathbf{y}$ is typically shorter.
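The probability above factorizes autoregressively: $\Pr(\mathbf{y}|\mathbf{x}) = \prod_t \Pr(y_t \mid \mathbf{x}, \mathbf{y}_{<t})$. The sketch below illustrates this setting with a deliberately trivial stand-in for the model (a uniform next-token distribution, an assumption made purely for illustration); a real system would query a trained language model, but the shape of the computation, a long context $\mathbf{x}$ scored against a short generation $\mathbf{y}$, is the same.

```python
import math

# Toy vocabulary; a real system would use a trained language model's
# tokenizer and learned distribution (this stand-in is an assumption
# for illustration only).
VOCAB = ["a", "b", "c", "</s>"]

def next_token_probs(context):
    # Hypothetical model: uniform distribution over the vocabulary,
    # ignoring the context entirely.
    return {tok: 1.0 / len(VOCAB) for tok in VOCAB}

def log_prob_of_generation(x, y):
    """Compute log Pr(y | x) = sum_t log Pr(y_t | x, y_<t)."""
    context = list(x)
    total = 0.0
    for tok in y:
        probs = next_token_probs(context)
        total += math.log(probs[tok])
        context.append(tok)  # condition on tokens generated so far
    return total

# Long context x, short generation y -- the setting described above.
x = ["a", "b"] * 1000   # 2000-token context
y = ["c", "</s>"]       # short output
print(log_prob_of_generation(x, y))
```

With the uniform stand-in, each of the two output tokens contributes $\log(1/4)$, so the result is $2\log(1/4) \approx -2.77$; the long context changes nothing here only because the toy model ignores it.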


Updated 2026-04-22


Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences