Concept

Long Text Generation from a Long Context

This category of long-sequence modeling covers tasks where both the input context and the generated output are long token sequences. In the text-generation probability \Pr(\mathbf{y}|\mathbf{x}), both the context \mathbf{x} and the generated text \mathbf{y} are long sequences.
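Under the standard autoregressive factorization, \Pr(\mathbf{y}|\mathbf{x}) decomposes into a product of per-token conditionals, \prod_t \Pr(y_t \mid \mathbf{x}, \mathbf{y}_{<t}). A minimal sketch of computing this log-probability, assuming a hypothetical `token_log_prob` callable standing in for a real language model:

```python
import math

def sequence_log_prob(context, target, token_log_prob):
    # Autoregressive factorization:
    # log Pr(y|x) = sum over t of log Pr(y_t | x, y_<t).
    # `token_log_prob(prefix, tok)` is an assumed model interface.
    total = 0.0
    prefix = list(context)
    for tok in target:
        total += token_log_prob(prefix, tok)
        prefix.append(tok)  # grow the conditioning prefix one token at a time
    return total

# Toy stand-in model: uniform over a 4-token vocabulary.
uniform = lambda prefix, tok: math.log(0.25)

lp = sequence_log_prob(["x1", "x2"], ["y1", "y2", "y3"], uniform)
print(lp == 3 * math.log(0.25))  # one log(1/4) term per generated token
```

With a long context \mathbf{x} and a long output \mathbf{y}, the conditioning prefix grows with every step, which is what makes this setting computationally demanding.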

Updated 2026-04-22

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences