Long Text Generation from a Long Context
This category of long sequence modeling covers complex tasks where both the input context and the generated output are extensive token sequences. In the text generation probability notation Pr(y|x), this describes the case where both the context x and the generated text y are long sequences.
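For reference, a minimal sketch of how this probability is commonly decomposed, assuming the standard autoregressive factorization (the chapter's exact notation may differ slightly):

Pr(y|x) = ∏_{t=1}^{|y|} Pr(y_t | x, y_{<t})

Here y_{<t} denotes the output tokens generated before step t. When both x and y are long, every factor must condition on a long and growing prefix, which is what makes this setting particularly demanding for a model.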
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Text Generation Based on Long Context
Long Text Generation
An AI-powered assistant is tasked with summarizing a 200-page research paper into a single, concise paragraph. In the context of text generation probability, represented as Pr(y|x), how would this task be classified based on the relative lengths of the input and output sequences?
Long Text Generation from a Long Context
Match each text generation task with the description that best represents the relationship between the lengths of its input context (x) and its generated output (y).
Classifying a Code Refactoring Task
Learn After
Example of Long Text Generation Based on Long Context: Document Translation
A research team is evaluating different text generation tasks for a new language model. Which of the following scenarios best represents a task where the model must process an extensive, detailed input sequence to produce an equally extensive and detailed output sequence?
AI Tool for Legal Document Analysis
A user provides a 50-page scientific paper to a language model and asks it to 'Generate a one-sentence summary of the main finding.' This task is a prime example of text generation based on a long context, where the input is a long sequence but the generated output is short.