Text Generation Based on Long Context
This category of long sequence modeling involves tasks where a language model produces text based on an input context that is an extensive sequence. In terms of the text generation probability notation Pr(y|x), this corresponds to scenarios where the context x is a long sequence, while the generated text y is typically shorter.
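As a toy sketch (not from the source), the conditional probability Pr(y|x) can be factorized autoregressively as Pr(y|x) = ∏ₜ Pr(yₜ | x, y₍₋ₜ₎), where a long context x conditions the generation of a much shorter output y. The `cond_prob` function below is a hypothetical stand-in for a language model's next-token distribution, used only to make the length asymmetry concrete.

```python
import math

def cond_prob(token, context, prefix):
    # Hypothetical toy model: assigns probability 0.5 to every token.
    # A real language model would return Pr(token | context, prefix).
    return 0.5

def sequence_log_prob(x, y):
    """Log Pr(y|x) under the toy model, accumulated token by token:
    log Pr(y|x) = sum_t log Pr(y_t | x, y_<t)."""
    total = 0.0
    for t in range(len(y)):
        total += math.log(cond_prob(y[t], x, y[:t]))
    return total

# A long context (e.g., a lengthy paper, abstracted here as many tokens)
x = ["tok"] * 10_000
# A short generated output, e.g., a one-paragraph summary
y = ["short", "summary", "here"]

logp = sequence_log_prob(x, y)
print(len(x), len(y), logp)
```

The point of the sketch is the shape of the task, not the probabilities: len(x) is far larger than len(y), which is exactly the regime this category of long-context text generation describes.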
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Long Text Generation
An AI-powered assistant is tasked with summarizing a 200-page research paper into a single, concise paragraph. In the context of text generation probability, represented as Pr(y|x), how would this task be classified based on the relative lengths of the input and output sequences?
Long Text Generation from a Long Context
Match each text generation task with the description that best represents the relationship between the length of its input context (x) and its generated output (y).
Classifying a Code Refactoring Task
Learn After
Example of Long-Context Text Generation: Summarization
Example of Long-Context Task: Code Functionality Outlining
A research team is evaluating different computational models for their ability to handle various language tasks. Which of the following tasks is the best example of a problem where a model must generate new text based on an input that is an exceptionally long sequence?
Analyzing a Business Intelligence Task
Characterizing a Class of Language Model Problems