Reframing a Traditional NLP Task
Consider the task of 'Named Entity Recognition,' which involves identifying and classifying entities like names of people, organizations, and locations within a sentence. For example, in the sentence 'Apple was founded by Steve Jobs in Cupertino,' the entities are 'Apple' (Organization), 'Steve Jobs' (Person), and 'Cupertino' (Location). Describe how you would reframe this task as a text generation problem for a large language model to solve, using the provided example sentence.
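One possible reframing is to wrap the sentence in an instruction that asks the model to generate the entity list as plain text. A minimal sketch (the prompt wording and the `build_ner_prompt` helper are illustrative assumptions, not a fixed API):

```python
# Sketch of reframing NER as text generation: instead of predicting
# token-level labels, we ask the model to *generate* the answer as text.
def build_ner_prompt(sentence: str) -> str:
    """Wrap a sentence in an instruction prompt for generative NER.
    The exact instruction wording is a hypothetical example."""
    return (
        "Extract the named entities from the sentence below and label "
        "each one as Person, Organization, or Location.\n"
        f"Sentence: {sentence}\n"
        "Entities:"
    )

prompt = build_ner_prompt("Apple was founded by Steve Jobs in Cupertino.")
print(prompt)
# The model would then be expected to continue the text with something like:
# Apple (Organization); Steve Jobs (Person); Cupertino (Location)
```

The key shift is that the labels are no longer a separate output layer: they appear in the generated continuation itself, so any instruction-following LLM can perform the task without task-specific training.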
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Ch.2 Generative Models - Foundations of Large Language Models
Ch.3 Prompting - Foundations of Large Language Models
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Example of Reframing Text Classification as Text Generation
Instruction-based Prompts
Few-Shot Learning
Alternative Prompt Formats for Machine Translation
Text Classification in NLP
Versatility of Prompt Templates
Grammaticality Judgment as a Binary Classification Task for LLMs
Formal Definition of LLM Inference
Illustrative Purpose of Prompting Examples
The paradigm of using Large Language Models (LLMs) allows for many different NLP tasks (e.g., translation, sentiment analysis) to be reframed as a text generation problem. What is the fundamental advantage of this approach over traditional methods that required building a separate, specifically trained model for each individual task?
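The advantage can be made concrete with a single prompt-template table that serves several tasks at once. A minimal sketch (the `TASK_TEMPLATES` dictionary and its wording are hypothetical, for illustration only):

```python
# Sketch: one generative model, many tasks — only the instruction changes.
# With traditional pipelines, each entry below would require a separately
# trained, task-specific model.
TASK_TEMPLATES = {
    "translation": "Translate the following English text into German:\n{text}\n",
    "sentiment": (
        "Classify the sentiment of the following review as "
        "positive or negative:\n{text}\n"
    ),
}

def make_prompt(task: str, text: str) -> str:
    """Build a task-specific prompt from a shared template table."""
    return TASK_TEMPLATES[task].format(text=text)

print(make_prompt("translation", "The weather is nice today."))
print(make_prompt("sentiment", "The battery life is terrible."))
```

Because every task is expressed as text-in, text-out, a single pre-trained model can handle all of them, which removes the cost of building, training, and maintaining one specialized model per task.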
Reframing a Traditional NLP Task
Choosing an NLP Development Strategy