Learn Before
Instructing LLMs with Detailed Descriptions
When a task is challenging to define using an attribute-based format, Large Language Models can be instructed using clear, comprehensive descriptions. A common strategy involves assigning a specific role or persona to the model and providing adequate context, which guides the model to adopt the desired perspective and constraints when generating its response.
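As a minimal sketch of this strategy, the snippet below assembles a single descriptive prompt that assigns a persona, supplies background context, and states the task in plain language. The function name and the example role, context, and ticket text are illustrative assumptions, not part of the original lesson:

```python
def build_descriptive_prompt(role, context, task, text):
    """Assemble an instruction-based prompt that assigns a persona,
    provides context, and describes the task in full sentences."""
    return (
        f"You are {role}.\n"          # persona: the perspective to adopt
        f"Context: {context}\n"       # background that constrains the answer
        f"Task: {task}\n"             # clear, comprehensive task description
        f"Input: {text}\n"
        "Answer:"
    )

# Hypothetical usage for a support-ticket classification task.
prompt = build_descriptive_prompt(
    role="an experienced customer-support analyst",
    context="Tickets come from users of a subscription software product.",
    task=(
        "Classify the ticket as 'Technical Issue', 'Billing Inquiry', "
        "or 'General Feedback'. Reply with the category name only."
    ),
    text="I was charged twice for my subscription this month.",
)
print(prompt)
```

The resulting string would then be sent to the model; keeping the role, context, and task in separate labeled lines makes it easy to vary each component independently when iterating on the prompt.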
Tags
Foundations of Large Language Models
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Example of a Complete Prompt for Polarity Classification
Components of an Instruction-based Prompt
Zero-Shot Learning with LLMs
Example of a Zero-Shot Prompt for Polarity Classification (Negative Sentiment)
Examples of Instruction-based Prompts for Polarity Classification
Using Descriptive Prompts for Novel Classification Tasks
Challenge of Prompting LLMs for Many-Category Classification
Example of a Zero-Shot Prompt for Polarity Classification (Positive Sentiment)
Example of a Zero-Shot Prompt for Polarity Classification (Positive Sentiment on Food)
Adapting Prompt Detail to an LLM's Task Familiarity
A developer needs a large language model to classify incoming customer support tickets. The goal is to sort each ticket into one of three specific categories: 'Technical Issue', 'Billing Inquiry', or 'General Feedback'. Which of the following prompts is best structured to achieve this task reliably and consistently?
Diagnosing Ineffective Prompt Instructions
Crafting an Instruction for a Novel Task