Learn Before
Refining a Prompt for Feature Request Identification
Based on the case study below, rewrite the data scientist's prompt to solve the problem of inconsistent and verbose model outputs. Your revised prompt must explicitly define the required answer format and the meaning of each possible response.
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Example of Defining Answer Semantics for Grammaticality Judgment
Example of Defining Category Semantics in a Polarity Classification Prompt
A data scientist is using a language model to classify customer feedback into 'Bug Report' or 'Feature Request'. Their initial prompt is:

Feedback: 'The app crashes when I try to upload a photo.' What kind of feedback is this?

They observe that the model's outputs are inconsistent, including responses like 'This is a bug report,' 'It seems like a bug,' and 'The user is reporting a problem with the app.' Which of the following revised prompts best addresses this inconsistency by explicitly defining the required output format and the meaning of the categories?
Improving Prompt Specificity for Automated Data Extraction
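A revised prompt of the kind the case study calls for can be sketched in code. The sketch below is illustrative, not a reference solution: the template wording, the label set, and the `parse_label` helper are all assumptions introduced here. It shows the two fixes the exercise asks for, defining each category's meaning and constraining the answer to an exact label, plus a small validator that rejects verbose replies like 'It seems like a bug'.

```python
# Illustrative sketch: a constrained classification prompt and an output
# validator. The model call itself is omitted; only prompt construction
# and reply validation are shown. Names here are hypothetical.

ALLOWED_LABELS = {"Bug Report", "Feature Request"}

PROMPT_TEMPLATE = (
    "Classify the customer feedback below into exactly one category.\n"
    "Categories:\n"
    "- Bug Report: the user describes something broken or not working as intended.\n"
    "- Feature Request: the user asks for new or improved functionality.\n"
    "Respond with only the category name and nothing else.\n"
    "\n"
    "Feedback: {feedback}\n"
    "Category:"
)

def build_prompt(feedback: str) -> str:
    """Fill the constrained template with one piece of feedback."""
    return PROMPT_TEMPLATE.format(feedback=feedback)

def parse_label(raw_reply: str) -> str:
    """Strip whitespace and reject any reply outside the label set."""
    label = raw_reply.strip()
    if label not in ALLOWED_LABELS:
        raise ValueError(f"Unexpected model output: {raw_reply!r}")
    return label
```

Because the prompt names the exact strings the model may emit, the validator can treat anything else as an error rather than trying to interpret free-form prose.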
Example of a Constraint-First Prompt for Grammaticality Judgment