Learn Before
A user provides a language model with the following examples to teach it a new task:
Input: Apple -> Output: A
Input: Banana -> Output: B
Input: Cherry -> Output: C
When the user then provides the new input "Input: Grape", the model responds with "Output: G". The user expected the output to be the full word "Grape".
Which of the following best explains why the model produced an unexpected result?
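To see why "G" is the natural completion, it helps to check which rules the demonstrations are actually consistent with. The sketch below (a hypothetical reconstruction, not from the original question) builds the few-shot prompt and shows that every example fits the "return the first letter" pattern, while none fits the "echo the word back" pattern the user had in mind:

```python
examples = [("Apple", "A"), ("Banana", "B"), ("Cherry", "C")]

# The prompt as the model would see it
prompt = "\n".join(f"Input: {inp} -> Output: {out}" for inp, out in examples)
prompt += "\nInput: Grape -> Output:"

first_letter = lambda word: word[0]

# Every demonstration is consistent with the first-letter rule...
assert all(first_letter(inp) == out for inp, out in examples)
# ...and none is consistent with echoing the full word back.
assert all(inp != out for inp, out in examples)

print(first_letter("Grape"))  # the pattern the examples teach yields "G"
```

Because the demonstrations themselves teach the first-letter mapping, a model that generalizes the pattern correctly will complete the prompt with "G", not "Grape".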
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Use of Simpler Patterns in Few-Shot Learning
Constructing an Effective Input-Output Pattern
Predicting Model Behavior from a Few-Shot Pattern