Learn Before
Few-Shot Chain-of-Thought (CoT) Prompting
Few-shot Chain-of-Thought (CoT) prompting is an extension of the one-shot approach in which multiple demonstrations, each with its own detailed reasoning steps, are included in the prompt. This gives the model a richer set of examples to learn from, further improving its ability to generate a well-reasoned response.
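The assembly of such a prompt can be sketched as follows. This is a minimal illustration, not code from the source: the demonstration wording (adapted from the arithmetic examples below) and the `build_few_shot_cot_prompt` helper are assumptions.

```python
# Minimal sketch of building a few-shot CoT prompt (illustrative only).
# Each demonstration pairs a question with worked reasoning and an answer.
demonstrations = [
    {
        "question": "A cafeteria had 23 apples. If they used 20 to make lunch "
                    "and bought 6 more, how many apples do they have?",
        "reasoning": "The cafeteria started with 23 apples. They used 20, so "
                     "they had 23 - 20 = 3 apples. Then they bought 6 more, "
                     "so they now have 3 + 6 = 9 apples.",
        "answer": "9",
    },
    {
        "question": "Roger has 5 tennis balls. He buys 2 more cans of tennis "
                    "balls. Each can has 3 tennis balls. How many tennis "
                    "balls does he have now?",
        "reasoning": "Roger started with 5 balls. 2 cans of 3 balls each is "
                     "2 * 3 = 6 balls. 5 + 6 = 11.",
        "answer": "11",
    },
]

def build_few_shot_cot_prompt(demos, new_question):
    """Concatenate worked demonstrations, then the unsolved question,
    leaving a trailing 'Answer:' for the model to complete."""
    parts = []
    for d in demos:
        parts.append(
            f"Question: {d['question']}\n"
            f"Answer: {d['reasoning']} The final answer is {d['answer']}."
        )
    parts.append(f"Question: {new_question}\nAnswer:")
    return "\n\n".join(parts)

prompt = build_few_shot_cot_prompt(
    demonstrations,
    "A juggler has 16 balls. Half of the balls are golf balls. "
    "How many golf balls are there?",
)
print(prompt)
```

Because every demonstration ends with an explicit reasoning trace followed by "The final answer is ...", the model is nudged to reproduce that same step-by-step structure before committing to an answer for the new question.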
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Ch.3 Prompting - Foundations of Large Language Models
Related
Few-Shot Chain-of-Thought (CoT) Prompting
A user provides the following text to a large language model:
Question: A cafeteria had 23 apples. If they used 20 to make lunch and bought 6 more, how many apples do they have?
Answer: The cafeteria started with 23 apples. They used 20, so they had 23 - 20 = 3 apples. Then they bought 6 more, so they now have 3 + 6 = 9 apples. The final answer is 9.
Question: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. Each can has 3 tennis balls. How many tennis balls does he have now?
Which statement best analyzes the structure of this prompt and its intended effect on the model's problem-solving process?
Constructing a One-Shot CoT Prompt
Evaluating a Prompt for a Language Model
Example of a One-Shot CoT Prompt with Arithmetic Problems
Learn After
A developer is using a large language model to solve complex logic puzzles that require several steps of reasoning. The model consistently provides incorrect final answers without explaining its process. To improve the model's performance and elicit a step-by-step thought process, which of the following prompt structures would be most effective?
Analyzing Prompt Effectiveness for Multi-Step Calculations
Difficulty of Creating Few-Shot CoT Demonstrations
Improving Model Reasoning for a New Task
Example of a Few-Shot CoT Prompt with Mean Square Demonstration