
Zero/Few-Shot Learning

Zero-shot and few-shot learning are approaches for adapting a pre-trained model to new tasks using either no labeled examples (zero-shot) or a minimal number of examples (few-shot). This capability is typically unlocked through prompting, where the task is described to the model in natural language. This allows the model to generalize its vast pre-trained knowledge to solve problems without requiring extensive, task-specific fine-tuning.
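The difference between the two settings is visible in how the prompt is assembled. The sketch below is purely illustrative: the helper `make_prompt`, the sentiment task, and the example reviews are all assumptions for demonstration, and the actual call to a language model is omitted since it depends on the provider.

```python
def make_prompt(task_description, query, examples=None):
    """Assemble a zero-shot or few-shot prompt as a single string.

    Zero-shot: only the task description and the query are included.
    Few-shot: labeled examples are inserted before the query so the
    model can infer the task format from context.
    """
    parts = [task_description]
    for text, label in (examples or []):
        parts.append(f"Input: {text}\nOutput: {label}")
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)


# Hypothetical task and reviews, used only to illustrate prompt shape.
task = "Classify the sentiment of each movie review as positive or negative."

# Zero-shot: the model sees only the instruction and the new input.
zero_shot = make_prompt(task, "A dull, lifeless film.")

# Few-shot: a handful of labeled examples precedes the new input.
few_shot = make_prompt(
    task,
    "A dull, lifeless film.",
    examples=[
        ("An absolute triumph of storytelling.", "positive"),
        ("I walked out halfway through.", "negative"),
    ],
)
```

Note that no model weights change in either case: the "learning" happens entirely in context, which is why the same pre-trained model can be redirected to a new task just by editing the prompt.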

Updated 2026-04-29

Tags

Ch.1 Pre-training - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences