Learn Before
One-Shot Transfer Learning
One-shot transfer learning is an extreme form of transfer learning in which only one labeled example per class is needed to infer the labels of new instances. This is possible because, during the initial pre-training stage, the model learns a representation that cleanly separates the underlying classes, so a single labeled example is enough to anchor each class in that feature space.
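A common way to realize this idea is nearest-neighbor classification in the pre-trained embedding space: each class is represented by the embedding of its single labeled example, and a new instance takes the label of the closest one. The sketch below illustrates the mechanism only; the class names and the hand-written embedding vectors are hypothetical stand-ins for what a real pre-trained encoder (e.g., a CNN or transformer) would produce.

```python
import numpy as np

# Hypothetical embeddings standing in for the output of a pre-trained
# encoder. One labeled example per class forms the "support set".
support_set = {
    "oak":  np.array([0.9, 0.1, 0.0]),
    "fern": np.array([0.1, 0.9, 0.1]),
    "moss": np.array([0.0, 0.2, 0.9]),
}

def classify(query_embedding, support):
    """Return the label of the support example most similar to the query,
    using cosine similarity in the embedding space."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(support, key=lambda label: cosine(query_embedding, support[label]))

# A new instance whose embedding lands near the single "fern" example.
query = np.array([0.2, 0.8, 0.15])
print(classify(query, support_set))  # prints "fern"
```

Because all of the separation work was done during pre-training, no gradient updates are needed at this stage; the one labeled example per class is used only as a reference point.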
Tags
Transfer Learning in Deep Learning
Feature Learning (Representation Learning)
Data Science
Foundations of Large Language Models Course
Computing Sciences
Learn After
Adapting a Model for a Rare Class
A large model, pre-trained on a massive, diverse dataset, is tasked with a new classification problem: identifying a specific, newly discovered species of plant. The model is only given a single labeled image of this new plant. Which statement best analyzes the underlying principle that allows the model to potentially succeed at identifying other images of this plant?
Feasibility of One-Shot Learning for a Niche Task
A model's success in a one-shot learning scenario (e.g., classifying a new type of animal after seeing only one image) is fundamentally enabled by the model having previously learned a rich, well-organized feature space where new categories can be easily distinguished, even if it has never seen that specific category before.