Learn Before
Concept

Lifelong Distillation

Lifelong learning, an umbrella term covering continual, continuous, and meta-learning, accumulates previously learned knowledge and transfers it to future learning tasks. Knowledge distillation (KD) supports this transfer while mitigating catastrophic forgetting, and meta-learning has been used to decide what to transfer and where. Other KD-based methods, such as global distillation, KD-based lifelong GANs, and multi-model distillation, likewise target the catastrophic-forgetting problem.
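The idea of using distillation to mitigate forgetting can be sketched as a combined objective: a cross-entropy term for the new task plus a distillation term that keeps the updated model close to the old model's soft predictions (in the style of Learning without Forgetting). The function below is a hypothetical NumPy illustration, not code from any specific method named above; the temperature `T` and mixing weight `alpha` are assumed hyperparameters.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def lifelong_distillation_loss(new_logits, old_logits, labels, T=2.0, alpha=0.5):
    """Hypothetical lifelong-distillation objective (LwF-style sketch):
    cross-entropy on the current task's labels, plus a distillation term
    matching the old model's softened outputs to limit forgetting."""
    # Cross-entropy on ground-truth labels for the current task
    p_new = softmax(new_logits)
    ce = -np.mean(np.log(p_new[np.arange(len(labels)), labels] + 1e-12))
    # Distillation: cross-entropy between old (teacher) and new soft outputs
    p_old = softmax(old_logits, T)
    log_p_new_T = np.log(softmax(new_logits, T) + 1e-12)
    kd = -np.mean((p_old * log_p_new_T).sum(axis=-1))
    return alpha * ce + (1 - alpha) * kd
```

In practice the "old" logits come from a frozen copy of the model saved before training on the new task, so the distillation term penalizes drift away from previously learned behavior.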


Updated 2022-10-29

Tags

Deep Learning (in Machine learning)