Learn Before
Concept

Self Distillation

Self-distillation is a special case of online distillation in which the teacher and student are the same network. Instead of learning from a separate teacher model, the model distills knowledge from its own predictions, for example from its own earlier training snapshots or from its deeper layers.
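A minimal sketch of the idea, assuming a snapshot-style setup where the soft targets come from the same network's logits at an earlier point in training (`prev_logits` here is a hypothetical name; the temperature `T` and mixing weight `alpha` are standard distillation hyperparameters, not values from this page):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T gives softer targets.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q):
    # KL(p || q): the soft-target distillation term.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def self_distillation_loss(logits, prev_logits, label, T=2.0, alpha=0.5):
    # The "teacher" targets come from the SAME network's earlier
    # predictions (prev_logits), so no separate teacher model exists.
    student = softmax(logits, T)
    teacher = softmax(prev_logits, T)
    soft = kl_div(teacher, student) * T * T      # T^2 rescales the soft-loss gradient
    hard = -math.log(softmax(logits)[label])     # ordinary cross-entropy on the label
    return alpha * soft + (1 - alpha) * hard

loss = self_distillation_loss([2.0, 0.5, -1.0], [1.5, 0.8, -0.5], label=0)
```

When the current and previous logits agree, the KL term vanishes and only the hard-label cross-entropy remains, which is what makes the scheme stable as training converges.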


Updated 2022-10-29

Tags

Deep Learning (in Machine learning)

Related
Learn After