Learn Before
Concept
Online Distillation
In online distillation, the teacher and student models are updated simultaneously in a single end-to-end training process, rather than the student learning from a fixed, pre-trained teacher. It aims to improve student performance when a large-capacity, high-performance teacher model is not available.
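One common instance of this idea is mutual learning, where two peer networks act as teacher and student for each other and are updated at the same time. The sketch below illustrates this with two tiny linear softmax classifiers on synthetic two-blob data; each model's gradient combines a cross-entropy term with labels and a KL term pulling it toward its peer's predictions. The data, model shapes, and hyperparameters (`alpha`, `lr`) are illustrative assumptions, not from the original card.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two Gaussian blobs (assumed setup for illustration)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.zeros((100, 2))
y[:50, 0] = 1
y[50:, 1] = 1

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Two peer models updated simultaneously; neither is a fixed teacher.
W1 = rng.normal(0, 0.1, (2, 2))
W2 = rng.normal(0, 0.1, (2, 2))
lr, alpha = 0.1, 0.5  # alpha weights the mutual-distillation term

for step in range(200):
    p1, p2 = softmax(X @ W1), softmax(X @ W2)
    # Gradient of cross-entropy w.r.t. logits is (p - y); gradient of
    # KL(peer || self) w.r.t. own logits is (p_self - p_peer), with the
    # peer's predictions treated as constants.
    g1 = X.T @ ((p1 - y) + alpha * (p1 - p2)) / len(X)
    g2 = X.T @ ((p2 - y) + alpha * (p2 - p1)) / len(X)
    W1 -= lr * g1
    W2 -= lr * g2

acc1 = (softmax(X @ W1).argmax(1) == y.argmax(1)).mean()
acc2 = (softmax(X @ W2).argmax(1) == y.argmax(1)).mean()
print(acc1, acc2)
```

Both networks improve together: each one's softened predictions serve as an extra, continuously updated training signal for the other, which is the defining trait of online distillation.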
Updated 2022-10-29
Tags
Deep Learning (in Machine learning)