Varieties of Online Distillation
- In deep mutual learning, an ensemble of neural networks trains collaboratively: during training, each network acts as both a teacher and a student for its peers (see the sketch after this list).
- Co-distillation trains multiple models of the same architecture in parallel; each model is trained in part by transferring knowledge from the other models (a variant of the sketch below).
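
Deep mutual learning can be made concrete with a short sketch. Below is a minimal PyTorch version, assuming two small classifiers trained on the same mini-batch; `make_net` and `mutual_step` are hypothetical names used for illustration, not from the source. Each network minimizes its own cross-entropy plus a KL term toward the other network's detached predictions, so each one alternately plays teacher and student:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_net(num_classes=10):
    # Hypothetical peer network; in deep mutual learning the peers
    # may share an architecture or differ.
    return nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128),
                         nn.ReLU(), nn.Linear(128, num_classes))

net_a, net_b = make_net(), make_net()
opt_a = torch.optim.SGD(net_a.parameters(), lr=0.1)
opt_b = torch.optim.SGD(net_b.parameters(), lr=0.1)

def mutual_step(x, y):
    """One mutual-learning update on a mini-batch (x, y): each network
    minimizes its task loss plus a KL term toward the other's output."""
    # Update A, treating B's detached prediction as the teacher signal.
    logits_a = net_a(x)
    loss_a = F.cross_entropy(logits_a, y) + F.kl_div(
        F.log_softmax(logits_a, dim=1),
        F.softmax(net_b(x).detach(), dim=1),
        reduction="batchmean")
    opt_a.zero_grad()
    loss_a.backward()
    opt_a.step()

    # Update B with the teacher and student roles swapped.
    logits_b = net_b(x)
    loss_b = F.cross_entropy(logits_b, y) + F.kl_div(
        F.log_softmax(logits_b, dim=1),
        F.softmax(net_a(x).detach(), dim=1),
        reduction="batchmean")
    opt_b.zero_grad()
    loss_b.backward()
    opt_b.step()
```

For example, `mutual_step(torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,)))` performs one collaborative update; over many batches the peers regularize each other without any pre-trained teacher.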
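
Co-distillation can be sketched the same way; the key differences are that all peers share one architecture and each peer distills from the averaged predictions of the others. The loss helper below is an illustrative assumption (`codistill_losses` is not a library function): each element of `logits_list` is assumed to be the output of one of the identically built models on the same batch.

```python
import torch
import torch.nn.functional as F

def codistill_losses(logits_list, y):
    """Per-model co-distillation losses: task loss plus KL toward the
    mean (detached) prediction of the other, identically built models."""
    probs = [F.softmax(l.detach(), dim=1) for l in logits_list]
    losses = []
    for i, logits in enumerate(logits_list):
        # Average the predictions of every peer except model i.
        peer_mean = torch.stack(
            [p for j, p in enumerate(probs) if j != i]).mean(dim=0)
        losses.append(F.cross_entropy(logits, y) + F.kl_div(
            F.log_softmax(logits, dim=1), peer_mean,
            reduction="batchmean"))
    return losses
```

Because each model's teacher target is detached, the peers can in principle run on separate workers that only exchange predictions, which is why this variety fits parallel training.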
Updated 2022-10-29
Tags
Deep Learning (in Machine learning)