Knowledge Distillation Methods
Individual knowledge distillation can be implemented by directly distilling the teacher's per-sample soft targets into the student. The transferred knowledge carries feature information about the data samples as well as their mutual relations.
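As a minimal sketch of soft-target distillation, the snippet below computes the temperature-softened KL-divergence loss between a teacher's and a student's logits (the formulation from Hinton et al.'s soft-target distillation); it uses NumPy for illustration, and the function names and the temperature value are illustrative choices, not from the source.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; a higher T yields softer targets.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)   # teacher's per-sample soft targets
    q = softmax(student_logits, T)   # student's per-sample soft predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return (T ** 2) * kl.mean()

teacher = np.array([[2.0, 1.0, 0.1]])
student = np.array([[0.5, 1.5, 0.2]])
loss = distillation_loss(teacher, student)
```

When the student's logits match the teacher's exactly, the loss is zero; otherwise it is positive and shrinks as the student's softened distribution approaches the teacher's.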
Updated 2022-10-22
Tags
Deep Learning (in Machine learning)
Data Science