Learn Before
Concept
Distillation Loss
The distillation loss of relation-based knowledge, based on the relations of feature maps, is

$$L_{RelD}(F_t, F_s) = \sum_{(\hat{f}_{t_i}, \hat{f}_{t_j}) \in F_t} \sum_{(\hat{f}_{s_i}, \hat{f}_{s_j}) \in F_s} L_R\big(\psi_t(\hat{f}_{t_i}, \hat{f}_{t_j}),\ \psi_s(\hat{f}_{s_i}, \hat{f}_{s_j})\big)$$

where
- $F_t$ and $F_s$ are the sets of feature maps of the teacher and student models
- $(\hat{f}_{t_i}, \hat{f}_{t_j})$ are pairs of feature maps chosen from the teacher
- $(\hat{f}_{s_i}, \hat{f}_{s_j})$ are pairs of feature maps chosen from the student
- $\psi_t$ and $\psi_s$ are similarity functions for pairs of feature maps from the teacher and student models
- $L_R$ is the correlation function between the teacher and student feature maps
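The loss above can be sketched in code. This is a minimal illustration, assuming cosine similarity for $\psi_t$ and $\psi_s$ and squared error for $L_R$; the formulation itself leaves these choices open, and the function name is hypothetical.

```python
import numpy as np

def relation_distillation_loss(teacher_maps, student_maps):
    """Sketch of a relation-based distillation loss.

    teacher_maps, student_maps: lists of same-shape arrays (feature maps),
    aligned by index. Assumes cosine similarity as psi_t / psi_s and the
    squared difference as the correlation function L_R.
    """
    def cosine(a, b):
        a, b = a.ravel(), b.ravel()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    loss = 0.0
    n = len(teacher_maps)
    # Iterate over all ordered pairs (i, j) of feature maps, i != j.
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            psi_t = cosine(teacher_maps[i], teacher_maps[j])  # teacher relation
            psi_s = cosine(student_maps[i], student_maps[j])  # student relation
            loss += (psi_t - psi_s) ** 2  # L_R: squared difference
    return loss
```

The loss is zero when the student reproduces the teacher's pairwise relations exactly, and grows as the relational structure between its feature maps diverges from the teacher's.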
Updated 2022-10-22
Tags
Deep Learning (in Machine learning)
Data Science