Concept
Cross-Modal Distillation
Transferring knowledge between modalities matters because data or labels are not available for every modality during training or testing. A typical example is transferring the knowledge of a teacher model pre-trained on one modality to a student that receives a new, unlabeled input modality, using pairwise sample registration to align examples across the two modalities. The features produced by the teacher then serve as supervision for training the student. This approach performs well on visual recognition tasks in cross-modal scenarios.
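Below is a minimal PyTorch sketch of this feature-matching setup, not a specific published implementation. The encoder architecture, the RGB-to-depth modality pair, the `train_step` helper, and all hyperparameters are illustrative assumptions: a frozen teacher pre-trained on RGB supplies target features, and the student learns to reproduce them from registered depth images.

```python
# Minimal sketch of cross-modal distillation: the networks, modality pair,
# and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Small conv encoder producing a feature vector; stands in for any backbone."""
    def __init__(self, in_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

# Teacher: assumed pre-trained on a labeled modality (e.g., RGB); frozen here.
teacher = Encoder(in_channels=3).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Student: receives the new, unlabeled modality (e.g., single-channel depth).
student = Encoder(in_channels=1)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # regress student features onto teacher features

def train_step(rgb, depth):
    """One update on a registered (rgb, depth) pair of the same scene."""
    with torch.no_grad():
        target = teacher(rgb)      # supervision comes from the teacher modality
    pred = student(depth)          # student processes the new modality
    loss = loss_fn(pred, target)   # match features across modalities
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Random tensors stand in for one registered cross-modal batch:
rgb = torch.randn(8, 3, 64, 64)
depth = torch.randn(8, 1, 64, 64)
print(train_step(rgb, depth))
```

The design choice worth noting: because supervision is the teacher's feature vector rather than a class label, the student can be trained on the new modality with no annotations at all, only registered sample pairs.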
Updated 2022-10-29
Tags
Deep Learning (in Machine learning)