Learn Before
Concept

Attention-Based Distillation

Attention maps summarize the neuron activations of a convolutional neural network, and different attention transfer mechanisms are defined for distilling knowledge from the teacher network to the student. The main task is defining attention maps over the feature embeddings at the network's layers, because the embedded knowledge is transferred through these attention map functions.
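A minimal NumPy sketch of this idea, in the style of activation-based attention transfer: here the attention map for a layer is assumed to be the channel-wise sum of squared activations, L2-normalized, and the distillation loss is the squared distance between teacher and student maps over matched layers. Function names and tensor shapes are illustrative, not from a specific library.

```python
import numpy as np

def attention_map(features):
    """Map (C, H, W) layer activations to a normalized spatial attention vector.

    The map is the sum of squared activations across channels, so it
    highlights spatial positions where many neurons fire strongly.
    """
    amap = np.sum(features ** 2, axis=0)          # (H, W) spatial map
    v = amap.flatten()
    return v / (np.linalg.norm(v) + 1e-8)         # normalize for comparability

def attention_transfer_loss(teacher_feats, student_feats):
    """Sum of squared differences between teacher and student attention maps.

    Each element of the input lists holds one layer's activations; the
    channel counts may differ, but the spatial sizes must match.
    """
    return sum(
        np.sum((attention_map(t) - attention_map(s)) ** 2)
        for t, s in zip(teacher_feats, student_feats)
    )

# Toy example: a wide teacher layer and a narrow student layer,
# same spatial resolution, random activations.
rng = np.random.default_rng(0)
teacher = [rng.standard_normal((64, 8, 8))]
student = [rng.standard_normal((16, 8, 8))]
loss = attention_transfer_loss(teacher, student)
```

Note that the loss depends only on the spatial attention maps, so the student can have far fewer channels than the teacher; minimizing it during training pushes the student to attend to the same spatial regions.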


Updated 2022-10-29

Tags

Deep Learning (in Machine Learning)