Learn Before
Concept

Self Distillation Methods

  • Knowledge from deeper sections of the network is distilled into its shallower sections.
  • Self-attention distillation method: Used for lane detection; the network uses attention maps from its own deeper layers as distillation targets for its shallower layers.
  • Snapshot distillation: A supervised training process in which knowledge from the network's earlier epochs (teacher) is transferred to its later epochs (student).
  • Self-knowledge distillation method: The training model's predicted probabilities are matched against its own feature representations so that data similarities in the feature embedding space are reflected.
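The snapshot-distillation idea above can be sketched as a loss function: a hard-label cross-entropy term plus a temperature-scaled KL term that pulls the student's current predictions toward a softmax snapshot of the same network from an earlier epoch. This is a minimal NumPy sketch; the function name, the temperature `T`, and the mixing weight `alpha` are illustrative choices, not names from the source.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def snapshot_distillation_loss(student_logits, snapshot_logits, labels,
                               T=2.0, alpha=0.5):
    """Hypothetical snapshot-distillation objective:
    alpha * CE(hard labels) + (1 - alpha) * T^2 * KL(snapshot || student)."""
    n = len(labels)
    # Hard-label cross-entropy on the current (student) predictions.
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(n), labels]).mean()
    # Soft targets come from an earlier-epoch snapshot of the same network.
    ps = softmax(student_logits, T)
    pt = softmax(snapshot_logits, T)
    kl = (pt * (np.log(pt) - np.log(ps))).sum(axis=-1).mean() * T * T
    return alpha * ce + (1 - alpha) * kl
```

When the current logits already match the snapshot, the KL term vanishes and only the supervised cross-entropy remains, so later epochs are regularized toward, but not pinned to, earlier ones.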


Updated 2022-10-29

Tags

Deep Learning (in Machine learning)
