Concept
Adversarial Generator
An adversarial generator is trained to produce synthetic data, which either augments or replaces the training set. The distillation loss is

$$\mathcal{L} = \mathcal{L}_{KD}\big(T(G(z)),\, S(G(z))\big)$$

where
- $T(\cdot)$ and $S(\cdot)$ are the teacher and student model outputs
- $G(z)$ is the training sample generated by generator $G$ from a random input vector $z$
- $\mathcal{L}_{KD}$ is a distillation loss that forces the student's predicted probability distribution to match the teacher's target distribution, e.g. cross-entropy or Kullback–Leibler divergence loss.
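As a minimal sketch of the loss above, the snippet below draws random vectors $z$, maps them through a toy generator to get synthetic samples $G(z)$, and computes a KL-divergence distillation loss between teacher and student predictions on those samples. All three "models" here are hypothetical single linear layers chosen for illustration, not an actual distillation setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits):
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical toy networks: each is a single random linear map.
W_g = rng.normal(size=(8, 16))   # generator G: z (8-d) -> sample x (16-d)
W_t = rng.normal(size=(16, 5))   # teacher T:  x -> logits over 5 classes
W_s = rng.normal(size=(16, 5))   # student S:  x -> logits over 5 classes

def kd_loss(z):
    x = np.tanh(z @ W_g)              # G(z): synthetic training samples
    p_teacher = softmax(x @ W_t)      # T(G(z)): target distribution
    p_student = softmax(x @ W_s)      # S(G(z)): predicted distribution
    # KL(teacher || student), averaged over the batch
    return np.mean(np.sum(p_teacher * np.log(p_teacher / p_student), axis=-1))

z = rng.normal(size=(32, 8))          # batch of random input vectors
loss = kd_loss(z)
```

In training, the student would be updated to minimize this loss while the adversarial generator is updated to maximize it, steering $G$ toward samples where teacher and student disagree most.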
Updated 2022-10-29
Tags
Deep Learning (in Machine learning)
Data Science