Learn Before
Concept

Adversarial Generator

An adversarial generator is trained to produce synthetic data, which is either added to or used as the training set. The distillation loss is $L_{KD} = L_G(F_t(G(z)), F_s(G(z)))$, where:

  • $F_t(.)$ and $F_s(.)$ are the teacher and student model outputs
  • $G(z)$ are the training samples generated by generator $G$ from a random input vector $z$
  • $L_G$ is a distillation loss that forces the student's predicted probability distribution to match the teacher's, e.g. cross-entropy or Kullback-Leibler divergence loss.
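The loss above can be sketched in PyTorch. This is a minimal illustration, not a full training loop: the generator and the teacher/student networks are stand-in linear layers (in practice they would be a trained generator and pre-trained classifiers), and the KL divergence plays the role of $L_G$. In adversarial data-free distillation, the generator is typically updated to maximize this loss while the student minimizes it.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in networks for illustration only; real F_t and F_s would be
# pre-trained teacher and student classifiers, and G a trained generator.
latent_dim, feature_dim, num_classes = 8, 16, 4
G = nn.Linear(latent_dim, feature_dim)    # generator G: z -> synthetic sample
F_t = nn.Linear(feature_dim, num_classes) # teacher F_t (frozen in practice)
F_s = nn.Linear(feature_dim, num_classes) # student F_s (being trained)

z = torch.randn(32, latent_dim)           # random input vector z
x = G(z)                                  # synthetic training samples G(z)

# L_G as KL divergence between the teacher's and student's predicted
# distributions on the generated samples: L_KD = L_G(F_t(G(z)), F_s(G(z)))
log_p_s = F.log_softmax(F_s(x), dim=1)
p_t = F.softmax(F_t(x), dim=1)
L_KD = F.kl_div(log_p_s, p_t, reduction="batchmean")
```

The student would backpropagate to minimize `L_KD`; the adversarial generator would take a gradient step in the opposite direction, seeking samples on which teacher and student disagree most.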

Updated 2022-10-29

Tags

Deep Learning (in Machine learning)

Data Science