
Back-Propagation through Random Operations

  • One straightforward way to extend neural networks to implement stochastic transformations of x is to augment the network with extra inputs z that are sampled from some simple probability distribution, such as a uniform or Gaussian distribution.
  • Consider the operation of drawing a sample y from a Gaussian distribution with mean μ and variance σ²:
    y ∼ N(y; μ, σ²)    (20.54)
  • We can rewrite the sampling process as a transformation of an underlying random value z ∼ N(z; 0, 1) into a sample from the desired distribution:
    y = μ + σz    (20.55)
  • We can now back-propagate through the sampling operation by regarding it as a deterministic operation with the extra input z.
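The idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not from the source: the loss L(y) = y² and all variable names are chosen for the example. Because the sample y = μ + σz is a deterministic function of (μ, σ) once z is drawn, its derivatives with respect to μ and σ are well defined and the chain rule applies as usual.

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 2.0, 0.5

# Draw the underlying noise once; it is treated as a fixed extra input.
z = rng.standard_normal()

# Reparameterized sample (equation 20.55): y = mu + sigma * z,
# so that y ~ N(mu, sigma^2) as in equation 20.54.
y = mu + sigma * z

# Derivatives of the (now deterministic) sampling operation:
dy_dmu = 1.0      # ∂y/∂μ = 1
dy_dsigma = z     # ∂y/∂σ = z

# Example loss L(y) = y**2 (an assumption for illustration);
# back-propagate through the sample with the chain rule:
dL_dy = 2.0 * y
dL_dmu = dL_dy * dy_dmu
dL_dsigma = dL_dy * dy_dsigma
```

Averaging dL_dmu and dL_dsigma over many draws of z gives an unbiased Monte Carlo estimate of the gradient of E[L(y)] with respect to μ and σ, which is exactly what a stochastic-gradient training loop needs.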

Updated 2021-08-05

Tags

Data Science
