Concept

Basic Monte Carlo Sampling

When a sum or integral cannot be computed directly, we can use Monte Carlo sampling to approximate the expectation of a random variable. For example, if $s$ is defined as

$$s = \sum_{x} p(x) f(x) = \mathbb{E}_p[f(x)]$$

or

$$s = \int p(x) f(x)\,dx = \mathbb{E}_p[f(x)],$$

we can approximate $s$ by drawing $n$ independent samples $x^{(1)}, \dots, x^{(n)}$ from $p$ and computing the estimator

$$\hat{s}_n = \frac{1}{n}\sum_{i=1}^{n} f(x^{(i)})$$

This estimator is unbiased:

$$\mathbb{E}[\hat{s}_n] = \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[f(x^{(i)})] = \frac{1}{n}\sum_{i=1}^{n} s = s$$

Similarly, we can compute the variance of the estimator:

$$\mathrm{Var}[\hat{s}_n] = \frac{1}{n^2}\sum_{i=1}^{n}\mathrm{Var}[f(x^{(i)})] = \frac{\mathrm{Var}[f(x)]}{n}$$

By the Central Limit Theorem, the distribution of $\hat{s}_n$ converges to a Normal distribution with mean $s$ and variance $\frac{\mathrm{Var}[f(x)]}{n}$, so the estimate improves at a rate of $\frac{1}{\sqrt{n}}$ as more samples are drawn.
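The derivation above can be sketched in a few lines of Python. This is an illustrative example (the function name `mc_estimate` and the choice of $f(x) = x^2$ with $x \sim \mathcal{N}(0, 1)$, for which the true expectation is 1, are my own):

```python
import random
import statistics

def mc_estimate(f, sampler, n):
    """Monte Carlo estimator: s_hat = (1/n) * sum of f(x_i), x_i ~ p."""
    samples = [f(sampler()) for _ in range(n)]
    mean = statistics.fmean(samples)
    # Estimated variance of the estimator itself: Var[f(x)] / n,
    # using the sample variance of f(x) as a plug-in for Var[f(x)].
    var_of_estimator = statistics.variance(samples) / n
    return mean, var_of_estimator

random.seed(0)
# Example: f(x) = x^2 with x ~ N(0, 1), so s = E[x^2] = 1
est, var_est = mc_estimate(lambda x: x * x,
                           lambda: random.gauss(0.0, 1.0),
                           100_000)
```

With $n = 100{,}000$ samples the standard error $\sqrt{\mathrm{Var}[f(x)]/n}$ is small, so `est` should land close to the true value 1, consistent with the $\frac{1}{\sqrt{n}}$ convergence rate above.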


Updated 2021-07-15

Tags

Data Science