Relation

Example of Biased Estimators

For a sample of size $m$ from a normally distributed random variable with parameters $\mu$ and $\sigma$, the estimator $\hat{\sigma}'^2_m = \frac{1}{m}\sum_{i=1}^m \left(x^{(i)} - \hat{\mu}_m\right)^2$ (the sum of squared deviations from the sample mean, divided by the number of samples) is a biased estimator of the variance $\sigma^2$, i.e. $\mathbb{E}[\hat{\sigma}'^2_m] \neq \sigma^2$.

To see why, note that the sample variance $\hat{\sigma}^2_m = \frac{1}{m-1}\sum_{i=1}^m \left(x^{(i)} - \hat{\mu}_m\right)^2$ is an unbiased estimator of $\sigma^2$: $\mathrm{bias}(\hat{\sigma}^2_m) = \mathbb{E}\left[\frac{1}{m-1}\sum_{i=1}^m \left(x^{(i)} - \hat{\mu}_m\right)^2\right] - \sigma^2 = 0$, so $\mathbb{E}[\hat{\sigma}^2_m] = \sigma^2$. Since $\hat{\sigma}'^2_m = \frac{m-1}{m}\,\hat{\sigma}^2_m$, it follows that $\mathbb{E}[\hat{\sigma}'^2_m] = \frac{m-1}{m}\,\sigma^2 \neq \sigma^2$, because $m - 1 \neq m$. The $\frac{1}{m}$ estimator therefore systematically underestimates the variance.
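The bias can be checked empirically: averaging each estimator over many repeated samples, the $\frac{1}{m}$ version should converge to $\frac{m-1}{m}\sigma^2$ while the $\frac{1}{m-1}$ version converges to $\sigma^2$. A minimal sketch using NumPy (the specific values of $\mu$, $\sigma$, and $m$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, m, trials = 0.0, 2.0, 5, 200_000

# Draw `trials` independent samples of size m from N(mu, sigma^2).
samples = rng.normal(mu, sigma, size=(trials, m))

# ddof=0 divides by m (biased); ddof=1 divides by m - 1 (unbiased).
biased_mean = samples.var(axis=1, ddof=0).mean()
unbiased_mean = samples.var(axis=1, ddof=1).mean()

print(biased_mean)    # close to (m - 1)/m * sigma**2 = 3.2
print(unbiased_mean)  # close to sigma**2 = 4.0
```

With $m = 5$ and $\sigma^2 = 4$, the biased estimator averages to roughly $\frac{4}{5} \cdot 4 = 3.2$, matching the $\frac{m-1}{m}\sigma^2$ prediction.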


Updated 2021-05-24

Tags

Data Science