Concept

Reducible and irreducible errors in a prediction problem

The error that we introduce when we estimate $f$ by $\hat{f}$ is called the "reducible" error, because it is possible to reduce this error by using more suitable statistical models to estimate $f$. Even if we could reduce the reducible error to zero, so that $\hat{Y}=f(X)$, we would still suffer from the irreducible error, because

$$Y=f(X)+\epsilon \implies Y=\hat{Y}+\epsilon$$

and $\epsilon$ is independent of $X$. So, regardless of the statistical techniques that we use in estimating $f$, we cannot reduce the irreducible error:

$$E(Y-\hat{Y})^2=E\left(f(X)+\epsilon-\hat{f}(X)\right)^2=\underbrace{\left(f(X)-\hat{f}(X)\right)^2}_{\text{Reducible}}+\underbrace{Var(\epsilon)}_{\text{Irreducible}}$$

where $E(Y-\hat{Y})^2$ is the expected value of the squared error in predicting $Y$ by $\hat{Y}$, and $Var(\epsilon)$ is the variance of $\epsilon$. The cross term $2\left(f(X)-\hat{f}(X)\right)E(\epsilon)$ drops out of the expansion because $\epsilon$ has mean zero and is independent of $X$.
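To make the decomposition concrete, here is a minimal simulation sketch (not part of the original concept; the linear true function $f$, the noise level, and the least-squares fit are all illustrative assumptions). It fits $\hat{f}$ from noisy data and shows that the test MSE is roughly the reducible part plus $Var(\epsilon)$, with $Var(\epsilon)$ acting as a floor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed true regression function f and noise: Y = f(X) + eps,
# with eps ~ N(0, 0.5^2), so Var(eps) = 0.25 is the irreducible error.
f = lambda x: 2.0 * x + 1.0
noise_sd = 0.5

# Simulate training data and estimate f with a least-squares line (f_hat).
x = rng.uniform(0.0, 1.0, 10_000)
y = f(x) + rng.normal(0.0, noise_sd, x.size)
slope, intercept = np.polyfit(x, y, deg=1)
f_hat = lambda x: slope * x + intercept

# Evaluate on fresh data drawn from the same model.
x_new = rng.uniform(0.0, 1.0, 10_000)
y_new = f(x_new) + rng.normal(0.0, noise_sd, x_new.size)

mse = np.mean((y_new - f_hat(x_new)) ** 2)           # E(Y - Y_hat)^2
reducible = np.mean((f(x_new) - f_hat(x_new)) ** 2)  # (f(X) - f_hat(X))^2, averaged over X
irreducible = noise_sd ** 2                          # Var(eps)

print(f"test MSE:             {mse:.4f}")
print(f"reducible part:       {reducible:.6f}")
print(f"irreducible Var(eps): {irreducible:.4f}")
# With a well-specified model, the reducible part is near zero and the
# MSE is close to Var(eps) = 0.25 -- the floor no estimate of f can beat.
```

Replacing the least-squares fit with a deliberately wrong model (for example, a constant) inflates only the reducible term; the $Var(\epsilon)$ floor stays the same.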


Updated 2020-02-22

Tags

Data Science