Information Criteria

This approach uses information criteria to estimate a model's expected out-of-sample score. It constructs a theoretical estimate of the relative out-of-sample KL divergence.

The best-known information criterion is the Akaike Information Criterion (AIC). It tells us that the dimensionality of the posterior distribution is a natural measure of the model's overfitting tendency.

However, AIC is an approximation only reliable when:

  • The priors are flat or overwhelmed by the likelihood
  • The posterior distribution is approximately multivariate Gaussian
  • The sample size (N) is much greater than the number of parameters (k)
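The idea above can be made concrete with the standard AIC formula, AIC = 2k − 2 ln L̂, where L̂ is the maximized likelihood and k the number of estimated parameters. Below is a minimal sketch, assuming Gaussian noise so the log-likelihood has a closed form; the function name and the polynomial-fit comparison are illustrative choices, not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = np.linspace(0, 1, n)
y = 2.0 * x + rng.normal(0, 0.5, n)  # data generated by a linear model

def gaussian_aic(y, y_hat, k):
    """AIC = 2k - 2 ln(L) for a model with Gaussian residuals.

    k counts all estimated parameters (regression coefficients
    plus the noise variance).
    """
    resid = y - y_hat
    sigma2 = np.mean(resid ** 2)  # MLE of the noise variance
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * k - 2 * log_lik

# Compare polynomial fits of increasing flexibility: the extra
# parameters of the higher-degree fits are penalized by +2 each.
for degree in (1, 3, 9):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    print(f"degree {degree}: AIC = {gaussian_aic(y, y_hat, degree + 2):.1f}")
```

Because the data are truly linear, the higher-degree fits gain little likelihood but pay the full 2-per-parameter penalty, so the degree-1 model typically has the lowest AIC. Note this sketch relies on exactly the conditions listed above (flat priors, a roughly Gaussian posterior, and N much larger than k).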


Updated 2021-07-27

Tags

Bayesian Statistics

Statistics

Data Science