Gibbs Sampling

  • A variant of the Metropolis-Hastings algorithm
  • More general and more efficient than standard Metropolis: it needs fewer steps to get a good estimate of the posterior distribution
  • Uses adaptive proposals:
    • The proposed parameter values adjust intelligently, depending on the current values of the other parameters.
    • Uses conjugate pairs (prior distributions matched to likelihoods) to compute the adaptive proposals analytically.
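The idea above can be sketched with a minimal example. For a standard bivariate normal with correlation rho, each full conditional is itself a normal distribution, so Gibbs sampling alternates exact draws from p(x | y) and p(y | x); the conditional mean shifting with the current value of the other parameter is exactly the "adaptive proposal" behavior described. The function names below are illustrative, not from any particular library.

```python
import math
import random


def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    The full conditionals are known in closed form:
        x | y ~ Normal(rho * y, 1 - rho**2)
        y | x ~ Normal(rho * x, 1 - rho**2)
    so every step is an exact draw -- no accept/reject step is needed,
    unlike in generic Metropolis-Hastings.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho ** 2)  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)  # draw y from p(y | x)
        samples.append((x, y))
    return samples


samples = gibbs_bivariate_normal(rho=0.5, n_samples=5000)
mean_x = sum(s[0] for s in samples) / len(samples)  # should be near 0
```

Because each conditional draw is exact, every proposal is accepted, which is where the efficiency gain over plain Metropolis comes from when conjugate conditionals are available.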

Limitations:

  • Becomes very inefficient as models become more complex.
  • The chain gets stuck in regions of high correlation in the posterior. Complex models often have highly correlated parameters; these create a narrow ridge of high-probability combinations, and the chain can remain trapped in such regions for a long time.
  • Concentration of measure: any Markov chain approach that samples individual parameters in individual steps will get stuck once the number of parameters grows sufficiently large.
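The correlated-ridge limitation can be made concrete with a small experiment (a sketch, with illustrative function names): run a Gibbs sampler on a bivariate normal at mild and extreme correlations and compare the lag-1 autocorrelation of the resulting chain. High autocorrelation means successive samples barely move, i.e. slow mixing along the ridge.

```python
import math
import random


def gibbs_chain_x(rho, n, seed=1):
    """Bivariate-normal Gibbs sampler, returning only the x component
    of the chain so its mixing can be inspected."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho ** 2)
    x, y = 0.0, 0.0
    xs = []
    for _ in range(n):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        xs.append(x)
    return xs


def lag1_autocorr(xs):
    """Lag-1 autocorrelation of a chain: near 0 means fast mixing,
    near 1 means the chain is effectively stuck between steps."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((v - mean) ** 2 for v in xs) / n
    cov = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1)) / n
    return cov / var


mild = lag1_autocorr(gibbs_chain_x(rho=0.5, n=5000))
severe = lag1_autocorr(gibbs_chain_x(rho=0.99, n=5000))
# For this sampler the theoretical lag-1 autocorrelation is rho**2,
# so the strongly correlated posterior mixes far more slowly.
```

At rho = 0.99 nearly every sample repeats the last one, so many more iterations are needed for the same effective sample size, which is the inefficiency described above.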

Updated 2021-08-09

Tags

Bayesian Statistics

Statistics

Data Science