Learn Before
The Coin Example of EM Algorithm
The coin example. Suppose we have two coins, A and B, each with its own unknown probability of landing heads. We also do not know how often A or B is picked for each round of tosses; we only observe the toss results. To estimate these parameters, we initialize the head probabilities of A and B randomly. Using the current estimates, we compute, for each round of tosses, the posterior probability that the round was produced by A or by B; this is the expectation (E) step. We then re-estimate the head probabilities of A and B by maximum likelihood, weighting each round by those posterior probabilities; this is the maximization (M) step. We repeat these two steps until the estimates converge.
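The two steps above can be sketched in a few lines of Python. This is a minimal sketch, not a definitive implementation: the toss counts are hypothetical illustrative data, and it assumes each round is a priori equally likely to use coin A or coin B, so the E step only compares binomial likelihoods.

```python
import numpy as np
from math import comb

# Hypothetical data: each entry is the number of heads observed in a
# round of 10 tosses made with one unknown coin (A or B).
n_tosses = 10
heads = np.array([5, 9, 8, 4, 7])

# Randomly chosen initial guesses for each coin's probability of heads.
theta_A, theta_B = 0.6, 0.5

def binom_pmf(h, n, p):
    """Probability of h heads in n tosses of a coin with head probability p."""
    return comb(n, h) * p**h * (1 - p) ** (n - h)

for _ in range(50):
    # E step: posterior probability that each round used coin A (or B),
    # given the current estimates theta_A and theta_B.
    like_A = np.array([binom_pmf(h, n_tosses, theta_A) for h in heads])
    like_B = np.array([binom_pmf(h, n_tosses, theta_B) for h in heads])
    w_A = like_A / (like_A + like_B)
    w_B = 1.0 - w_A
    # M step: maximum-likelihood re-estimate of each coin's head
    # probability, weighting each round by its posterior responsibility.
    theta_A = (w_A * heads).sum() / (w_A * n_tosses).sum()
    theta_B = (w_B * heads).sum() / (w_B * n_tosses).sum()

print(f"theta_A = {theta_A:.2f}, theta_B = {theta_B:.2f}")
```

Because the posteriors in the E step are soft weights rather than hard assignments, every round contributes to both updates in the M step, and the estimates move smoothly toward a fixed point instead of oscillating.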
Tags
Data Science
Related
Application Scenarios of Using EM Algorithm
Jensen's Inequality
Why is it hard to approximate latent variables?
The relationship between the EM algorithm and Jensen's Inequality
The Coin Example of EM Algorithm
The student example of EM algorithm
Convergence of EM Algorithm
Global Optimum of EM Algorithm
A Helpful Presentation Explaining Mathematical Details and Applications of the EM Algorithm Provided by Berkeley
A Coordinate Ascent View of Understanding EM Algorithm
E Step
M step