How does HMM reduce to GMM? #2
Hi,

Let's look at equation (1), which describes the transition probability. If we set […]

Just to summarize without any math: after setting […]

I hope this explanation helps. If not, feel free to ask further questions and I will try to write down all of the equations and prove it rigorously.

Have a nice day!
Dominik
Ok, now I understand the probability […] Or do you actually mean that the model […]?
Hi @domklement, would you give some comments on my reply?
I apologize for not getting back to you sooner; I had quite a busy schedule the past few days. I'm going to write the math down today and will reply ASAP.

Dominik
Hi @alephpi, I thought of commenting because maybe this is the missing link. The distributions in the states of the HMM are Gaussians. Now, if the states are independent of each other, then each observation is determined by the categorical distribution over states together with that state's Gaussian, so you end up with exactly a Gaussian mixture model. The equations will probably clarify this, but I thought this might help.
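To make the "independent states" point concrete, here is a minimal numpy sketch (all parameters are made up for illustration, not taken from the paper). It builds an HMM whose transition matrix has every row equal to the initial distribution, so the next state ignores the current one; sampling from it is then indistinguishable from drawing i.i.d. points from the corresponding GMM:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state HMM with 1-D Gaussian emissions.
# pi: initial state distribution; A: transition matrix whose rows
# are all equal to pi, i.e. the next state ignores the current one.
pi = np.array([0.3, 0.7])
A = np.tile(pi, (2, 1))           # every row == pi
means = np.array([-2.0, 3.0])
stds = np.array([1.0, 0.5])

def sample_hmm(T):
    """Sample a length-T observation sequence from the HMM."""
    x = np.empty(T)
    s = rng.choice(2, p=pi)
    for t in range(T):
        x[t] = rng.normal(means[s], stds[s])
        s = rng.choice(2, p=A[s])  # identical rows -> s is drawn i.i.d. from pi
    return x

def sample_gmm(T):
    """Sample T i.i.d. points from the corresponding 2-component GMM."""
    comps = rng.choice(2, p=pi, size=T)
    return rng.normal(means[comps], stds[comps])

# With identical transition rows the two samplers define the same
# distribution over observations; their sample means should agree
# and be close to 0.3 * (-2) + 0.7 * 3 = 1.5.
xh, xg = sample_hmm(100_000), sample_gmm(100_000)
print(xh.mean(), xg.mean())
```

The key line is `A = np.tile(pi, (2, 1))`: once the rows coincide, the state at time t+1 no longer depends on the state at time t, which is exactly the independence assumption above.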
In section 4.2.1 of the paper, you said […]

I just don't see how the HMM reduces to a GMM in this case. If the looping probability becomes 0, you still have non-zero transition probabilities and hence a temporal structure. How does this become a GMM, where no temporal structure exists?
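One way to check when the temporal structure actually disappears is to compare likelihoods. In this sketch (parameters again made up, and the reduction condition is that all transition rows equal the initial distribution, not merely a zero looping probability), the forward-algorithm likelihood of the HMM factorizes into the i.i.d. GMM likelihood:

```python
import numpy as np

# Hypothetical 2-state HMM with 1-D Gaussian emissions.
pi = np.array([0.4, 0.6])
A = np.tile(pi, (2, 1))           # transition rows all equal to pi
means, stds = np.array([0.0, 4.0]), np.array([1.0, 2.0])

def gauss(x, m, s):
    """Gaussian density N(x; m, s^2), vectorized over components."""
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def hmm_likelihood(x):
    """p(x_1..T) under the HMM via the forward algorithm."""
    alpha = pi * gauss(x[0], means, stds)
    for xt in x[1:]:
        # (alpha @ A)_j = pi_j * sum_i alpha_i when all rows of A equal pi,
        # so the recursion factorizes across time steps.
        alpha = (alpha @ A) * gauss(xt, means, stds)
    return alpha.sum()

def gmm_likelihood(x):
    """i.i.d. GMM likelihood: prod_t sum_k pi_k N(x_t; mu_k, sigma_k)."""
    per_point = (pi * gauss(x[:, None], means, stds)).sum(axis=1)
    return per_point.prod()

x = np.array([0.3, 5.1, -1.2, 3.8])
print(hmm_likelihood(x), gmm_likelihood(x))  # equal when rows of A == pi
```

With any other transition matrix the two values differ, which is the temporal dependence you are pointing at: zeroing only the self-loop probability is not enough for the reduction.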