Add an implementation of forward-backward algorithm for HMM models #330
Maybe we can make use of https://github.com/maxmouchet/HMMBase.jl. If we can use it, then this is mostly about writing a new tutorial on Bayesian HMMs with marginalised latent states.
This library was just released: https://github.com/probml/dynamax. It might provide some inspiration.
Possible with the newly released https://github.com/gdalle/HiddenMarkovModels.jl. Something like:

```julia
@model function example_hmm_marginalized(N, K, y)
    mu ~ MvNormal([3, 10], I)
    theta1 ~ Dirichlet(softmax(ones(K)))
    theta2 ~ Dirichlet(softmax(ones(K)))
    θ = vcat(theta1', theta2')
    hmm = HMM(softmax(ones(K)), θ, [Normal(mu[1], 1), Normal(mu[2], 1)])
    _, filtered_likelihood = forward(hmm, y)
    Turing.@addlogprob! only(filtered_likelihood)
end
```

Example gist with some quick attempts at validating this against PosteriorDB reference draws. Seems correct:
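For anyone trying the snippet above, here is a hedged sketch of the surrounding setup it seems to assume. The package imports and the NUTS call are standard Turing usage rather than part of the original comment, and passing `length(y)` for `N` is an assumption:

```julia
# Assumed setup for the snippet above (not from the original comment):
# HMM and forward come from HiddenMarkovModels.jl, I from LinearAlgebra,
# and softmax(ones(K)) is simply a uniform length-K probability vector.
using Turing, Distributions, HiddenMarkovModels, LinearAlgebra
using LogExpFunctions: softmax

# y: the observed sequence, assumed already defined.
# Because `forward` marginalises the discrete latent states, the remaining
# parameters (mu, theta1, theta2) are continuous, so NUTS/HMC applies directly.
K = 2   # two hidden states, matching the two Normal emissions in the model
chain = sample(example_hmm_marginalized(length(y), K, y), NUTS(), 1000)
```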
Thanks @JasonPekos — would you like to turn this example into a new tutorial?
Cc @gdalle
Sure, I'll get around to it in a bit :)
Hidden Markov Models are quite common in time series analysis. Since they involve discrete variables, HMC is not always appropriate, although we can still apply HMC for the continuous parameters of an HMM model. It would be nice to add:

- an HMM distribution that takes `transition_matrix`, `emit_parameters`, and `emit_distribution` as inputs, and returns a parameterised distribution;
- a forward-backward algorithm to infer `transition_matrix` and `emit_parameters` given some data.

References: https://mc-stan.org/docs/2_18/stan-users-guide/hmms-section.html
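To make the request a bit more concrete, here is a minimal sketch of what such an interface could look like. Everything in it (the `HMMDist` type, its field names, and `forward_logpdf`) is a hypothetical illustration, not an existing Turing or HMMBase API; it just implements the standard scaled forward recursion that marginalises the hidden states.

```julia
# Hypothetical sketch of the requested interface (all names are illustrative only).
using Distributions

struct HMMDist
    init::Vector{Float64}               # initial state probabilities
    transition_matrix::Matrix{Float64}  # transition_matrix[i, j] = P(z_t = j | z_{t-1} = i)
    emit_parameters::Vector             # one emission parameter (set) per state
    emit_distribution                   # e.g. θ -> Normal(θ, 1)
end

# Scaled forward algorithm: log p(y | parameters) with the latent states summed out.
function forward_logpdf(d::HMMDist, y::AbstractVector)
    K = length(d.init)
    emit = [d.emit_distribution(d.emit_parameters[k]) for k in 1:K]
    α = d.init .* [pdf(emit[k], y[1]) for k in 1:K]
    logp = log(sum(α)); α /= sum(α)             # normalise to avoid underflow
    for t in 2:length(y)
        α = (d.transition_matrix' * α) .* [pdf(emit[k], y[t]) for k in 1:K]
        logp += log(sum(α)); α /= sum(α)        # accumulate log p(y_t | y_{1:t-1})
    end
    return logp
end

# Example: a 2-state Gaussian HMM evaluated on a few observations.
d = HMMDist([0.5, 0.5], [0.9 0.1; 0.2 0.8], [3.0, 10.0], θ -> Normal(θ, 1))
forward_logpdf(d, [2.9, 3.2, 10.1, 9.8])
```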