
Add an implementation of forward-backward algorithm for HMM models #330

Open
yebai opened this issue Feb 7, 2022 · 6 comments

Comments

@yebai
Member

yebai commented Feb 7, 2022

Hidden Markov Models are quite common in time series analysis. Since they involve discrete variables, HMC is not always appropriate, although we can still apply HMC to the continuous parameters of an HMM model. It would be nice to add

  1. a customised HMM distribution that accepts transition_matrix, emit_parameters, and emit_distribution as inputs and returns a parameterised distribution;
  2. an implementation of the forward-backward algorithm, which computes the marginal likelihood of the data given transition_matrix and emit_parameters, with the latent states summed out (a minimal sketch of the forward recursion follows the references below).

References: https://mc-stan.org/docs/2_18/stan-users-guide/hmms-section.html
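
As a reference point, here is a minimal sketch of the forward recursion; the forward pass alone already yields the marginal likelihood of the data with the hidden states summed out. The helper name forward_logpdf and its argument layout are illustrative assumptions, not an existing API:

using Distributions, LogExpFunctions  # logpdf, logsumexp

function forward_logpdf(p0, A, dists, y)
    K = length(p0)
    # log α₁(k) = log p0(k) + log p(y₁ | state k)
    logα = log.(p0) .+ [logpdf(dists[k], y[1]) for k in 1:K]
    for t in 2:length(y)
        # log αₜ(k) = logsumexp over j of (log αₜ₋₁(j) + log A[j, k]) + log p(yₜ | state k)
        logα = [logsumexp(logα .+ log.(A[:, k])) + logpdf(dists[k], y[t]) for k in 1:K]
    end
    return logsumexp(logα)  # log p(y | A, emission parameters)
end

# e.g. forward_logpdf([0.5, 0.5], [0.9 0.1; 0.2 0.8],
#                     [Normal(0, 1), Normal(3, 1)], randn(100))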

@yebai
Member Author

yebai commented Feb 9, 2022

Maybe we can make use of https://github.com/maxmouchet/HMMBase.jl. If we can, then this is mostly a matter of writing a new tutorial on Bayesian HMMs with marginalised latent states.

@yebai yebai transferred this issue from TuringLang/Turing.jl Nov 13, 2022
@rossviljoen

This library was just released: https://github.com/probml/dynamax. It might provide some inspiration.

@JasonPekos
Member

This is now possible with the newly released https://github.com/gdalle/HiddenMarkovModels.jl.

Something like:

using Turing, HiddenMarkovModels
using Distributions, LinearAlgebra, LogExpFunctions  # MvNormal, I, softmax

@model function example_hmm_marginalized(N, K, y)
    # Priors on the emission means and on each row of the transition matrix.
    mu ~ MvNormal([3, 10], I)
    theta1 ~ Dirichlet(softmax(ones(K)))
    theta2 ~ Dirichlet(softmax(ones(K)))
    θ = vcat(theta1', theta2')  # K × K transition matrix; rows sum to 1

    # Uniform initial distribution and unit-variance Gaussian emissions.
    hmm = HMM(softmax(ones(K)), θ, [Normal(mu[1], 1), Normal(mu[2], 1)])
    _, filtered_likelihood = forward(hmm, y)  # log p(y | mu, θ), latent states marginalised

    Turing.@addlogprob! only(filtered_likelihood)
end
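
Since the forward pass marginalises the discrete states inside the model, gradient-based samplers such as NUTS apply directly. A hypothetical invocation, where the observations y and the sampler settings are placeholders:

y = randn(100)  # stand-in observations; replace with real data
chain = sample(example_hmm_marginalized(length(y), 2, y), NUTS(), 1000)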

Here is an example gist with some quick attempts at validating this against PosteriorDB reference draws; the results seem correct:
https://gist.github.com/JasonPekos/82be830e4bf390fd1cc2886a7518aede

@yebai
Member Author

yebai commented Apr 13, 2024

Thanks @JasonPekos — would you like to turn this example into a new tutorial?

@yebai
Member Author

yebai commented Apr 13, 2024

Cc @gdalle

@cgbotta cgbotta removed their assignment Apr 13, 2024
@JasonPekos
Member

Sure, I'll get around to it in a bit :)
