
Can hmmlearn (HMMGMM) be used for supervised learning? #109

Closed
andrey-vinokurov-1991 opened this issue Apr 20, 2016 · 5 comments

Comments


andrey-vinokurov-1991 commented Apr 20, 2016

I am interested in HMM/GMM. I would like to use it for supervised learning, i.e. I have a sequence and labels for the sequence. seqlearn only provides MultinomialHMM. How can I implement a GMM-HMM for supervised learning using hmmlearn and seqlearn?
I want to solve this task: http://cslu.ohsu.edu/~bedricks/courses/cs655/hw/hw4/hw4.html

@superbobry
Member

Hi, no, hmmlearn does not currently target supervised learning. Have a look at seqlearn instead. It does not implement a GMMHMM, but I think you can get around that by combining sklearn.mixture.GMM with seqlearn.hmm.MultinomialHMM (see the sketch at the end of this comment).

I've skimmed through the task you've linked and noticed some minor inaccuracies regarding the hmmlearn API. Specifically, hmmlearn does not have its own implementation of GMM; it re-uses the one from scikit-learn.
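
Roughly, the workaround could look like the sketch below: quantize the continuous observations with a GMM, one-hot encode the mixture-component indices, and feed those discrete features to seqlearn. This is only a minimal sketch, assuming seqlearn's MultinomialHMM.fit(X, y, lengths) interface; GaussianMixture is the current scikit-learn name for the old GMM class, and the names X, y and lengths are illustrative.

    import numpy as np
    from sklearn.mixture import GaussianMixture  # formerly sklearn.mixture.GMM
    from seqlearn.hmm import MultinomialHMM

    # X: (n_samples, n_features) continuous observations
    # y: (n_samples,) per-frame labels, lengths: length of each training sequence
    gmm = GaussianMixture(n_components=32).fit(X)   # vector-quantization step
    codes = gmm.predict(X)                          # discrete codebook index per frame
    X_discrete = np.eye(gmm.n_components)[codes]    # one-hot features for MultinomialHMM

    clf = MultinomialHMM()
    clf.fit(X_discrete, y, lengths)                 # supervised training on the labels
    y_pred = clf.predict(X_discrete, lengths)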

@luigivieira

Hi there @superbobry.

I found this discussion because I would like to use a GMMHMM in a supervised manner. I understand that I cannot use seqlearn because my observations are not discrete. Your suggestion of using sklearn.mixture.GMM seems very interesting, but I don't even know how to start: I am new to HMMs and GMMs, and the documentation of scikit-learn's GMM seems to indicate that it is fit in much the same way as in hmmlearn: in an unsupervised fashion (i.e. without providing labels for the samples).

Any help you could offer would be greatly appreciated. :)


pls331 commented Apr 4, 2017

@luigivieira have you figured out a way to do that? I was doing something similar (continuous observations as well), using sklearn's GMM and an HMM (with multivariate Gaussian distributions modeling the observations) in pomegranate. (But this GMM-HMM model does not work well in my case.)
Did you find any other possible solution to do this?

@luigivieira

@RoshanPAN Nope, I didn't find any alternative solution to do this. :(

@chananshgong

Hi @AWin9, I did something similar by feeding the model a pretrained GMM per state, the transition matrix, and the start probabilities. It would look something like:

        # empty init_params/params so that a later fit() will neither
        # re-initialize nor re-estimate the parameters set by hand below
        model = GMMHMM(n_mix=self.n_mix,
                       n_components=n_states,
                       init_params='',
                       params='')
        # starting probabilities
        model.startprob_ = start_prob
        # transition matrix
        model.transmat_ = prob_trans_mat
        # GMM parameters (one pretrained mixture per hidden state)
        model.gmms_ = gmm_per_state
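
One way to derive start_prob, prob_trans_mat and gmm_per_state from labeled data could be the sketch below. The names X, labels, n_states and n_mix are hypothetical; gmms_ is the attribute of the older hmmlearn GMMHMM API used above, which expects one scikit-learn mixture object per state, so whether GaussianMixture (the newer name for GMM) is a drop-in replacement depends on your hmmlearn version.

    import numpy as np
    from sklearn.mixture import GaussianMixture  # older scikit-learn called this GMM

    def supervised_params(X, labels, n_states, n_mix):
        # X: (n_samples, n_features) observations; labels: per-frame state ids in 0..n_states-1
        # start probabilities: empirical frequency of each state
        start_prob = np.bincount(labels, minlength=n_states).astype(float)
        start_prob /= start_prob.sum()

        # transition matrix: count label bigrams and normalize each row
        # (counts across the whole array; split per sequence if you have several)
        prob_trans_mat = np.zeros((n_states, n_states))
        for a, b in zip(labels[:-1], labels[1:]):
            prob_trans_mat[a, b] += 1
        prob_trans_mat /= prob_trans_mat.sum(axis=1, keepdims=True)

        # one mixture per state, fitted only on the frames labeled with that state
        gmm_per_state = [GaussianMixture(n_components=n_mix).fit(X[labels == s])
                         for s in range(n_states)]
        return start_prob, prob_trans_mat, gmm_per_state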
