Implement wrapper for diagonally-constrained GMM HMMs #1658
Comments
I'm working on this now. :) #1666.

@KimSangYeon-DGU are you still working on this issue? If not, I can take it up :)

@prateeksingh0001

This issue has been automatically marked as stale because it has not had any recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions! 👍

I want to keep this open. :)

This issue has been automatically marked as stale because it has not had any recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions! 👍

I want to keep this open.

Agreed, let's add the keep-open label.

Since #1666 is merged, this is done. 👍
This is a follow-up to issue #1499. Specifically, there I wrote

But I was too optimistic, and I don't think I have time to do this. However, I thought it might be a nice task for someone looking to contribute, so I will try to detail the steps below. Overall, our goal is to make it easy for users to train Hidden Markov Models with Gaussian Mixture Model emissions where each Gaussian has the constraint that its covariance matrix is diagonal. Right now, we provide (through the bindings) `discrete`, `gaussian`, and `gmm`, but `gmm` has a full covariance matrix, not a diagonal one. So you could see this proposed `diagonal_gmm` as something that lives "between" `gaussian` and `gmm` in terms of expressivity.

Make sure you're at least reasonably familiar with HMMs and GMMs. I'd suggest playing with the `mlpack_hmm_train`, `mlpack_hmm_viterbi`, `mlpack_hmm_loglik`, and `mlpack_hmm_generate` command-line programs (or the corresponding Python bindings) to see how a user might use these. Also familiarize yourself with the `mlpack::hmm::HMM` and `mlpack::gmm::GMM` classes. It's worth looking at the documentation in `src/mlpack/methods/hmm/hmm.hpp` and `src/mlpack/methods/gmm/gmm.hpp`, as well as the tests in `src/mlpack/tests/hmm_test.cpp` and `src/mlpack/tests/gmm_test.cpp`.

Note that we can train a Gaussian mixture model with a diagonal constraint with the following code:
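The code block here did not survive extraction; the following is a hedged reconstruction of what such a call looks like, assuming mlpack 3.x's `EMFit` fitting type and `DiagonalConstraint` covariance policy (the include paths, template parameter order, and `Train()` signature are my assumptions and may differ slightly from the headers):

```cpp
// Hedged sketch (assumed mlpack 3.x API): train a GMM whose covariances
// are constrained to be diagonal by passing a custom FittingType.
#include <mlpack/methods/gmm/gmm.hpp>
#include <mlpack/methods/gmm/diagonal_constraint.hpp>

using namespace mlpack::gmm;

int main()
{
  // Synthetic 5-dimensional observations, one column per point.
  arma::mat observations(5, 1000, arma::fill::randn);

  GMM gmm(3 /* gaussians */, 5 /* dimensionality */);

  // EMFit's second template parameter is the covariance constraint
  // policy; DiagonalConstraint forces each covariance matrix to stay
  // diagonal after each EM iteration.
  EMFit<mlpack::kmeans::KMeans<>, DiagonalConstraint> fitter;
  gmm.Train(observations, 1 /* trials */, false /* useExistingModel */, fitter);
}
```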
But that template parameter `FittingType` to `GMM::Train()` is not a class member. In addition, because of `EMFit::ArmadilloGMMWrapper()`, when `GMM::Train()` is called with `DiagonalConstraint`, we internally use Armadillo's `gmm_diag` support. We should write a new class `DiagonalGMM` that entirely wraps the `gmm_diag` class but provides the same API as the existing `GMM` class.

Once you have `DiagonalGMM`, write some tests for it in `src/mlpack/tests/gmm_test.cpp` to make sure that a GMM trained in this way has diagonal covariance. You might consider adapting `GMMTrainEMOneGaussian`.
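As a sketch of what the wrapper's surface might look like (the class name comes from this issue, but the member layout and method set below are illustrative, not the final design), `DiagonalGMM` would hold an `arma::gmm_diag` internally and mirror `GMM`'s public API:

```cpp
// Illustrative skeleton only: a DiagonalGMM that wraps Armadillo's
// gmm_diag but exposes the same interface as mlpack::gmm::GMM.
#include <armadillo>

class DiagonalGMM
{
 public:
  DiagonalGMM(const size_t gaussians, const size_t dimensionality);

  // Mirror GMM's training/query interface so HMM<DiagonalGMM> can use
  // it as a drop-in emission distribution.
  double Train(const arma::mat& observations, const size_t trials = 1);
  double Probability(const arma::vec& observation) const;
  arma::vec Random() const;  // sample an observation from the model

  size_t Gaussians() const;
  size_t Dimensionality() const;

  // Needed so HMM<DiagonalGMM> models can be saved and loaded.
  template<typename Archive>
  void serialize(Archive& ar, const unsigned int version);

 private:
  arma::gmm_diag model;  // Armadillo's diagonal-covariance GMM
};
```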
Now we can turn to the HMM implementation. Modify `src/mlpack/methods/hmm/hmm_model.hpp` so that it can also support `HMM<DiagonalGMM>`. One thing you will have to do is make sure that the `serialize()` function there is backwards-compatible; see `LSHSearch::serialize()` in `src/mlpack/methods/lsh/lsh_search_impl.hpp` to get an idea of how the second `const unsigned int version` parameter can be used to keep things backwards-compatible.

Adapt `hmm_train_main.cpp`, `hmm_viterbi_main.cpp`, `hmm_loglik_main.cpp`, and `hmm_generate_main.cpp` to also support an HMM type `diagonal_gmm`.
Adapt the tests in `src/mlpack/tests/main_tests/hmm_*_test.cpp` to also test the `diagonal_gmm` HMM type.

Using `mlpack_hmm_train` (or `hmm_train()` in Python), ensure that a diagonal GMM HMM trains faster than a regular GMM HMM on some example dataset. If it does, then this all behaves as expected and I think we are done. 👍
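For that timing comparison, a hedged sketch of the CLI invocations: the flag names (`--type`, `--states`, `--gaussians`, `-i`, `-M`) are my assumptions about `mlpack_hmm_train`'s options and `data.csv` is a placeholder observation file, so check `mlpack_hmm_train --help` for the real interface.

```shell
# Compare wall-clock training time: full-covariance vs. diagonal GMM HMM.
time mlpack_hmm_train -i data.csv --type gmm          --states 4 --gaussians 3 -M full.bin
time mlpack_hmm_train -i data.csv --type diagonal_gmm --states 4 --gaussians 3 -M diag.bin
```

If the `diagonal_gmm` run is noticeably faster at comparable log-likelihood, the wrapper is doing its job.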