
HMM initialization: don't use equal initial probabilities. #828

Merged
merged 2 commits into mlpack:master from rcurtin:hmm-initialization on Dec 22, 2016

rcurtin commented Dec 12, 2016

Equal initial probabilities can cause training to fail sometimes. Instead, optimization seems to perform better when using random initialization.

This is a fix that came out of some debugging in the IRC channel.
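For context, here is a minimal sketch of the idea using Armadillo (mlpack's linear algebra library); the state count and variable names are illustrative and this is not mlpack's actual HMM code:

```cpp
// Sketch: instead of giving every hidden state the same initial
// probability 1 / n, draw random values and normalize them into a
// valid probability distribution.
#include <armadillo>

int main()
{
  const size_t states = 4; // illustrative number of hidden states

  // Old behavior: equal initial probabilities (1 / states per state).
  arma::vec equalInitial(states);
  equalInitial.fill(1.0 / states);

  // New behavior: random initial probabilities, normalized to sum to 1.
  arma::vec randomInitial(states, arma::fill::randu);
  randomInitial /= arma::accu(randomInitial);

  equalInitial.print("equal initialization:");
  randomInitial.print("random initialization:");

  return 0;
}
```

Normalizing the random draws keeps the vector a valid distribution while breaking the symmetry that an all-equal starting point can impose on EM-style training.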

rcurtin added some commits Dec 12, 2016

Don't use equal initial probabilities.
That can cause training to fail sometimes.  Instead, optimization seems to
perform better when using random initialization.

@rcurtin rcurtin merged commit c76f5ac into mlpack:master Dec 22, 2016

1 check failed

continuous-integration/appveyor/pr AppVeyor was unable to build non-mergeable pull request

@rcurtin rcurtin deleted the rcurtin:hmm-initialization branch Dec 22, 2016

@rcurtin rcurtin added this to the mlpack 2.2.0 milestone Mar 17, 2017
