HMM initialization: don't use equal initial probabilities. #828

Merged

Merged 2 commits into mlpack:master from rcurtin:hmm-initialization on Dec 22, 2016

@rcurtin
Member
rcurtin commented Dec 12, 2016

Equal initial probabilities can sometimes cause training to fail, since the optimization may never break the symmetry between identically-initialized states. Instead, optimization seems to perform better when using random initialization.

This is a fix that came out of some debugging in the IRC channel.
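For reference, here is a minimal sketch of the idea (not mlpack's actual implementation; the function name is hypothetical): draw the initial state probabilities at random and normalize them, instead of setting each one to 1/numStates.

```cpp
// Hypothetical sketch using Armadillo, which mlpack builds on.
#include <armadillo>

// Build a random (but valid) initial state distribution: each entry is
// drawn uniformly from (0, 1), then the vector is normalized to sum to 1.
arma::vec RandomInitialProbabilities(const arma::uword numStates)
{
  arma::vec initial = arma::randu<arma::vec>(numStates);
  initial /= arma::accu(initial);
  return initial;
}
```

Because the states no longer start out identical, the training updates can actually differentiate them.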

rcurtin added some commits Dec 12, 2016
@rcurtin rcurtin Don't use equal initial probabilities.
That can cause training to fail sometimes.  Instead, optimization seems to
perform better when using random initialization.
d000a5a
@rcurtin rcurtin Remove random initialization since it is done by default now.
c76f5ac
@rcurtin rcurtin merged commit c76f5ac into mlpack:master Dec 22, 2016

1 check failed: continuous-integration/appveyor/pr (AppVeyor was unable to build non-mergeable pull request)
@rcurtin rcurtin deleted the rcurtin:hmm-initialization branch Dec 22, 2016