There is a bug in hmm.py. If you run hmm.py directly, you get 16 hmm_models for each digit. The cause is that the line "model = hmm.GaussianHMM(n_components=4, covariance_type='diag', n_iter=1800)" sits inside the second (inner) loop, so a fresh model is created for every training sample. It should be moved to the first (outer) loop, so that only one model is created per digit.
Thanks very much for your corrections. In the first version of the code I only implemented DTW, and I forgot to update GMM and HMM after implementing them. I have fixed the bugs and uploaded the latest version of the code. Thank you again for your attention.
Hi, thanks for your reply, but it seems unreasonable to concatenate the MFCC features of multiple samples and train on them as if they were one sequence. In fact, you can pass the "lengths" parameter to "model.fit", which lets the model be updated from multiple samples at once.
Take the digit_0 as an example:
model.fit(X, lengths=X_lenth)
where X_lenth holds the number of frames in each training sample, e.g. [20 13 27 17 8 23 17 26 27 16 27 7 25 20 23 25]
First of all, thank you for open-sourcing this project.