It's entirely possible that this issue stems from my lack of understanding of this model, or of HDP/LDA in general.
I've written a method that cycles through different hyperparameter values and trains the model so that you can see how the output changes with different values.
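For context, a sweep like the one described could look like the following minimal sketch. This is not the actual method from this report; `train_hdp` is a hypothetical placeholder for whatever call trains the HDP model with the given hyperparameters.

```python
import itertools

def train_hdp(alpha, eta, gamma):
    # Hypothetical placeholder for the real training call.
    # It simply echoes the hyperparameters so the sweep runs standalone;
    # the real version would train a model and return its fitted values.
    return {"alpha": alpha, "eta": eta, "gamma": gamma}

def sweep():
    # Cycle through candidate hyperparameter values (powers of ten)
    # and train one model per combination.
    alphas = [10 ** e for e in (-4, -2, 0)]
    etas = [10 ** e for e in (-1, 0)]
    gammas = [10 ** 0]
    results = []
    for alpha, eta, gamma in itertools.product(alphas, etas, gammas):
        results.append(train_hdp(alpha, eta, gamma))
    return results
```

The point of the sweep is to compare the trained models side by side, which is why it matters whether the library mutates the hyperparameters you passed in.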
I just ran the method in PyCharm, and I saw some strange behavior: I input alpha as 10 ** -4 (0.0001), eta as 10 ** -1 (0.1), and gamma as 10 ** 0 (1). However, once the model was trained, I got the following values:
hdp.alpha = 7.38756571081467e-05
hdp.eta = 0.10000000149011612
hdp.gamma = 3.130246162414551
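One detail worth noting: the reported `hdp.eta` is exactly what you get when 0.1 is stored as a 32-bit float and read back as a Python double, which suggests eta was never modified at all, while alpha and gamma genuinely changed (many HDP implementations resample or optimize the concentration parameters during training). A quick stdlib check of the float32 round-trip:

```python
import struct

def as_float32(x):
    """Round-trip a Python float through IEEE-754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

print(as_float32(0.1))  # 0.10000000149011612 -- matches the reported hdp.eta
```

So the eta "change" is just single-precision storage, not training.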
Is this normal? Should those values change once the model has been trained?