
Alpha and gamma values not what are being fed in as arguments #71

Closed
jkaupanger opened this issue Aug 14, 2020 · 1 comment

Comments

@jkaupanger

It's entirely possible that this issue stems from my lack of understanding of this model or of HDP/LDA in general:

I've written a method that cycles through different hyperparameter values and trains the model so that you can see how the output changes with different values.

I just ran the method in PyCharm, and I saw some strange behavior: I input alpha as 10 ** -4 (0.0001), eta as 10 ** -1 (0.1), and gamma as 10 ** 0 (1). However, once the model was trained, I got the following values:

```
hdp.alpha = 7.38756571081467e-05
hdp.eta = 0.10000000149011612
hdp.gamma = 3.130246162414551
```

Is this normal? Should those values be changing once the model has trained?

@bab2min
Owner

bab2min commented Aug 15, 2020

Yes, the model automatically optimizes its alpha and gamma values during training.
If you want to keep these values fixed, set hdp.optim_interval = 0 before calling train.

@bab2min bab2min closed this as completed Dec 20, 2020