It seems the "top1" loss function does not work in this tf implementation? #3
Comments
Hi @IcyLiGit,
Weiping
@Songweiping RMSprop and Adam with cross-entropy loss and a softmax activation do work in your implementation. However, top1 and bpr only produce a result of 0.48 (not the 0.6 reported in the paper), and the loss value seems to decrease faster in TF. (Maybe caused by overfitting? But I cannot find the differences between the two implementations....)
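For reference, the TOP1 loss defined in the GRU4Rec paper is L = (1/N_S) * sum_j [ sigma(r_j - r_i) + sigma(r_j^2) ], where r_i is the score of the positive item and the r_j are the sampled negatives. Below is a minimal TF 1.x sketch of that formula, assuming the usual in-batch negative sampling where the logits form a batch_size x batch_size matrix with the positive scores on the diagonal; the function name and the masking of the diagonal are my own choices, not code taken from either repository, so it may differ in detail from the implementation being discussed:

```python
import tensorflow as tf

def top1_loss(logits):
    # Sketch of the TOP1 loss, assuming:
    # logits: [batch_size, batch_size] score matrix; the diagonal entry of
    # row i is the score of the positive (next) item for example i, and the
    # other entries of that row are in-batch negative samples.
    batch_size = tf.shape(logits)[0]
    positives = tf.diag_part(logits)                      # r_i per example
    diff = logits - tf.expand_dims(positives, axis=1)     # r_j - r_i
    pair_loss = tf.sigmoid(diff) + tf.sigmoid(tf.square(logits))
    # Exclude the diagonal: the positive item is not its own negative sample.
    mask = 1.0 - tf.eye(batch_size)
    per_example = tf.reduce_sum(pair_loss * mask, axis=1) \
        / tf.cast(batch_size - 1, tf.float32)
    return tf.reduce_mean(per_example)
```

Comparing a reference like this against the repository's top1 branch (e.g. whether the diagonal term is included and how the loss is averaged over the negatives) is one way to check whether the TF and Theano versions actually compute the same quantity.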
It seems that TF converges faster than Theano. So how about:
Weiping
I found a similar issue too, and it is not overfitting; I have checked the recall on the training data.
It seems the "top1" loss function does not work in this tf implementation?
I followed the parameter settings of the Theano implementation. However, I only get a result of 0.48 and 0.17. (The original is 0.59 and 0.23.)
I use tensorflow 1.2.
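For context on the reported numbers: 0.59 and 0.23 look like Recall@20 and MRR@20, the metrics used in the GRU4Rec paper. A small NumPy sketch of how these two metrics are typically computed (the function name and array shapes are assumptions, not code from either repository):

```python
import numpy as np

def recall_and_mrr_at_k(scores, targets, k=20):
    # scores:  [n_events, n_items] predicted scores for each test event
    # targets: [n_events] index of the true next item for each event
    target_scores = scores[np.arange(len(targets)), targets]
    # Rank of the target = number of items scored strictly higher, plus one.
    ranks = (scores > target_scores[:, None]).sum(axis=1) + 1
    hits = ranks <= k
    recall = hits.mean()                            # Recall@k
    mrr = np.where(hits, 1.0 / ranks, 0.0).mean()   # MRR@k
    return recall, mrr
```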