Implementation and formula problems about RSGAN. #184

Closed
iYiYaHa opened this issue Dec 20, 2019 · 6 comments

iYiYaHa commented Dec 20, 2019

Hi. I have some questions about both the implementation and several formulas in your RSGAN paper.

  1. You present the whole min-max loss of RSGAN in Eq. 9, yet in Eq. 8 the generator's loss is defined without the latter part of Eq. 9, log(sigmoid(x_uz - x_uj)), i.e., the term built on the score difference between generated items and negative items. I think that, to reach an equilibrium, the generator should optimize the whole loss of Eq. 9. Why do you design the generator loss without the latter part of Eq. 9? (My reading of the two equations is sketched right after this list.)
  2. In your implementation, the generator loss is defined without the log and is multiplied by 30. Is it implemented this way to avoid the very small gradients brought by the log function? And is this a so-called implementation trick also used by others?
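
For reference, this is how I read the two objectives. It is only a paraphrase of the notation above (x_ui, x_uz, and x_uj denote the predicted scores of the positive item, the item from a generated friend, and the negative item for user u), not a verbatim copy of the paper's equations:

```latex
% Eq. 9 as I understand it: the full min-max objective,
% with D maximizing and G minimizing both ranking terms
\min_G \max_D \sum_{(u,i,z,j)}
    \Big[ \log \sigma(x_{ui} - x_{uz}) + \log \sigma(x_{uz} - x_{uj}) \Big]

% Eq. 8 as I understand it: the generator only optimizes the first term,
% i.e., the gap between positive and generated items
\mathcal{L}_G = \sum_{(u,i,z)} \log \sigma(x_{ui} - x_{uz})
```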

Thanks for your time. Your replies always help a lot.

Coder-Yu commented Dec 20, 2019

  1. The aim of the generator is to select informative items consumed by reliable friends, so it only needs to narrow the gap between positive examples and generated examples. The latter part of Eq. 9 is trivial for the generator, and optimizing the whole loss would probably lead to a performance degradation.
  2. You are right. (A code sketch of both loss variants follows below.)
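
A minimal sketch of the two generator-loss variants discussed above, written with PyTorch purely for illustration; it is not the repository's actual code, and the tensors x_ui and x_uz (scores of the positive items and of the items consumed by generated friends) are placeholders:

```python
import torch

def g_loss_eq8(x_ui: torch.Tensor, x_uz: torch.Tensor) -> torch.Tensor:
    # Eq. 8 style, as described in this thread: the generator minimizes only
    # the positive-vs-generated ranking term, which narrows the gap between
    # positive and generated items.
    return torch.sum(torch.log(torch.sigmoid(x_ui - x_uz)))

def g_loss_no_log(x_ui: torch.Tensor, x_uz: torch.Tensor,
                  scale: float = 30.0) -> torch.Tensor:
    # Implementation variant from question 2: drop the log and rescale
    # (here by 30) so the gradients do not become too small.
    return scale * torch.sum(torch.sigmoid(x_ui - x_uz))
```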

iYiYaHa commented Dec 29, 2019

Hi, I have tried to run RSGAN. Could you please tell me what each line of the "*_n.txt" file used in readNegativeFeedbacks() contains? Is there an available preprocessing script that I can use to generate the "*_n.txt" file directly?

Coder-Yu commented Jan 2, 2020

You can refer to the experimental settings of the paper "Adaptive implicit friends identification over heterogeneous network for social recommendation" (IF-BPR).

iYiYaHa commented Jan 3, 2020

Thanks for your time.

For latecomers interested in the preprocessing of the negative items, I quote the following from the IF-BPR paper mentioned above (a small preprocessing sketch follows the quote):

For Epinions and Douban with a rating scale of 1 to 5, only the ratings of 4 and 5 are considered as the positive feedbacks and the ratings of 1 and 2 are considered as the negative feedbacks for the model training. For LastFM, the songs which were listened only for once by the current user are collected as the negative feedbacks.
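
A minimal preprocessing sketch based on the settings quoted above. The line format of the rating file and of the "*_n.txt" file consumed by readNegativeFeedbacks() is an assumption here (one "user item rating" triple per line), so adjust it to whatever the loader actually expects:

```python
def split_feedbacks(rating_path, pos_path, neg_path):
    # Ratings 4-5 go to the positive-feedback file, ratings 1-2 to the
    # negative-feedback file ("*_n.txt"); ratings of 3 are kept in neither,
    # following the IF-BPR settings quoted above.
    with open(rating_path) as src, \
         open(pos_path, "w") as pos, \
         open(neg_path, "w") as neg:
        for line in src:
            parts = line.split()
            if len(parts) < 3:
                continue  # skip malformed lines
            rating = float(parts[2])
            if rating >= 4:
                pos.write(line)
            elif rating <= 2:
                neg.write(line)

# Hypothetical usage (file names are placeholders):
# split_feedbacks("douban_ratings.txt", "douban.txt", "douban_n.txt")
```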

iYiYaHa closed this as completed Jan 3, 2020

Coder-Yu commented Jan 3, 2020

@walkerjg If you would like the datasets used in our paper, you can leave your email address and I will send you the datasets when I see the message.

iYiYaHa commented Jan 3, 2020

Thanks a lot. I have found the dataset URLs you mentioned in a previous issue.
