
How to calculate the gradient of metapath embedding? #15

Closed
sh0416 opened this issue Mar 8, 2019 · 2 comments
sh0416 commented Mar 8, 2019

If we use binary step function, the gradient of step function is always zero.

I think there is no backpropagation for the metapath embedding vector.

However, the metapath vector is still updated, and I don't understand the flag "is_deepwalk".

What is that?
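For context, the zero-gradient issue can be checked numerically. A minimal sketch (plain NumPy, function names illustrative, not from the repository):

```python
import numpy as np

def binary_step(x):
    # Hard threshold: 1 if x >= 0, else 0
    return (x >= 0).astype(float)

# Numerical derivative of the step function away from the threshold:
# for any x != 0 the difference quotient is exactly zero, so no
# gradient can flow back to the metapath embedding through this function.
x = np.array([-2.0, -0.5, 0.5, 2.0])
eps = 1e-4
num_grad = (binary_step(x + eps) - binary_step(x - eps)) / (2 * eps)
print(num_grad)  # [0. 0. 0. 0.]
```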

sh0416 commented Mar 8, 2019

To my knowledge, this implementation applies the gradient of the sigmoid function to the metapath vector regardless of which regularization function we choose.

Is it intended?

csiesheep (Owner) commented
Hi, thanks for asking. Yes, I use the gradient of the sigmoid function to approximate the gradient of the step function.
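This surrogate-gradient idea can be sketched as follows: the forward pass uses the hard step, while the backward pass substitutes the sigmoid's derivative s(x)·(1 − s(x)) so a nonzero update still reaches the metapath embedding. A minimal NumPy sketch (names are illustrative, not the repository's actual code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def binary_step(x):
    # Forward pass: hard threshold, as applied to the metapath vector
    return (x >= 0).astype(float)

# Backward pass: the step's true derivative is zero almost everywhere,
# so we substitute the sigmoid's derivative as a smooth surrogate.
x = np.array([-1.0, 0.5, 2.0])          # pre-activation values (example)
upstream_grad = np.array([0.3, -0.2, 0.1])  # gradient arriving from the loss

s = sigmoid(x)
surrogate_grad = upstream_grad * s * (1.0 - s)  # nonzero update for x
```

The sigmoid derivative peaks at x = 0 and decays smoothly, so inputs near the threshold receive the largest updates, which is the usual motivation for this approximation.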
