If we use a binary step function, its gradient is always zero, so I think there should be no backpropagation to the metapath embedding vector.
However, the metapath vector is still updated, and I don't understand the flag "is_deepwalk".
What is it for?
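To illustrate why the questioner expects no backpropagation, here is a minimal NumPy sketch (not taken from the repository) showing that the numerical gradient of a binary step function is zero everywhere away from the threshold:

```python
import numpy as np

def binary_step(x):
    # Binary step activation: 1 if x >= 0, else 0.
    return (x >= 0).astype(float)

# Central-difference numerical gradient at points away from the threshold:
eps = 1e-4
xs = np.array([-2.0, -0.5, 0.5, 2.0])
grads = (binary_step(xs + eps) - binary_step(xs - eps)) / (2 * eps)
print(grads)  # all zeros, so no gradient signal can flow back
```

Because these gradients are identically zero, any parameter sitting behind the step function would never receive an update under plain backpropagation.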
To my knowledge, this implementation applies the gradient of the sigmoid function to the metapath vector regardless of which regularization function we choose.
Is that intended?
Hi, thanks for asking. Yes, I use the gradient of the sigmoid function to approximate the gradient of the step function.
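For readers unfamiliar with this trick, here is a minimal NumPy sketch of the surrogate-gradient idea the maintainer describes (function names are illustrative, not from the repository): the forward pass uses the hard binary step, while the backward pass substitutes the sigmoid derivative for the true (zero) derivative of the step:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step_forward(x):
    # Forward pass: hard binary step, outputs are exactly 0 or 1.
    return (x >= 0).astype(float)

def step_backward(x, grad_out):
    # Backward pass: use the sigmoid derivative
    # sigma'(x) = sigma(x) * (1 - sigma(x)) as a smooth surrogate
    # for the step function's zero derivative.
    s = sigmoid(x)
    return grad_out * s * (1.0 - s)

x = np.array([-1.0, 0.3, 2.0])
y = step_forward(x)                     # hard 0/1 activations
g = step_backward(x, np.ones_like(x))   # nonzero surrogate gradients
print(y, g)
```

With this substitution the metapath embedding vector still receives a nonzero update even though the forward activation is a hard step, which explains the update the questioner observed.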