Minus sign in front of leaky relu #16

Hello,
In your implementation of SpGAT, there is this line:

```python
edge_e = torch.exp(-self.leakyrelu(self.a.mm(edge_h).squeeze()))
```

However, I cannot understand why you added the minus sign in front of the leaky ReLU operation. Is that right?

Comments
I think it is wrong. There is no need to add this. What do you think?
Yes, it is wrong. I wanted to avoid numerical instability. Instead of the minus sign, you need to subtract the max value of the tensor, as in logsumexp.
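For concreteness, here is a minimal sketch of that max-subtraction trick (the standard logsumexp/softmax stabilization; the tensor and names below are illustrative, not from this repository):

```python
import torch

# Unnormalized attention logits. Large positive values overflow in
# float32: torch.exp(torch.tensor(89.)) is already inf.
logits = torch.tensor([80.0, 85.0, 90.0])

# Subtracting the max shifts every logit into (-inf, 0], so exp stays
# in (0, 1]. Softmax is shift-invariant, so the normalized attention
# weights are unchanged.
stable = torch.exp(logits - logits.max())
attention = stable / stable.sum()

print(attention)                     # no inf
print(torch.softmax(logits, dim=0))  # same values
```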
Hey @sh0416, sorry, but what do you mean by "subtract the max value of the tensor, as in logsumexp"?
Hi @sh0416, could you give a modified version of the code?
I read the code and concluded that just removing the minus sign will work.
Hi @sh0416! I am new to deep learning. I find that there will be inf values during torch.exp(self.leakyrelu(self.a.mm(edge_h).squeeze())) (without the minus sign). Do you think the minus sign is necessary for numerical stability? Or should we switch to another function?
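That inf is exactly the float32 overflow discussed above. As a hedged sketch (the shapes and values below are made up to force the overflow; leakyrelu, a, and edge_h merely stand in for the module's attributes), dropping the minus sign reproduces it, and shifting by the max fixes it:

```python
import torch

leakyrelu = torch.nn.LeakyReLU(0.2)

# Stand-ins for self.a and edge_h; values chosen to force overflow.
a = torch.ones(1, 4)
edge_h = torch.full((4, 3), 30.0)

scores = leakyrelu(a.mm(edge_h).squeeze())  # tensor([120., 120., 120.])

naive = torch.exp(scores)                  # exp(120.) overflows -> inf
stable = torch.exp(scores - scores.max())  # exp(0.) = 1., bounded

print(naive)   # tensor([inf, inf, inf])
print(stable)  # tensor([1., 1., 1.])
```

Since the same constant is subtracted from every score, it cancels when the exponentials are later normalized over each node's neighbors, so the resulting attention weights are unchanged.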