Skip connections for Graph Attention Network #290
Answered by danielegrattarola
PietroNardelli asked this question in Q&A
Thanks for all your great work! I am trying to implement a GAT network for inductive learning (something similar to what is done for the PPI dataset in the original paper), and I was wondering whether skip connections are already implemented inside the GATConv layer. If not, where should the skip connections go? Thanks a lot!
Answered by danielegrattarola on Oct 4, 2021
Replies: 1 comment
Hi, there are no skip connections in the GAT layer, so you'll have to implement them manually. Something like:

```python
x, a = inputs
x_new = gat([x, a])
x_out = tf.concat([x_new, x], -1)
```

Cheers
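The pattern in the answer concatenates the layer's input features back onto its output along the last (feature) axis. A minimal shape-level sketch of that concatenation skip connection, using NumPy arrays as stand-ins for the node-feature tensors (the sizes `N`, `F`, and `F_out` and the random data are purely illustrative, not from the thread):

```python
import numpy as np

# Illustrative sizes: N nodes, F input features, F_out GAT output features.
N, F, F_out = 5, 16, 8

x = np.random.rand(N, F)          # input node features
x_new = np.random.rand(N, F_out)  # stand-in for the gat([x, a]) output

# Skip connection: concatenate output and input along the feature axis,
# mirroring tf.concat([x_new, x], -1) from the answer above.
x_out = np.concatenate([x_new, x], axis=-1)

print(x_out.shape)  # (5, 24): F_out + F features per node
```

Note that the concatenated output has `F_out + F` features per node, so any layer stacked after the skip connection must expect that wider feature dimension.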
Answer selected by PietroNardelli