Skip connections for Graph Attention Network #290

Hi,

there are no skip connections in the GAT layer, so you'll have to implement them manually.

Something like:

```python
x, a = inputs
x_new = gat([x, a])                # output of the GAT layer
x_out = tf.concat([x_new, x], -1)  # skip connection: concatenate input features
```
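To see the shape arithmetic of this concatenation-style skip connection without pulling in a deep learning framework, here is a minimal NumPy sketch. The `fake_gat` function is a hypothetical stand-in for the real attention layer (a neighbourhood aggregation followed by a linear transform); only the skip-connection pattern itself is the point:

```python
import numpy as np

def fake_gat(x, a, channels=8):
    """Hypothetical stand-in for a GAT layer: aggregate neighbour
    features via the adjacency matrix, then apply a linear transform
    to `channels` output features. The real layer would also compute
    attention coefficients."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal((x.shape[-1], channels))
    return a @ x @ w

n_nodes, in_feats = 5, 4
x = np.ones((n_nodes, in_feats))  # node features, shape (5, 4)
a = np.eye(n_nodes)               # adjacency matrix (identity, for the sketch)

x_new = fake_gat(x, a)            # transformed features, shape (5, 8)

# The skip connection: concatenate the layer's input features onto its
# output along the feature axis, so downstream layers see both.
x_out = np.concatenate([x_new, x], axis=-1)
print(x_out.shape)  # (5, 12)
```

Note that concatenation grows the feature dimension (8 + 4 = 12 here); an additive skip (`x_new + x`) would instead require the input and output feature sizes to match.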

Cheers

Answer selected by PietroNardelli