
Some questions about the backbone model #6

Open

bdy9527 opened this issue Jan 25, 2021 · 1 comment

bdy9527 commented Jan 25, 2021

I noticed that the backbone model used for the regularization-based GNNs is GCN, while GRAND seems to use a mixed-order propagation backbone. Is this a fair comparison? I wonder if GRAND benefits a lot from the larger receptive field.

wzfhaha (Collaborator) commented Jan 25, 2021

GRAND adopts several techniques to improve performance on this task. Mixed-order propagation is one component of GRAND; it can reduce over-smoothing for two reasons: 1) it focuses more on local information, and 2) it removes the non-linear transformations between layers. Employing this propagation to perform random data augmentation is also a contribution of this work. The results of the other regularization methods are taken directly from their original papers for convenience. I think combining other regularization methods (e.g., mixup) with this propagation rule is a promising research direction; you can give it a try if you are interested :).
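
For illustration, here is a minimal PyTorch sketch of the mixed-order propagation rule and the random DropNode-style augmentation described above. The function names and the precomputed normalized adjacency `adj_hat` are assumptions for this sketch, not code taken from this repository:

```python
import torch

def mixed_order_propagation(adj_hat, x, K=4):
    # Average the 0..K-hop propagated features:
    #   X_bar = 1/(K+1) * sum_{k=0}^{K} A_hat^k X,
    # where adj_hat is assumed to be the symmetrically normalized
    # adjacency matrix with self-loops. No non-linearity between hops.
    out = x
    acc = x.clone()
    for _ in range(K):
        out = torch.sparse.mm(adj_hat, out) if adj_hat.is_sparse else adj_hat @ out
        acc = acc + out
    return acc / (K + 1)

def drop_node(x, p=0.5, training=True):
    # DropNode-style random augmentation: zero out entire node feature
    # rows at random and rescale survivors by 1/(1-p) so the expected
    # feature values are unchanged.
    if not training:
        return x
    mask = (torch.rand(x.size(0), 1, device=x.device) >= p).float()
    return x * mask / (1.0 - p)

# Usage sketch: augment node features, then propagate before the classifier.
# x_aug = drop_node(x, p=0.5)
# x_bar = mixed_order_propagation(adj_hat, x_aug, K=4)
```

Because averaging over hops 0 through K keeps a large weight on the low-order (local) terms, the receptive field grows without the representation collapsing toward the dominant eigenvector, which is the over-smoothing failure mode.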
