
Transfer Learning #20

Closed
DuanhaoranCC opened this issue Nov 19, 2022 · 1 comment
Comments

@DuanhaoranCC

Hi and thanks for your work!
What is the difference between GraphMAE and the baseline model AttrMask in transfer learning?

@THINK2TRY
Contributor

THINK2TRY commented Nov 20, 2022

@DuanhaoranCC Thanks for your attention. There are two differences between GraphMAE and AttrMask:

  1. GraphMAE uses a GNN as the decoder, while AttrMask uses a linear layer for decoding.
  2. GraphMAE uses the scaled cosine error (SCE) as the reconstruction criterion, rather than the cross-entropy loss used in AttrMask.

We haven't conducted rigorous ablation studies to measure the individual contribution of each component, but the overall training paradigm of GraphMAE achieves better results than AttrMask.
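
For concreteness, here is a minimal PyTorch sketch of the scaled cosine error described in the GraphMAE paper. The function name `sce_loss` and the value `gamma=2.0` are illustrative choices, not the exact code in this repository:

```python
import torch
import torch.nn.functional as F

def sce_loss(x_rec: torch.Tensor, x_orig: torch.Tensor, gamma: float = 2.0) -> torch.Tensor:
    """Scaled cosine error over masked node features.

    x_rec, x_orig: [num_masked_nodes, feat_dim] reconstructed and original features.
    gamma: scaling exponent (>= 1); larger values down-weight easy, well-reconstructed nodes.
    """
    x_rec = F.normalize(x_rec, p=2, dim=-1)
    x_orig = F.normalize(x_orig, p=2, dim=-1)
    cos_sim = (x_rec * x_orig).sum(dim=-1)        # per-node cosine similarity
    return ((1.0 - cos_sim) ** gamma).mean()      # scaled cosine error

# Usage: compute the loss only on the masked nodes' features.
# loss = sce_loss(decoder_output[mask], node_features[mask])
```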
