Results about unsupervised_TU experiments #10

Closed
lihy96 opened this issue Dec 30, 2020 · 4 comments


lihy96 commented Dec 30, 2020

Hi, @yyou1996

When I run the code of the unsupervised_TU experiments with a fixed random seed (e.g., 0), the outputs, including the loss, accuracy, etc., can be different every time.

What is your opinion on this issue? Thanks a lot!

@yyou1996
Collaborator

Hi @lihy96,

The seed configuration code snippet

def setup_seed(seed):
is the one I commonly use, and it works in my other code (exactly the same randomness each run), but it seems not to in unsupervised_TU. So there is probably some other library randomness we did not notice; one of my conjectures is that it is related to the torch_geometric lib. What is your opinion?
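
A typical version of such a snippet looks like the sketch below; the exact body in this repo may differ, and the cuDNN flags are an extra assumption rather than part of the original snippet:

```python
import random

import numpy as np
import torch


def setup_seed(seed):
    # Seed the usual sources of randomness.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # cuDNN can still select non-deterministic kernels unless these are set.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```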

To address this, we perform multiple runs and report the mean & std in our paper.
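
As a rough illustration of that protocol (the accuracy values below are hypothetical, not results from the paper):

```python
import numpy as np

# Hypothetical accuracies from several runs with different seeds.
accs = np.array([0.771, 0.768, 0.774, 0.769, 0.772])
print(f"mean = {accs.mean():.4f}, std = {accs.std():.4f}")
```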


lihy96 commented Dec 31, 2020

Hi, @yyou1996

Thanks for your reply. I ran the code (unsupervised_TU) after setting a fixed random seed and printed the loss. I find that the loss in the first few epochs is nearly identical across several runs, but it can be very different after some epochs, which finally leads to different accuracy at evaluation. Could “accumulated error” be a possible reason for this issue?
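
One way to test this is to force deterministic kernels and see whether the runs still diverge; a minimal sketch, assuming PyTorch >= 1.8 (older releases expose torch.set_deterministic instead), since GPU scatter/index-add operations used by torch_geometric are non-deterministic by default:

```python
import os

import torch

# Some CUDA >= 10.2 ops (e.g., in cuBLAS) need this variable to run
# deterministically; see the PyTorch reproducibility notes.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

# Raise an error whenever a non-deterministic kernel would be used,
# e.g., CUDA scatter_add in message passing.
torch.use_deterministic_algorithms(True)
```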

@yyou1996
Collaborator

I am not sure about that. If that is the issue, maybe training for a longer time can mitigate it?


lihy96 commented Jan 3, 2021

I will try it. Thank you for your suggestions.

lihy96 closed this as completed Jan 3, 2021