
Out of Memory on Pubmed Dataset #3

Closed

Tiiiger opened this issue Jan 20, 2019 · 6 comments

Comments

@Tiiiger commented Jan 20, 2019

I tried to run the released execute.py on Pubmed. However, it appears to take 19.25 GB during backpropagation.

Is this the expected behavior? Is there a way to work around this and replicate the number reported in the paper?
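
(For anyone wanting to reproduce the measurement: a minimal sketch of checking peak GPU memory around a backward pass with standard PyTorch APIs. The tensors below are Pubmed-sized placeholders, not the repo's actual model.)

```python
import torch

# Hedged sketch: measure peak GPU memory around a backward pass.
# The tensors are placeholders sized like Pubmed (19717 nodes, 500 features);
# this is not DGI's model, just an illustration of how to take the reading.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(19717, 500, device=device, requires_grad=True)
w = torch.randn(500, 512, device=device, requires_grad=True)
loss = (x @ w).sum()
loss.backward()
if device == "cuda":
    peak_gib = torch.cuda.max_memory_allocated() / 1024**3
    print(f"peak GPU memory during backprop: {peak_gib:.2f} GiB")
```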

@PetarV- (Owner) commented Jan 20, 2019

Did you reduce the feature size to 256, as the paper reports?
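
For reference, a minimal sketch of that change, assuming the hyperparameters sit at the top of execute.py as module-level variables (the names below follow this repo's layout; double-check your copy):

```python
# execute.py -- relevant hyperparameters (names assumed from the repo's layout)
dataset = 'pubmed'   # default is 'cora'; point it at the Pubmed files in data/
hid_units = 256      # the paper uses 256 hidden units on Pubmed; the default
                     # 512 is what inflates activation memory during backprop
```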

@Tiiiger (Author) commented Jan 20, 2019

My mistake; I hadn't changed it to 256.

After changing the feature size to 256, the result I get on Pubmed is $78.47 \pm 0.66$, much higher than the number reported in the paper. All I did was copy the Pubmed data into /data and change the feature size. You might want to check this and perhaps update the reported number.

Again, thank you for sharing the code! It would also be great if you could share the implementation of DGI on Reddit (in TensorFlow or whatever works).

@PetarV- (Owner) commented Jan 20, 2019

Is your result single-run, or averaged over multiple runs? I'm not ruling anything out, but it could always be due to PyTorch versions.

@Tiiiger (Author) commented Jan 20, 2019

This is averaged over 10 runs, and I am using PyTorch 1.0.

@PetarV- (Owner) commented Jan 20, 2019

You could've gotten lucky -- try 50 runs, as described in the paper.

But yes, it could well be the PyTorch version; the one I used for the experiments is 0.4.0. I might try to re-run with an upgraded stack when my schedule clears up a little bit. :'(

In either case, thanks for taking the effort to re-run this experiment and for reporting the outcome!
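
A hedged sketch of the multi-run protocol being discussed, with `train_and_eval` as a hypothetical stand-in for execute.py's train/eval loop (the repo does not expose such a function, and the accuracies below are simulated placeholders):

```python
import numpy as np

# Hypothetical stand-in for execute.py's train/eval loop; replace the body
# with a real run that returns downstream classification accuracy (percent).
def train_and_eval(seed: int) -> float:
    rng = np.random.default_rng(seed)
    return 77.0 + rng.normal(scale=0.6)  # simulated placeholder accuracy

# Mean +/- std over 50 independent runs, as in the paper's protocol.
accs = np.array([train_and_eval(seed) for seed in range(50)])
print(f"accuracy over {len(accs)} runs: {accs.mean():.2f} +/- {accs.std():.2f}")
```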

@Tiiiger (Author) commented Jan 20, 2019

Great! I am closing this.

Tiiiger closed this as completed on Jan 20, 2019