
MAGNN training memory consumption #19

Closed
esprit1995 opened this issue Feb 4, 2021 · 2 comments


@esprit1995

Hello,

First of all, thank you for your paper and your code; they are a big help for anyone interested in the HNE problem.
I have a question regarding MAGNN training. I tried fitting it to the unattributed PubMed dataset (the smallest of the four) in an unsupervised fashion, but couldn't: shortly after training started, all 32 GB of RAM I have available were consumed and the script crashed.
I didn't change the MAGNN parameters after cloning the repo, and I ran the data transform stage as described in the README. Am I doing something wrong? I'm not sure how a 118 MB dataset and a 2-layer network could exhaust that much memory. How much memory did you need for this task when producing the results in the paper?
I have seen in the closed issues that someone else ran into the same problem, but there was no definite resolution there :(
Please help :)

@xiaoyuxin1002
Collaborator

The data transform stage identifies all 1-hop and 2-hop metapaths by default. Since PubMed has 10 different link types, this produces a relatively large number of metapaths for training. You may wish to select a subset of those metapaths before feeding them into MAGNN. I would suggest starting with a total of 10 metapaths and seeing whether that solves your problem.
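The selection step above could look something like the following sketch. Note that the metapath representation (tuples of node types), the example metapaths, and the `select_metapaths` helper are all illustrative assumptions for this thread, not part of the MAGNN codebase:

```python
# Hypothetical sketch: trim the metapath list before the MAGNN data transform.
# Metapaths are represented here as tuples of node types; a tuple of n node
# types corresponds to an (n - 1)-hop metapath.

def select_metapaths(metapaths, max_hops=1, limit=10):
    """Keep only metapaths with at most `max_hops` hops, capped at `limit`."""
    short = [mp for mp in metapaths if len(mp) - 1 <= max_hops]
    return short[:limit]

# Example metapaths (made up for illustration).
all_metapaths = [
    ("paper", "author"),                  # 1-hop
    ("paper", "author", "paper"),         # 2-hop
    ("gene", "disease"),                  # 1-hop
    ("gene", "disease", "chemical"),      # 2-hop
]

selected = select_metapaths(all_metapaths, max_hops=1, limit=10)
print(selected)  # only the 1-hop metapaths remain
```

Restricting to 1-hop metapaths (as the follow-up comment does) is the simplest filter; another option is to hand-pick the ~10 metapaths that are most semantically meaningful for the dataset.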

@esprit1995
Author

That helped, thanks. I kept only the 1-hop metapaths to start with.
Could you please share the metapaths you used to obtain the results presented in the paper?
