Thanks for sharing the code! When I use preprocess_DBLP.ipynb to search for metapath instances over other nodes in the DBLP dataset, it consistently requires a very large amount of memory. Does the author know the reason for this?
Yes, preprocess_DBLP.ipynb does require a large amount of memory. You can simply download and use the preprocessed dataset, or revise the get_metapath_neighbor_pairs function to do some sub-sampling and reduce the dataset size.
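For anyone attempting the sub-sampling route, here is a rough sketch of the idea: cap the number of metapath-based neighbor pairs kept per source node. Note this is only an illustration; the function name `subsample_metapath_pairs` and the flat `(src, dst)` pair representation are assumptions for the example, not the actual data layout used inside get_metapath_neighbor_pairs.

```python
import random
from collections import defaultdict

def subsample_metapath_pairs(pairs, max_per_node=100, seed=0):
    """Randomly keep at most `max_per_node` metapath instances per source node.

    `pairs` is a list of (src, dst) tuples representing metapath-based
    neighbor pairs. High-degree nodes can generate a combinatorial number
    of metapath instances, so capping the count per source node bounds
    the memory required downstream.
    """
    rng = random.Random(seed)
    by_src = defaultdict(list)
    for src, dst in pairs:
        by_src[src].append((src, dst))
    sampled = []
    for group in by_src.values():
        if len(group) > max_per_node:
            group = rng.sample(group, max_per_node)
        sampled.extend(group)
    return sampled
```

The per-node cap (rather than a global one) keeps low-degree nodes fully intact while trimming only the hub nodes that dominate memory usage.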