According to the example code, it computes the distance between arbitrary node pairs, which has O(n^2) complexity and leads to OOM. How can it scale to a large-scale dataset such as Reddit?
Thank you~
Thank you for your interest in our work!
In fact, our method does not need the distances of all node pairs at the same time.
Therefore, when the machine cannot hold the full distance matrix, you can avoid the out-of-memory problem by only using the data when needed: save each node's result to disk and read it back when needed, or recompute the result on demand.