Hi.
Thanks for this awesome library! I was able to visualize some pretty big datasets (5–10 million nodes and 15 million links).
Then I tried to increase the number of nodes to somewhere around 50 million.
I have 32 GB of RAM, for what it's worth.
My current issue is that creating an ngraph.graph instance becomes difficult: with a dataset this large, my PC's memory fills up very fast.
So I was wondering whether there is a way to save several ngraph.graph instances to files and then merge them, so that the result can be processed with ngraph.tobinary.
Or is there another approach to this issue?
Thanks!
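Merging two instances could in principle amount to copying every node and link from one graph into the other. Here is a minimal sketch: `addNode`, `addLink`, `forEachNode`, and `forEachLink` are real ngraph.graph methods, but `createTinyGraph` below is a hypothetical stand-in for `require('ngraph.graph')()` so the snippet is self-contained, and `mergeInto` is a hypothetical helper, not part of the library:

```javascript
// Toy stand-in mirroring the slice of the ngraph.graph API used below.
// With the real library you would write: const g = require('ngraph.graph')();
function createTinyGraph() {
  const nodes = new Map();
  const links = [];
  return {
    addNode(id, data) { nodes.set(id, { id, data }); },
    addLink(fromId, toId) {
      // ngraph.graph also auto-creates missing endpoints on addLink.
      if (!nodes.has(fromId)) nodes.set(fromId, { id: fromId });
      if (!nodes.has(toId)) nodes.set(toId, { id: toId });
      links.push({ fromId, toId });
    },
    forEachNode(cb) { for (const n of nodes.values()) cb(n); },
    forEachLink(cb) { for (const l of links) cb(l); },
    getNodesCount() { return nodes.size; },
    getLinksCount() { return links.length; }
  };
}

// Hypothetical merge helper: copy every node and link of `source` into `target`.
// Nodes present in both graphs are overwritten, so shared ids get deduplicated.
function mergeInto(target, source) {
  source.forEachNode(node => target.addNode(node.id, node.data));
  source.forEachLink(link => target.addLink(link.fromId, link.toId));
  return target;
}

const a = createTinyGraph();
a.addLink(1, 2);
const b = createTinyGraph();
b.addLink(2, 3);
mergeInto(a, b); // a now has nodes 1, 2, 3 and two links
```

Note that this only helps with keeping the subgraphs in separate files between runs; the merged result still has to fit in memory at once before it can be handed to ngraph.tobinary.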
I'm also wondering about that, yeah. I have several subgraphs that I would like to combine into one mega-graph for pathfinding.
I didn't manage to merge several subgraphs.
Instead I hit a different bottleneck: per-tab browser memory.
Basically, if the generated files are too big, the browser won't be able to handle your graph.
Different browsers have different limits. Google Chrome, for example, can handle up to 4 GB of data per tab, which is roughly 14 million nodes.
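The rough arithmetic behind that estimate, assuming the ~4 GB figure is the whole budget and ignoring link data (a simplification; the real per-node footprint depends on what data each node carries):

```javascript
// Dividing Chrome's ~4 GB per-tab limit by ~14 million nodes gives the
// implied average memory budget per node.
const tabLimitBytes = 4 * 1024 ** 3; // ~4 GB
const nodeCount = 14e6;              // ~14 million nodes
const bytesPerNode = Math.round(tabLimitBytes / nodeCount);
console.log(bytesPerNode); // → 307, i.e. ~300 bytes per node
```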