Hi @danielegrattarola,

I just found this library and am excited to start using it! I really appreciate the examples and great documentation. After reading, I still have one question: do you have any advice on modeling dynamically sized graphs (where the node count varies from graph to graph)?
My naive instinct is to train a network with a large N and zero-pad the unused elements when the number of nodes is < N. I am not sure if the zeros should be at the beginning, end, or just anywhere. I am interested to hear your input.
Thanks in advance for any advice!
-Jack
If the graphs are small enough to be zero-padded, then you can use spektral.utils.numpy_to_batch to do exactly what you suggest (this is called "batch mode" in the documentation). Note that this wastes a lot of memory, because you end up with dense arrays that are mostly zeros.
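To make the padding idea concrete, here is a minimal NumPy sketch (not Spektral's own implementation). It pads at the end, so real nodes keep their indices 0..n-1 and the trailing rows/columns are all-zero "ghost" nodes; since a ghost node has no edges and no features, its position doesn't affect the graph, and padding at the end is the usual convention:

```python
import numpy as np

def pad_graph(A, X, N):
    """Zero-pad adjacency A (n x n) and features X (n x F) up to N nodes.

    Padding goes at the end: rows/columns n..N-1 are all-zero
    "ghost" nodes with no edges and no features.
    """
    n = A.shape[0]
    A_pad = np.zeros((N, N), dtype=A.dtype)
    A_pad[:n, :n] = A
    X_pad = np.zeros((N, X.shape[1]), dtype=X.dtype)
    X_pad[:n] = X
    return A_pad, X_pad

# Two graphs with different node counts, padded to a common N = 4.
A1 = np.array([[0, 1], [1, 0]])  # 2-node graph
X1 = np.ones((2, 3))
A2 = np.eye(3, dtype=int)        # 3-node graph (self-loops only)
X2 = np.ones((3, 3))

pairs = [pad_graph(A1, X1, 4), pad_graph(A2, X2, 4)]
batch_A = np.stack([p[0] for p in pairs])  # shape (2, 4, 4)
batch_X = np.stack([p[1] for p in pairs])  # shape (2, 4, 3)
print(batch_A.shape, batch_X.shape)
```

The dense (batch, N, N) adjacency tensor is what makes this mode memory-hungry when N is large.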
Another possibility is to use "disjoint mode", where you take the disjoint union of your graphs and treat it as a single graph. This is more memory-efficient, because the adjacency matrix can be a sparse tensor, and it avoids zero-padding entirely.
Use spektral.utils.numpy_to_disjoint to convert a list of graphs into a single disjoint graph.
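The disjoint union itself is easy to sketch with SciPy (again, a hand-rolled illustration, not Spektral's API): node features are stacked vertically and the adjacency matrices form a sparse block-diagonal matrix, so no cross-graph edges exist. An indicator vector mapping each node back to its graph is also needed so that pooling layers can aggregate per-graph outputs:

```python
import numpy as np
from scipy.sparse import block_diag, csr_matrix

def to_disjoint(adjs, feats):
    """Disjoint union of a list of graphs.

    Returns the block-diagonal sparse adjacency A, the stacked
    node features X, and an indicator vector I where I[k] is the
    index of the graph that node k belongs to.
    """
    A = block_diag([csr_matrix(a) for a in adjs], format="csr")
    X = np.vstack(feats)
    I = np.repeat(np.arange(len(adjs)), [a.shape[0] for a in adjs])
    return A, X, I

A1 = np.array([[0, 1], [1, 0]])  # 2-node graph
X1 = np.ones((2, 3))
A2 = np.eye(3)                   # 3-node graph
X2 = np.zeros((3, 3))

A, X, I = to_disjoint([A1, A2], [X1, X2])
print(A.shape, X.shape, I)  # a 5-node graph; I marks graph membership
```

Because the combined adjacency is block-diagonal, message passing never leaks information between graphs, so a GNN can process the union exactly as if each graph were in its own batch slot.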