Hi, I was trying to reproduce the synthetic graph experiments and I found what look like two bugs, both in `data.py`: https://github.com/jrwnter/pigvae/blob/c3dfcef252f2bf7d34ee4c8dca2ca5a605fa894b/pigvae/synthetic_graphs/data.py#L159C13-L169

When `dm` is converted with `.long()` on line 164, infinite entries in `dm` (indicating two unconnected nodes) turn into very large negative numbers, and the subsequent clamping then turns them into zeros. As a result, both unconnected node pairs and self-connections are coded as the first entry of the one-hot vectors. I don't know how big a problem this is, but it is relatively straightforward to fix by moving the conversion to `.long()` after the clamping.

Another issue: after `num_nodes` is reassigned on line 166, it always equals the maximum number of nodes, so the mask always contains only `True`. This is also straightforward to fix without reassigning `num_nodes`.

Please let me know if I misunderstood something and these are not actually bugs!
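For reference, here is a minimal sketch of what both fixes could look like. The original code in `data.py` uses PyTorch; this NumPy translation (with a hypothetical helper name and parameter names) only illustrates the two changes, not the repo's actual implementation:

```python
import numpy as np

def encode_distance_matrix(dm, num_nodes, max_nodes, max_dist):
    """Hypothetical helper illustrating the two proposed fixes.

    dm        : float matrix of shortest-path distances, np.inf for
                unconnected node pairs
    num_nodes : actual number of nodes in this graph
    max_nodes : padded graph size
    max_dist  : largest distance bucket for the one-hot encoding
    """
    # Fix 1: clamp BEFORE casting to integer. Casting inf to int64
    # first yields a huge negative number, which the clamp would then
    # map to 0, conflating unconnected pairs with self-connections.
    # Clamping first sends inf to the last bucket, max_dist.
    dm_clamped = np.clip(dm, 0, max_dist).astype(np.int64)
    one_hot = np.eye(max_dist + 1, dtype=np.int64)[dm_clamped]

    # Fix 2: build the padding mask from the original num_nodes
    # instead of reassigning num_nodes = max_nodes, which would make
    # the mask all-True.
    mask = np.arange(max_nodes) < num_nodes
    return one_hot, mask
```

With this ordering, an unconnected pair lands in the last one-hot bucket (`max_dist`) while a self-connection stays in bucket 0, so the two cases remain distinguishable, and the mask correctly marks only the first `num_nodes` positions as valid.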