SubgraphX example notebook (subgraphx.ipynb) does not work #133
Comments
Hello, thank you for your issue. Increasing the number of input features does not affect the training time much. For the ba_shapes dataset, we only explain the target nodes that lie in the house-like motif, because explanations of nodes in the base graph do not have meaningful patterns. In addition, a node in the base graph usually has a high degree, which means its subgraph is larger and SubgraphX needs more time to explore it.
If that is not the case, feel free to post more details.
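The degree argument above can be illustrated with a toy graph in plain Python. The graph below is a made-up stand-in (a hub node for the base graph plus a small house-like motif), not the actual ba_shapes data:

```python
# Toy illustration: base-graph nodes in a Barabasi-Albert-style graph tend
# to be hubs, so their local subgraphs are much larger than those of motif
# nodes, and SubgraphX has a larger search space around them.
from collections import defaultdict

# Node 0 is a hub in the "base" graph; nodes 10-14 form a house-like
# motif, attached to the base graph through node 5.
base_edges = [(0, i) for i in range(1, 9)]          # hub with degree 8
motif_edges = [(10, 11), (11, 12), (12, 13), (13, 10), (10, 14), (11, 14)]
edges = base_edges + motif_edges + [(5, 10)]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

degree = {n: len(nbrs) for n, nbrs in adj.items()}
print(degree[0], degree[12])  # hub degree vs. motif-node degree -> 8 2
```

The search space SubgraphX must explore grows with the size of the local subgraph, so explaining the hub is far more expensive than explaining a motif node of degree 2.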
OK, thank you for answering. We modified the model as in the following script to test our model:

class GCN_3l(GNNBasic):

We trained the model above on ba_shapes.
I found that the code did not work when I used GCN_3l instead of GCN_2l as the model. It is so slow that it does not seem to run at all.
Hello, thank you for your issue. When you use a 3-layer GCN, the 3-hop neighborhoods in ba_shapes usually contain many nodes, which makes searching for the target subgraphs very time-consuming.
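To see why one extra GCN layer matters so much, here is a small plain-Python sketch counting k-hop neighborhood sizes with BFS. The tree below is a made-up example with branching factor 4, not the real ba_shapes graph, but it shows the same effect: the receptive field of an L-layer GCN is the L-hop neighborhood, and its size grows geometrically with L.

```python
# Sketch: k-hop neighborhood sizes grow geometrically with the hop count,
# so a 3-layer GCN (3-hop receptive field) gives SubgraphX a much larger
# subgraph to search than a 2-layer GCN.
from collections import deque

def k_hop_size(adj, start, k):
    """Number of nodes within k hops of `start` (including start), via BFS."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == k:
            continue
        for nbr in adj[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, dist + 1))
    return len(seen)

def build_tree(branching, depth):
    """Rooted tree with the given branching factor, as an adjacency dict."""
    adj = {0: []}
    next_id = 1
    frontier = [0]
    for _ in range(depth):
        new_frontier = []
        for parent in frontier:
            for _ in range(branching):
                adj[parent].append(next_id)
                adj[next_id] = [parent]
                new_frontier.append(next_id)
                next_id += 1
        frontier = new_frontier
    return adj

adj = build_tree(4, 3)
sizes = [k_hop_size(adj, 0, k) for k in (1, 2, 3)]
print(sizes)  # -> [5, 21, 85]: each extra hop multiplies the search space
```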
Hello, my name is Kyeongrok. I am a student from South Korea.
I am trying to run the SubgraphX example code (subgraphx.ipynb), but it is not working.
In your code, the feature dimension of the ba_shapes graph is just one. To use my model (the same as GCN_3l in models.py) and my data (the same as ba_shapes in the DIG dataset, but with 10-dimensional features), I deleted the line 'dataset.data.x = dataset.data.x[:, :1]' in the second block.
I also modified the GCN_3l model slightly to accept 10-dimensional input.
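For what it is worth, widening the input features only changes the first layer's weight shape; the deeper layers are untouched. A minimal plain-Python sketch (the hidden size 300 and the 4 output classes are assumed values for illustration, not taken from the post):

```python
# Hypothetical sketch: the only shape that changes when moving from
# 1-dimensional to 10-dimensional node features is the first layer's
# weight matrix; all later layers keep their shapes.
def layer_shapes(in_dim, hidden_dim, out_dim, num_layers=3):
    """Weight-matrix shapes of a simple num_layers-layer GNN/MLP stack."""
    shapes = [(in_dim, hidden_dim)]
    shapes += [(hidden_dim, hidden_dim)] * (num_layers - 2)
    shapes.append((hidden_dim, out_dim))
    return shapes

print(layer_shapes(1, 300, 4))   # original 1-dim feature setup
print(layer_shapes(10, 300, 4))  # after switching to 10-dim features
```

So the modification itself is cheap; the slowdown comes from the explainer's search, not from the wider input layer.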
However, SubgraphX spends a lot of time (it does not seem to work).
I would like to ask whether there is a possibility that SubgraphX stops working as the input feature dimension of the data increases.
Thank you for reading. :)