
SubgraphX example ipynb code does not work #133

Closed
Park-kyeong-rok opened this issue Jul 31, 2022 · 5 comments

Comments

@Park-kyeong-rok

Hello, my name is Kyeongrok. I am a student from South Korea.

I am trying to run the example code for SubgraphX (subgraphx.ipynb), but it is not working. In your code, the feature dimension of the BA_shapes graph is just one. To use my own model (the same as GCN_3l in models.py) and data (the same as BA_shapes in the DIG dataset, but with 10-dimensional features), I deleted the line `dataset.data.x = dataset.data.x[:, :1]` in the second block and modified the GCN_3l model slightly to accept 10-dimensional input.

However, SubgraphX takes a very long time (it seems not to work). I would like to ask whether SubgraphX may fail to scale as the input feature dimension of the data increases.

Thank you for reading. :)

@Oceanusity
Collaborator

Hello, thank you for your issue. Increasing the number of input features does not significantly affect the running time. For the ba_shapes dataset, we only explain target nodes inside the house-like motif, because explanations for nodes in the base graph do not have meaningful patterns. In addition, nodes in the base graph usually have a high degree, which means the subgraph is larger and SubgraphX needs more time to explore it.

@Oceanusity
Collaborator

If that is not the case, you are welcome to post more details.

@Park-kyeong-rok
Author

OK, thank you for answering.

We modified the model as in the script below to test it. Everything is the same except that we added a normalization layer. We trained this model on BA_shapes and wanted to generate explanations with subgraphx.ipynb, but it does not work in examples/subgraphx.ipynb.

import torch
import torch.nn as nn
import torch.nn.functional as F
# GNNBasic, GCNConv, IdenticalPool, and GlobalMeanPool come from DIG's models.py

class GCN_3l(GNNBasic):

    def __init__(self, model_level, dim_node, dim_hidden, num_classes):
        super().__init__()
        num_layer = 3

        self.conv1 = GCNConv(dim_node, dim_hidden)
        self.convs = nn.ModuleList(
            [GCNConv(dim_hidden, dim_hidden) for _ in range(num_layer - 1)]
        )
        self.relu1 = nn.ReLU()
        self.relus = nn.ModuleList(
            [nn.ReLU() for _ in range(num_layer - 1)]
        )
        if model_level == 'node':
            self.readout = IdenticalPool()
        else:
            self.readout = GlobalMeanPool()

        self.ffn = nn.Sequential(nn.Linear(dim_hidden, num_classes))
        self.dropout = nn.Dropout()

    def forward(self, *args, **kwargs) -> torch.Tensor:
        """
        :param Required[data]: Batch - input data
        :return:
        """
        x, edge_index, batch = self.arguments_read(*args, **kwargs)
        edge_weights = torch.ones(edge_index.size(1))  # (unused)

        # L2-normalize each conv output before the ReLU (the only change)
        post_conv = self.relu1(F.normalize(self.conv1(x, edge_index), p=2, dim=1))
        for conv, relu in zip(self.convs, self.relus):
            post_conv = relu(F.normalize(conv(post_conv, edge_index), p=2, dim=1))

        out_readout = self.readout(post_conv, batch)
        out = self.ffn(out_readout)
        return out

    def get_emb(self, *args, **kwargs) -> torch.Tensor:
        x, edge_index, batch = self.arguments_read(*args, **kwargs)
        post_conv = self.relu1(F.normalize(self.conv1(x, edge_index), p=2, dim=1))
        for conv, relu in zip(self.convs, self.relus):
            post_conv = relu(F.normalize(conv(post_conv, edge_index), p=2, dim=1))
        return post_conv

Again, we trained the model above on BA_shapes.

@Park-kyeong-rok
Author

I found that the code does not work when I use GCN_3l instead of GCN_2l as the model. It is so slow that it does not seem to finish.

@Oceanusity
Collaborator

Hello, thank you for the details. When you use a 3-layer GCN, the 3-hop neighborhoods in ba_shapes usually contain many nodes, which makes the search over target subgraphs very time-consuming.
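The neighborhood blow-up can be checked with a small stdlib-only sketch. The graph below is a generic preferential-attachment graph, a hypothetical stand-in for the BA_shapes base graph (not DIG's actual dataset): around a hub node, the k-hop subgraph grows rapidly with k, which is why a 3-layer model forces SubgraphX to search a much larger subgraph than a 2-layer one.

```python
import random
from collections import deque

def barabasi_albert(n, m, seed=0):
    # Build a Barabasi-Albert-style graph by preferential attachment.
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    targets = list(range(m))
    repeated = []  # nodes repeated once per incident edge (degree-weighted pool)
    for new in range(m, n):
        for t in targets:
            adj[new].add(t)
            adj[t].add(new)
        repeated.extend(targets)
        repeated.extend([new] * m)
        # pick m distinct attachment targets with probability ~ degree
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(repeated))
        targets = list(chosen)
    return adj

def k_hop_size(adj, src, k):
    # BFS up to depth k; count nodes in the k-hop subgraph around src.
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, depth + 1))
    return len(seen)

adj = barabasi_albert(300, 2)
hub = max(adj, key=lambda v: len(adj[v]))  # a high-degree "base graph" node
sizes = [k_hop_size(adj, hub, k) for k in (1, 2, 3)]
for k, s in zip((1, 2, 3), sizes):
    print(f"{k}-hop subgraph around hub: {s} nodes")
```

Since SubgraphX's MCTS explores subsets of the computation graph, each extra hop multiplies the candidate space, so even a modest k-hop growth translates into a large increase in search time.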
