
Message Passing Layer of PointGNN #6194

Merged: 15 commits into pyg-team:master on Dec 13, 2022
Conversation

andreasbinder (Contributor)

I closed this pull request by mistake and solved the conflicts
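For readers unfamiliar with the layer this PR adds: PointGNN (Shi & Rajkumar, CVPR 2020) updates each vertex state with three MLPs, matching the MLP_h, MLP_f, and MLP_g that appear later in this thread. Below is a minimal pure-PyTorch sketch of that update under my reading of the paper; the function name, argument order, and aggregation details are illustrative assumptions, not the API this PR implements.

```python
import torch
from torch import nn

def point_gnn_step(x, s, edge_index, mlp_h, mlp_f, mlp_g):
    """One PointGNN update (sketch, not the PR's PointGNNConv).

    x: (N, 3) vertex coordinates, s: (N, F) vertex states,
    edge_index: (2, E) with messages flowing src -> dst.
    Paper's update:
        dx_i = MLP_h(s_i)
        e_ij = MLP_f([x_j - x_i + dx_i, s_j])
        s_i' = MLP_g(max_j e_ij) + s_i
    """
    src, dst = edge_index
    dx = mlp_h(s)                                   # predicted coordinate offsets
    rel = x[src] - x[dst] + dx[dst]                 # shifted relative positions
    e = mlp_f(torch.cat([rel, s[src]], dim=-1))     # per-edge features

    # Max-aggregate edge features per target node.
    out = e.new_full((s.size(0), e.size(-1)), float('-inf'))
    out.scatter_reduce_(0, dst.unsqueeze(-1).expand_as(e), e, reduce='amax')
    out = torch.where(torch.isinf(out), torch.zeros_like(out), out)  # isolated nodes

    return mlp_g(out) + s                           # residual state update

# Illustrative usage with the same shapes as the test below.
x = torch.randn(6, 3)
s = torch.randn(6, 3)
edge_index = torch.tensor([[0, 1, 1, 1, 2, 5], [1, 2, 3, 4, 3, 4]])
mlp_h = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))
mlp_f = nn.Sequential(nn.Linear(6, 64), nn.ReLU(), nn.Linear(64, 3))
mlp_g = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))
out = point_gnn_step(x, s, edge_index, mlp_h, mlp_f, mlp_g)
```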

codecov bot commented Dec 9, 2022

Codecov Report

Merging #6194 (ffc1d07) into master (90de72f) will increase coverage by 0.02%.
The diff coverage is 100.00%.

❗ Current head ffc1d07 differs from pull request most recent head c2c2ccb. Consider uploading reports for the commit c2c2ccb to get more accurate results

@@            Coverage Diff             @@
##           master    #6194      +/-   ##
==========================================
+ Coverage   84.51%   84.53%   +0.02%     
==========================================
  Files         372      373       +1     
  Lines       20805    20833      +28     
==========================================
+ Hits        17584    17612      +28     
  Misses       3221     3221              
Impacted Files Coverage Δ
torch_geometric/nn/conv/__init__.py 100.00% <100.00%> (ø)
torch_geometric/nn/conv/point_gnn_conv.py 100.00% <100.00%> (ø)


@rusty1s rusty1s changed the title V2: Message Passing Layer of PointGNN Message Passing Layer of PointGNN Dec 9, 2022
rusty1s (Member) left a comment


I think this looks great. Can we also add a simple test for this?

andreasbinder (Contributor, Author)

Hi @rusty1s! Sorry to bother you, but I am not sure how to write the jittable part of the test (or whether I am even supposed to run that part; I removed the if clause in my local test).

import torch
from torch_sparse import SparseTensor

from torch_geometric.nn import MLP, PointGNNConv

x = torch.randn(6, 3)
edge_index = torch.tensor([[0, 1, 1, 1, 2, 5], [1, 2, 3, 4, 3, 4]])
row, col = edge_index
adj = SparseTensor(row=row, col=col, sparse_sizes=(6, 6))
MLP_h = MLP([3, 64, 3])
MLP_f = MLP([6, 64, 3])
MLP_g = MLP([3, 64, 3])

conv = PointGNNConv(state_channels=3, MLP_h=MLP_h, MLP_f=MLP_f, MLP_g=MLP_g)
assert str(conv) == ('PointGNNConv(3, lin_h=MLP(3, 64, 3), '
                     'lin_f=MLP(6, 64, 3), lin_g=MLP(3, 64, 3))')
out = conv(x, x, edge_index)

assert out.size() == (6, 3)
assert torch.allclose(conv(x, x, adj.t()), out, atol=1e-6)

if True:  # jittable part; originally guarded by an if clause
    t = '(Tensor, Tensor, Tensor) -> Tensor'
    jit = torch.jit.script(conv.jittable(t))
    assert jit(x, x, edge_index).tolist() == out.tolist()

    t = '(Tensor, Tensor, SparseTensor) -> Tensor'
    jit = torch.jit.script(conv.jittable(t))
    assert torch.allclose(jit(x, x, adj.t()), out, atol=1e-5)

I could not infer the error from the trace:
[screenshot of the error trace attached]
I can attach the whole trace if you think it helps. Thank you very much!


rusty1s commented Dec 13, 2022

Don't worry about it, I can take care of it. Do you want to push your tests so that I can take a look?

andreasbinder (Contributor, Author)

Amazing! I added the test file (and some annotations to the conv file) @rusty1s

@rusty1s rusty1s enabled auto-merge (squash) December 13, 2022 16:41
@rusty1s rusty1s merged commit 66f3736 into pyg-team:master Dec 13, 2022
andreasbinder (Contributor, Author)

Thank you for the useful changes @rusty1s ! Just out of interest, do you know why the jittable test failed (so that I know in the future)? Thank you!


rusty1s commented Dec 17, 2022

I am not so sure anymore. I think it had something to do with the missing size=None in propagate, which we sadly currently need for TorchScript support :(
