
Remove size parameter from forward methods #3713

Open
saiden89 opened this issue Dec 16, 2021 · 4 comments

@saiden89
Contributor

🛠 Proposed Refactor

Modify all forward methods so they no longer accept a size parameter in their signatures, and update the tests accordingly.

Motivation

Taking a hint from commits e42ac59 and 6c79652, it seems that the size parameter of the forward function is being deprecated, as calling it without passing any value still results in the correct propagation of tensors.

Pitch

To the best of my knowledge, there is no need to let the user explicitly set the size, as many layers can already safely infer it from the input values. See the sketch below.
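
A minimal sketch of the non-bipartite case, using SAGEConv with illustrative shapes (the random graph and feature sizes are assumptions for the example only):

```python
import torch
from torch_geometric.nn import SAGEConv

# Non-bipartite case: a single node feature matrix is given, so the number of
# nodes (and hence the size) can be inferred directly from x.
x = torch.randn(10, 16)                     # 10 nodes, 16 features each
edge_index = torch.randint(0, 10, (2, 30))  # 30 random edges

conv = SAGEConv(16, 32)
out = conv(x, edge_index)  # no size argument; inferred as (10, 10)
print(out.shape)           # torch.Size([10, 32])
```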

Additional context

I'm unsure whether this refactoring can apply to all layers. Please let me know if there are specific use cases where it is best to leave size in the signature.

@rusty1s
Member

rusty1s commented Dec 17, 2021

Yes, this is a great point. In general, we only need support for size when we operate on a bipartite graph in which one of the node feature matrices (source or destination) can be None. For example, this is possible in GATConv, where the input can be OptPairTensor = Tuple[Tensor, Optional[Tensor]].

However, the commits indeed suggest that this is not consistent across all PyG operators, so I'm happy to refactor this :)
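
A minimal sketch of that bipartite case (the shapes and the add_self_loops=False flag are illustrative assumptions):

```python
import torch
from torch_geometric.nn import GATConv

# Bipartite case: 4 source nodes carry features, but the 2 destination nodes
# do not (their feature matrix is None), so the layer cannot infer how many
# destination nodes exist; this is where size is still required.
x_src = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 2, 3],    # source node indices
                           [0, 0, 1, 1]])   # destination node indices

conv = GATConv((16, 16), out_channels=8, add_self_loops=False)
out = conv((x_src, None), edge_index, size=(4, 2))
print(out.shape)  # torch.Size([2, 8])
```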

@saiden89
Contributor Author

Cool! Reading the documentation, it seems that the following convolutional layers currently still use size:

  • SAGEConv
  • GraphConv
  • GATConv
  • GATv2Conv
  • GINConv
  • GINEConv
  • MFConv
  • GMMConv
  • SplineConv
  • NNConv
  • GENConv
  • GeneralConv
  • HEATConv

As I only recently took an interest in PyG's technical internals, it would be helpful to have a list of the layers that should be refactored, leaving the rest as they are.

@rusty1s
Member

rusty1s commented Dec 21, 2021

Thank you! I just checked and it looks like only GATv2Conv and HEATConv do not support OptPairTensor while they still have support for size in the forward call. Let me know if you have any interest in removing this for these two :)
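
For illustration, a hedged sketch of what dropping size from a forward signature could look like; the class names and types below are hypothetical and do not reproduce the real GATv2Conv or HEATConv signatures:

```python
from typing import Optional, Tuple
from torch import Tensor


class ConvBefore:
    def forward(self, x: Tensor, edge_index: Tensor,
                size: Optional[Tuple[int, int]] = None) -> Tensor:
        # size is accepted, but for a single (non-bipartite) feature matrix
        # it adds no information the layer cannot infer itself.
        ...


class ConvAfter:
    def forward(self, x: Tensor, edge_index: Tensor) -> Tensor:
        # size is dropped; the number of nodes is taken from x.size(0).
        ...
```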

@saiden89
Contributor Author

Of course, that's why I opened the issue :).
