Contributing more pooling and convolution layers #49
Hi, I'd be happy to add your code to the framework. I should probably improve the contribution guidelines; here's more or less what it would be good to have:
Tests are a bit different. It's better if you copy-paste what I did for the other methods, or just open a PR and I'll add the test cases myself. I've never had big PRs, so it's better if we work together on this one :D Cheers,
Okay, wonderful! Best,
ENH (issue #49): Added SortPool layer to spektral.layers.pooling.globalpool
Is there any place where I can put utility layers used for convolutional layers without them cluttering the convolutional layer docs?
Sure! If it's a proper Layer you can put it in
ENH (issue #49): Added DiffusionConvolution, tests and a script to easily build docs locally.
Alright, I believe all my implementations have now been merged into the development branch and this issue can be closed. Now that I feel comfortable with this framework, I'd really like to contribute more. Is there any to-do list I can consult to help with further development?
Thanks again for contributing! I will be merging the new layers into
I am also starting a TODO list in Projects to keep track of what is missing in the library. Cheers,
FYI I have just pushed to
Very nice! Out of curiosity, is disjoint mode supposed to be an internal data storage format, mainly served by Spektral's data loaders, or do we expect many users to have their own data in that format? Apart from TU, I am not aware of many datasets in that format. Furthermore, I expect that many users will have batch-mode signal and adjacency data with a ragged node dimension. It may be worth providing methods for working with graph datasets that have a ragged node dimension. Best,
Disjoint mode solves exactly the problem of having to work with ragged tensors. I have looked into tf.ragged several times since I started this project, but there's always some problem that makes it impossible to move the whole framework to a ragged representation of graphs. This way of representing graphs is not limited to the TU data; it's a fairly common representation (sometimes also called an "edge list"... it's all the same, really). The new OGB datasets are represented like this, for instance, and so are many network datasets found online. I think it's pretty convenient, and I don't see a clear advantage of batch mode over disjoint, especially because it is impossible to store batch data as sparse matrices.

So let's say that you have two graphs with a different number of nodes, stored in lists of matrices:

```python
A_list = [A_1, A_2]
X_list = [X_1, X_2]
```

You can convert them to batch mode (all graphs have the same size):

```python
from spektral.utils.data import numpy_to_batch

A_batch, X_batch = numpy_to_batch(A_list, X_list)
```

Or you can convert them to disjoint mode:

```python
from spektral.utils.data import numpy_to_disjoint

A_disjoint, X_disjoint, I_disjoint = numpy_to_disjoint(A_list, X_list)
```

Note that

I hope that answers your doubts. Cheers,
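To make the disjoint representation concrete, here is a minimal NumPy/SciPy sketch of what the conversion produces: a block-diagonal adjacency matrix, vertically stacked node features, and an index vector assigning each node to its graph. The toy graphs are made up for illustration; this is not Spektral's actual implementation.

```python
import numpy as np
from scipy.sparse import block_diag

# Two toy graphs: 2 and 3 nodes, 2 features per node (illustrative data).
A_1 = np.array([[0, 1], [1, 0]])
A_2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
X_1 = np.ones((2, 2))
X_2 = np.zeros((3, 2))

# Disjoint mode: one big block-diagonal adjacency (storable as a sparse
# matrix), stacked node features, and a node-to-graph index vector.
A_disjoint = block_diag([A_1, A_2])     # sparse matrix of shape (5, 5)
X_disjoint = np.vstack([X_1, X_2])      # shape (5, 2)
I_disjoint = np.array([0, 0, 1, 1, 1])  # node i belongs to graph I_disjoint[i]

print(A_disjoint.shape, X_disjoint.shape, I_disjoint)
```

Because the block-diagonal adjacency stays sparse, the disjoint batch costs memory proportional to the number of edges, which is the advantage over zero-padded batch mode mentioned above.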
- Added the global pooling layer "SortPool" that keeps the top *k* most relevant nodes, as described by [Zhang et al.](https://www.cse.wustl.edu/~muhan/papers/AAAI_2018_DGCNN.pdf). For now it only supports single and batch mode.
- Covered it in the standard pooling tests (`tests/test_layers/test_pooling.py`) without adding any specific tests.
- Added `.cache` to `.gitignore` -- it's an irrelevant folder that pytest sometimes creates when it fails a test.
- Added a reference to SortPool in `README.md`.
ENH (issue #49): Added SortPool layer to spektral.layers.pooling.globalpool
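For readers unfamiliar with SortPool, the core idea from the Zhang et al. paper can be sketched in a few lines of NumPy: sort the nodes by one feature channel and keep the top *k* rows, zero-padding graphs with fewer than *k* nodes. The `sort_pool` helper below is hypothetical and illustrative only; it is not the layer added in this PR.

```python
import numpy as np

def sort_pool(X, k):
    """Toy SortPool sketch (not Spektral's implementation): sort the
    node features X by their last channel, descending, keep the top-k
    rows, and zero-pad graphs that have fewer than k nodes."""
    order = np.argsort(-X[:, -1])   # indices sorted by last channel, descending
    X_top = X[order][:k]
    if X_top.shape[0] < k:          # pad small graphs up to k rows
        pad = np.zeros((k - X_top.shape[0], X.shape[1]))
        X_top = np.vstack([X_top, pad])
    return X_top

# Three nodes with last-channel scores 3.0, 1.0, 2.0; keep the top two.
X = np.array([[0.1, 3.0], [0.2, 1.0], [0.3, 2.0]])
print(sort_pool(X, 2))  # rows with scores 3.0 and 2.0, in that order
```

Fixing the output to exactly *k* rows is what gives every graph in a batch the same pooled shape, which is why this works in batch mode.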
- Added DiffusionConvolution to `layers.convolutional`. Also covered it in tests and made sure the docs look good.
- Improved docs for SortPool and covered it in the globalpool tests (removed from the pool tests).
- Added both SortPool and DiffusionConvolution to `autogen.py` so they are included in the docs.
- Added `docs/local_build.sh`, a script that installs the current Spektral build from the local branch, builds the docs, and serves them on a local webserver, for convenience.
ENH (issue #49): Added DiffusionConvolution, tests and a script to easily build docs locally.
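The general idea behind diffusion convolutions is to propagate node features along increasing powers of the row-normalized transition matrix P = D⁻¹A. The `diffusion_features` helper below is a made-up NumPy sketch of that propagation step, not the DiffusionConvolution layer added here.

```python
import numpy as np

def diffusion_features(A, X, K):
    """Toy diffusion sketch (not Spektral's layer): stack the node
    features X propagated through powers P^0, ..., P^(K-1) of the
    row-normalized transition matrix P = D^-1 A."""
    P = A / A.sum(axis=1, keepdims=True)  # row-normalized transition matrix
    out, Pk = [], np.eye(A.shape[0])      # Pk starts at P^0 = I
    for _ in range(K):
        out.append(Pk @ X)                # features after k diffusion steps
        Pk = P @ Pk                       # advance to the next power of P
    return np.stack(out)                  # shape (K, num_nodes, num_features)

# Two connected nodes with identity features; two diffusion steps.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
X = np.array([[1.0, 0.0], [0.0, 1.0]])
print(diffusion_features(A, X, 2).shape)  # (2, 2, 2)
```

In an actual layer these K diffused feature tensors would then be mixed by learnable weights; the sketch only shows the fixed propagation part.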
Hey,
I have implementations of the following few graph neural network components:
Convolutions:
Pooling:
I would love to add these to your framework; however, I am a bit lost with all the requirements that my implementations need to fulfil in order to integrate seamlessly. Is there any guide for what tests, features, or properties layers need to have?
Best,
Levi