
MetaGraphs.jl integration #6

Closed
yuehhua opened this issue Nov 5, 2019 · 12 comments
Labels
enhancement New feature or request

Comments


yuehhua commented Nov 5, 2019

So far, MetaGraph and related objects cannot be put into the model directly.
We should provide a way to let users specify which node/edge features should be used in the model.
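To make the idea concrete, here is a minimal sketch of the kind of feature-selection API this could enable. It is an assumption on my part, not the package's design: the per-node `Dict`s below stand in for the property dictionaries a MetaGraph stores (`props(g, v)` in MetaGraphs.jl), so the snippet runs with plain Julia.

```julia
# Stand-in for MetaGraphs' per-node property dicts (hypothetical data).
props = [Dict(:mass => 1.0,  :charge => -1.0, :label => "a"),
         Dict(:mass => 12.0, :charge => 0.0,  :label => "b"),
         Dict(:mass => 16.0, :charge => 0.5,  :label => "c")]

# Build a feature matrix from a user-chosen subset of node properties;
# each column is one node, each row one selected feature.
feature_matrix(props, keys) = [Float64(p[k]) for k in keys, p in props]

X = feature_matrix(props, (:mass, :charge))
size(X)  # (2, 3): 2 selected features × 3 nodes
```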

@yuehhua yuehhua added the enhancement New feature or request label Nov 5, 2019

yuehhua commented Nov 21, 2019

Accepted: constructing layers from a MetaGraph (#10).

@rkurchin

I'd be very interested in this feature. Since I imagine it will take a while to implement (I was playing around with trying to do it myself, but much of MetaGraphs seems to work quite differently from e.g. SimpleWeightedGraphs), I'm curious about interim solutions. In particular, I'd like to do something akin to the GCNConv layer, but be able to accept different graphs (i.e. adjacency matrices), for which the input matrices would have different dimensions. Do you have thoughts on the best way to build a model of that type at the moment? Most of the Flux examples I've come across take only one vector (or matrix) of input.


yuehhua commented Mar 26, 2020

@rkurchin Thank you for your interest in this project. I'm happy to try something different here. However, I'm curious: what do you mean by a layer that is able to accept different graphs? What purpose do these different graphs serve? Could you give more detail?

@rkurchin

Sure, apologies if it wasn't clear. Basically, I'm trying to make a Julia implementation of Crystal Graph Convolutional Neural Nets. In these models, the inputs are different crystal structures (represented as graphs, with feature vectors for each node representing atomic features) and the outputs are quantities we wish to predict. Does that clarify things? I actually have an almost-working (very preliminary) version right now that I plan to post on GitHub as soon as I debug a few last things with the actual training of the network, which will hopefully make what I'm looking for even clearer.


yuehhua commented Mar 27, 2020

I have seen your post on the Julia Discourse, and I think I understand what you want. In other words, you want a layer that accepts two streams: one for the features and the other for the graph structure, so that the graph within the layer can be changed dynamically while the same layer parameters are reused. Is that right?

@rkurchin

Yes. The way I've been trying to implement it is that the layer accepts two inputs: one for the adjacency matrix of the graph (say, dimensions n x n), and another for the feature matrix representing the features of each node (say, dimensions f x n). I've been talking to Dhairya Gandhi about this as well and he's going to help me try to debug some things – if you'd like to have a look I can send you the code too.
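The two-stream design described above can be sketched in a few lines of plain Julia. This is a minimal illustration under assumed names (`TwoStreamConv`, `norm_adj` are mine, not the package's): the layer stores only its weights, is called on an (adjacency, features) pair, and so works for graphs of any size.

```julia
using LinearAlgebra

# A GCN-style layer that takes the graph as data rather than storing it.
struct TwoStreamConv{M<:AbstractMatrix,F}
    weight::M   # (out_features, in_features)
    σ::F        # elementwise activation
end

# Symmetric normalization with self-loops: D^(-1/2) (A + I) D^(-1/2).
function norm_adj(A::AbstractMatrix)
    Ã = A + I
    d = vec(sum(Ã, dims=2))
    Dinv = Diagonal(1 ./ sqrt.(d))
    Dinv * Ã * Dinv
end

# A is (n_nodes, n_nodes); X is (in_features, n_nodes).
# Returns the graph too, so layers can be chained.
(l::TwoStreamConv)((A, X)) = (A, l.σ.(l.weight * X * norm_adj(A)))

l = TwoStreamConv(randn(4, 3), tanh)
A = [0 1 0; 1 0 1; 0 1 0]   # 3-node path graph
X = randn(3, 3)              # 3 features per node
A2, H = l((A, X))
size(H)  # (4, 3): output features × nodes, for whichever A we pass in
```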


yuehhua commented Mar 27, 2020

Is this what you want? #31
If so, I will test it quickly and merge.


yuehhua commented Mar 28, 2020

I merged #31. You can try it on the master branch.

@rkurchin

So this doesn't work for a few reasons.

  1. The bias has the wrong dimensions: it needs to be a vector of dimensions (# features, 1) that is added to every feature column; otherwise there's a dimension mismatch if you try to train it as a matrix. The action of the layer should then look something like the following (I had it accept a tuple so that it can return one too, and then you can chain several of these layers together if you want to):
(l::CGCNConv)(input::Tuple{Array{Bool,2},SparseMatrixCSC{Float32,Int64}}) = l.σ.(l.weight * input[1] * normalized_laplacian(input[2]+I, Float32) + hcat([l.bias for i in 1:size(input[2], 1)]...)), input[2]

except that I've been having type-inference problems with that hcat, which I haven't figured out how to fix yet.

  2. The constructor should probably also be rewritten so that it doesn't need a graph to be fed in. In the version I was attempting, mine looks more like:
function CGCNConv(ch::Pair{<:Integer,<:Integer}, σ=softplus; init=glorot_uniform, T::DataType=Float32, bias::Bool=true)
    weight = init(ch[2], ch[1])
    b = bias ? init(ch[2], 1) : zeros(T, ch[2], 1)
    CGCNConv(weight, b, σ)
end

But with my current solution (even if I turn off the bias), I'm unable to get the model to train; I think there's still some kind of type-inference issue that I haven't been able to sort out. I made a simple example with the application I've been working on, and I'm attaching the files here if you're curious.

The simple_graphcon.jl file in here is the one you want to run; it will import a small subset of the data I'm working with, which I've packaged up in JLD files. I've demonstrated the behaviors I can't figure out at the end of the file, with some comments for elaboration and instructions to reproduce the hcat issue as well.

CGCNN_MWE.zip
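One workaround for the hcat issue above (a suggestion of mine, not something from the thread) is to store the bias as a plain vector and rely on broadcasting: `W * X .+ b` replicates `b` across the node columns without ever materializing the hcat'ed matrix, which sidesteps the type-inference trouble. A runnable sketch with made-up dimensions:

```julia
# Assumed toy dimensions: 3 input features, 4 output features, 5 nodes.
σ = tanh
weight = randn(Float32, 4, 3)
bias   = randn(Float32, 4)        # one entry per output feature
H      = randn(Float32, 3, 5)     # input features × nodes

# Broadcasting adds `bias` to every column of the (4 × 5) product,
# equivalent to hcat-ing 5 copies of the bias but cheaper and type-stable.
out = σ.(weight * H .+ bias)
size(out)  # (4, 5)
```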


rkurchin commented Apr 1, 2020

Okay, Dhairya fixed it; I needed to add the following:

using LinearAlgebra, SparseArrays
using Zygote: @adjoint, @nograd

@adjoint function SparseMatrixCSC{T,N}(arr) where {T,N}
  SparseMatrixCSC{T,N}(arr), Δ -> (collect(Δ),)
end
@nograd LinearAlgebra.diagm

@adjoint function Broadcast.broadcasted(Float32, a::SparseMatrixCSC{T,N}) where {T,N}
  Float32.(a), Δ -> (nothing, T.(Δ), )
end
@nograd issymmetric


yuehhua commented Apr 11, 2020

I separated the input graph from the layer in #34. The input graph and features are kept in a new type called FeaturedGraph, and the layers accept a FeaturedGraph as input.
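The idea behind this design can be shown with a small stand-in type. Everything below is an illustrative sketch with assumed names (`SimpleFeaturedGraph`, `apply_layer`), not GeometricFlux's actual API: the point is just that graph structure and node features travel together in one value, so the layer itself stays graph-free.

```julia
# Bundle the graph and its node features into a single value.
struct SimpleFeaturedGraph{A<:AbstractMatrix,F<:AbstractMatrix}
    graph::A     # adjacency matrix, n × n
    feature::F   # node features, f × n
end

fg = SimpleFeaturedGraph([0 1; 1 0], rand(3, 2))

# A layer reads both pieces from the one input and returns the same
# wrapper, so calls can be chained like ordinary Flux layers.
apply_layer(W, fg::SimpleFeaturedGraph) =
    SimpleFeaturedGraph(fg.graph, W * fg.feature * fg.graph)

out = apply_layer(rand(4, 3), fg)
size(out.feature)  # (4, 2): new features, same 2-node graph
```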

@rkurchin

ohhh this makes a ton of sense – will try this out in the next few days, thanks a lot!
