Inductive learning experiments in your paper #14
Question:
Hey! Great work. I was analyzing your paper and couldn't figure out whether you had reported results from inductive learning experiments. Are the Table III results from inductive experiments?
Best.

Answer:
The performance reported in Table III corresponds to the transductive setting; we did not report results under the inductive setting in the paper. However, we provide the source code of MMGL under the inductive setting, and in practice we are glad to see that its performance there is also very strong, comparable to the transductive results in Table III.
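For readers unfamiliar with the distinction, here is a minimal DGL sketch of the two evaluation settings. The graph, feature names, masks, and split sizes are hypothetical placeholders, not MMGL's actual data pipeline:

```python
import dgl
import torch

g = dgl.rand_graph(1000, 5000)                 # toy stand-in for the MMGL graph
g.ndata['feat'] = torch.randn(1000, 16)

train_mask = torch.zeros(1000, dtype=torch.bool)
train_mask[:800] = True                        # hypothetical 80/20 split

# Transductive: train on the full graph. Test nodes participate in
# message passing during training; only their labels are held out.
transductive_train_g = g

# Inductive: test nodes and all their incident edges are removed from
# the training graph, so the model never sees them before evaluation.
inductive_train_g = dgl.node_subgraph(g, train_mask)
```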
Follow-up question:
Hey, I was wondering how the edge weights computed by your method are used by the MultiLayerNeighborSampler when it builds the computation graph for each node in a batch. I am not familiar with MultiLayerNeighborSampler, but from what you say in the paper, it performs neighbor sampling as proposed in "Inductive Representation Learning on Large Graphs" (the GraphSAGE paper, https://arxiv.org/abs/1706.02216). Are the edge weights taken into account when building the computation graphs?
Answer:
I'm sorry for not getting back to you sooner! To my knowledge, the edge weights do not influence the sampling process of NeighborSampler unless we set its sampling probability to the edge weights. In this work, the edge weights only participate in message passing.
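A minimal sketch of this behavior in DGL (the graph, feature names, and fanouts below are illustrative assumptions, not MMGL's actual configuration): by default MultiLayerNeighborSampler draws neighbors uniformly and returns one block per GNN layer, and the edge weights only enter when messages are computed, unless `prob` is pointed at the weight feature.

```python
import dgl
import dgl.function as fn
import torch

g = dgl.rand_graph(1000, 5000)                     # toy stand-in for the MMGL graph
g.edata['w'] = torch.rand(g.num_edges(), 1)        # placeholder learned edge weights
g.ndata['feat'] = torch.randn(g.num_nodes(), 16)

# Default: neighbors are sampled uniformly; the 'w' feature is ignored here.
sampler = dgl.dataloading.MultiLayerNeighborSampler([10, 10])

# Optional weight-aware sampling: pass the edge-feature name as `prob`,
# making sampling probabilities proportional to the edge weights.
# sampler = dgl.dataloading.MultiLayerNeighborSampler([10, 10], prob='w')

loader = dgl.dataloading.DataLoader(
    g, torch.arange(g.num_nodes()), sampler,
    batch_size=64, shuffle=True)

for input_nodes, output_nodes, blocks in loader:
    h = g.ndata['feat'][input_nodes]
    for block in blocks:                           # one block per GNN layer
        block.srcdata['h'] = h
        # Edge weights enter only here: each message is the source
        # feature scaled by the weight of the edge it travels along.
        block.update_all(fn.u_mul_e('h', 'w', 'm'), fn.mean('m', 'h'))
        h = block.dstdata['h']
    break                                          # one batch is enough for the sketch
```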