
Added Personalized PageRank Diffusion [ppr_diffusion function] #427

Open · wants to merge 7 commits into master
Conversation

rbSparky (Contributor) commented Mar 18, 2024

Continues issue #412; takes inspiration from the DGL implementation.

This PR introduces a new function, ppr_diffusion. It calculates Personalized PageRank (PPR) diffusion based on the edge weight matrix and updates the graph with new edge weights derived from the PPR matrix.
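As a usage illustration (the exact exported signature is not pinned down in this thread, so the keyword name `alpha` and the graph construction below are assumptions):

```julia
using GraphNeuralNetworks

# Small weighted graph in COO form: 4 nodes, 4 directed edges.
s = [1, 2, 3, 4]
t = [2, 3, 4, 1]
w = Float32[0.5, 1.0, 0.7, 0.2]
g = GNNGraph((s, t, w))

# Hypothetical call: returns a graph with the same edges whose weights are
# the corresponding entries of the PPR diffusion matrix.
g_ppr = ppr_diffusion(g; alpha = 0.85f0)
```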

Implementation:

The function operates in several key steps; a consolidated sketch follows the list:

  • Construct Modified Adjacency Matrix (A): Utilizes existing edge weights (w) to build a matrix $A$ representing the graph's structure, adjusted by $(α - 1)$, where $α$ is the damping factor. This adjustment incorporates both direct connections and global graph structure.
A[dst, src] += w[idx]
A .*= (alpha_f32 - 1)
  • Incorporate Identity Matrix (I): Adds the identity matrix to A to account for self-transitions, forming $I + (α - 1) \times A$.
A[i, i] += 1
  • Apply PPR Formula: Calculates the PPR diffusion matrix as $α \times (I + (α - 1) \times A)^{-1}$, which effectively models the probability of transitioning from one node to another, considering random jumps governed by α.
PPR = alpha_f32 * inv(A)
  • Update Edge Weights: Assigns new weights to each edge based on the corresponding values in the PPR matrix, effectively updating the graph's edge weights to reflect the calculated PPR diffusion.
new_w = [PPR[dst, src] for (src, dst) in zip(s, t)]
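
Putting the steps above together, here is a minimal dense-matrix sketch (illustrative only: `edge_index` and `get_edge_weight` are the library's standard accessors, but the function body and how the result graph is rebuilt may differ from the PR's actual code):

```julia
using GraphNeuralNetworks, LinearAlgebra

# Sketch of ppr_diffusion following the steps listed above (dense, CPU).
function ppr_diffusion_sketch(g::GNNGraph; alpha = 0.85f0)
    s, t = edge_index(g)
    w = get_edge_weight(g)
    w === nothing && (w = ones(Float32, g.num_edges))
    alpha_f32 = Float32(alpha)

    # Step 1: weighted adjacency, scaled by (α - 1).
    n = g.num_nodes
    A = zeros(Float32, n, n)
    for (idx, (src, dst)) in enumerate(zip(s, t))
        A[dst, src] += w[idx]
    end
    A .*= (alpha_f32 - 1)

    # Step 2: add the identity, forming I + (α - 1)A.
    for i in 1:n
        A[i, i] += 1
    end

    # Step 3: PPR diffusion matrix α (I + (α - 1)A)^(-1).
    PPR = alpha_f32 * inv(A)

    # Step 4: read the new weight for each existing edge off the PPR matrix.
    new_w = [PPR[dst, src] for (src, dst) in zip(s, t)]
    return GNNGraph((s, t, new_w); num_nodes = g.num_nodes)  # the real code would also carry over ndata/edata
end
```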

rbSparky changed the title from "Added Personalized PageRank (PPR) Diffusion [ppr_diffusion function] for Edge Weight Adjustment" to "Added Personalized PageRank Diffusion [ppr_diffusion function]" on Mar 18, 2024
rbSparky (Contributor, Author) commented Mar 19, 2024

Tests fail for

GATv2Conv: Test Failed at /home/runner/work/GraphNeuralNetworks.jl/GraphNeuralNetworks.jl/test/test_utils.jl:203

This should be checked, since it did not fail in the last commit and the only change in this commit was adding the reference paper.

rbSparky (Contributor, Author) commented:

Tests pass now, so the previous test failure was unrelated to this PR.

That should also be checked later.

uses SparseArrays
rbSparky (Contributor, Author) commented May 9, 2024

Would this be GPU compatible now that the mutating operations have been removed? Tested training, and Zygote no longer gives the single-element operation error!
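
For illustration, one mutation-free way to assemble the system matrix (a sketch of the idea behind avoiding in-place writes, not necessarily the code in this PR; `sparse` comes from SparseArrays):

```julia
using SparseArrays, LinearAlgebra

# Build α (I + (α - 1)A)^(-1) without mutating any array, which is the kind of
# formulation Zygote can differentiate through.
function ppr_matrix(s, t, w, num_nodes; alpha = 0.85f0)
    A = sparse(t, s, w, num_nodes, num_nodes)   # A[dst, src] = w, duplicate edges are summed
    M = I + (alpha - 1) * A                     # I + (α - 1)A, no in-place updates
    return alpha * inv(Matrix(M))               # dense inverse of the system matrix
end
```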

CarloLucibello (Owner) commented:

Let's have an implementation that works correctly with dense matrices on CPU; in later PRs we can add GPU support and the sparse-matrix performance optimization. So we are mostly fine with the current PR.

rbSparky (Contributor, Author) commented:

> Let's have an implementation that works correctly with dense matrices on CPU; in later PRs we can add GPU support and the sparse-matrix performance optimization. So we are mostly fine with the current PR.

Should I revert the code to the earlier commits where only dense matrices were used, or is it OK as it is?
