📑 Awesome Graph PDE

A collection of resources on partial differential equations, graph neural networks, deep learning, and dynamical system simulation.

We also roughly group the resources into the categories listed under "Contents" below. Note that this list is a work in progress and relies on contributions: feel free to suggest additions via Issues or Pull Requests!

🗂 Contents

  • Graph Neural Network Models

  • Physics System Simulation

📖 Theory

Graph Neural Network Models

Graph Element Networks: adaptive, structured computation and memory. ICML19.

link Arxiv ICML19 Github

Traditional applications of GNNs assume an a priori notion of entity (such as bodies, links or particles) and match every node in the graph to an entity. We propose to apply GNNs to the problem of modeling transformations of functions defined on continuous spaces, using a structure we call graph element networks (GENs). Inspired by finite element methods, we use graph neural networks to mesh a continuous space and define an iterative computation that propagates information from some sampled input values in the space to an output function defined everywhere in the space.
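
The GEN idea of meshing a continuous space and propagating sampled inputs to an everywhere-defined output can be sketched in a few lines. This is an illustrative numpy toy, not the authors' code: the nearest-neighbour "mesh", the averaging update, and the inverse-distance read-out all stand in for learned components.

```python
import numpy as np

rng = np.random.default_rng(0)
nodes = rng.uniform(0.0, 1.0, size=(16, 2))      # node positions in [0,1]^2

# Connect each node to its 3 nearest neighbours (a crude "mesh").
d = np.linalg.norm(nodes[:, None] - nodes[None, :], axis=-1)
nbrs = np.argsort(d, axis=1)[:, 1:4]

# Latent state: seed a few nodes with observed input values, zeros elsewhere.
h = np.zeros(16)
h[[0, 5, 9]] = [1.0, 2.0, 3.0]

# Iterative propagation: each node mixes its state with the neighbour mean
# (a stand-in for a learned message-passing update).
for _ in range(10):
    h = 0.5 * h + 0.5 * h[nbrs].mean(axis=1)

def read_out(query):
    """Evaluate the latent function at an arbitrary point via
    inverse-distance weighting over all node states."""
    w = 1.0 / (np.linalg.norm(nodes - query, axis=1) + 1e-6)
    return float(np.dot(w, h) / w.sum())

print(read_out(np.array([0.5, 0.5])))  # defined everywhere, not only at nodes
```

The key structural point survives even in this toy: inputs live at a few sampled locations, but the output function can be queried anywhere in the space.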


Deep Graph Infomax. ICLR19.

link Arxiv ICLR19 Github

We present Deep Graph Infomax (DGI), a general approach for learning node representations within graph-structured data in an unsupervised manner. DGI relies on maximizing mutual information between patch representations and corresponding high-level summaries of graphs, both derived using established graph convolutional network architectures.


Multipole Graph Neural Operator for Parametric Partial Differential Equations. NeurIPS20.

link Arxiv NeurIPS20 Github

Inspired by the fast multipole method (FMM), we propose a novel hierarchical, multi-scale graph structure which, when deployed with GNNs, captures global properties of the PDE solution operator with linear time complexity.
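
The FMM-style trick of replacing a dense all-to-all interaction with a local part plus a coarse-level part can be illustrated as follows. This is a hypothetical sketch (cluster assignment, features, and the "far field" rule are all made up for illustration), but it shows why global information can touch every node in O(n) rather than O(n^2).

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_coarse = 64, 8
x = rng.normal(size=n)                      # fine-level node features
cluster = np.arange(n) % n_coarse           # assign nodes to coarse cells

# Restrict: average fine features into each coarse cell.
coarse = np.zeros(n_coarse)
np.add.at(coarse, cluster, x)
coarse /= n / n_coarse

# Coarse-level "far field": every cell interacts with the global cell mean.
far = coarse - coarse.mean()

# Prolongate back to the fine level and combine with a cheap local term
# (here: the node's own feature).
y = x + far[cluster]

print(y.shape)
```

Stacking several such restrict/interact/prolongate levels gives the V-cycle-like hierarchy that lets the operator capture long-range dependencies cheaply.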


Neural Operator: Graph Kernel Network for Partial Differential Equations. ICLR20.

link Arxiv ICLR20 Github

We introduce the concept of Neural Operator and instantiate it through graph kernel networks, a novel deep neural network method to learn the mapping between infinite dimensional spaces of functions defined on bounded open subsets of R^d. Unlike existing methods, our approach is demonstrably able to learn the mapping between function spaces, and is invariant to different approximations and grids.
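
The core layer can be read as a Monte-Carlo approximation of a kernel integral operator over sampled points, (Kv)(x_i) ≈ (1/N) Σ_j κ(x_i, x_j) v(x_j). The sketch below uses a fixed Gaussian kernel as a stand-in for the learned kernel network; everything else (point count, domain) is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(size=(32, 1))        # sampled points in the domain [0,1]
v = np.sin(2 * np.pi * pts[:, 0])      # input function values at the points

def kernel(xi, xj, length=0.2):
    """Fixed Gaussian kernel standing in for the learned network kappa."""
    return np.exp(-((xi - xj) ** 2) / (2 * length**2))

K = kernel(pts[:, 0][:, None], pts[:, 0][None, :])
u = K @ v / len(pts)                   # kernel-averaged feature, per point

print(u.shape)
```

Because the layer is defined on functions rather than on a fixed grid, refining the sampling of the same input function leaves the operator output approximately unchanged, which is the grid-invariance property the abstract emphasizes.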


Continuous Graph Flow. ICLR20

link Arxiv ICLR20 Github

In this paper, we introduce a new class of models, Continuous Graph Flow (CGF): a graph generative model based on continuous normalizing flows that generalizes the message passing mechanism in GNNs to continuous time. Specifically, to model continuous-time dynamics of the graph variables, we adopt a neural ordinary differential equation (ODE) formulation.
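
Message passing in continuous time means treating node states as an ODE, dh/dt = f(h, messages), and integrating it. A minimal hedged sketch, with explicit Euler in place of a neural ODE solver and a fixed tanh update in place of the learned function f:

```python
import numpy as np

adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)   # toy 3-node graph
h = np.array([1.0, -1.0, 0.5])             # initial node states

def dhdt(h):
    # neighbour messages minus a self-decay term (illustrative, not learned)
    return np.tanh(adj @ h - h)

t, dt = 0.0, 0.01
while t < 1.0:                             # integrate from t=0 to t=1
    h = h + dt * dhdt(h)
    t += dt

print(h)  # node states at continuous time t=1
```

A discrete GNN applies a fixed number of layers; here "depth" becomes the integration time, and any adaptive ODE solver can replace the Euler loop.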


GRAND: Graph Neural Diffusion. ICML21

link Arxiv ICML21 Github

We present Graph Neural Diffusion (GRAND), which approaches deep learning on graphs as a continuous diffusion process and treats Graph Neural Networks (GNNs) as discretisations of an underlying PDE. In our model, the layer structure and topology correspond to the discretisation choices of temporal and spatial operators. Our approach allows a principled development of a broad new class of GNNs that are able to address the common plights of graph learning models such as depth, oversmoothing, and bottlenecks. Key to the success of our models is stability with respect to perturbations in the data, and this is addressed for both implicit and explicit discretisation schemes. We develop linear and nonlinear versions of GRAND, which achieve competitive results on many standard graph benchmarks.
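
The "GNN layers as PDE discretisation" view can be made concrete with the graph heat equation dX/dt = (Â − I)X, where Â is a row-normalised adjacency (attention-weighted in the paper). Each explicit-Euler step with step size τ plays the role of one layer, so depth becomes integration time. An illustrative sketch, not the paper's code:

```python
import numpy as np

adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)      # 4-node path graph
A_hat = adj / adj.sum(axis=1, keepdims=True)     # row-normalised adjacency

X = np.array([[1.0], [0.0], [0.0], [0.0]])       # one feature channel, a spike
tau = 0.5                                        # Euler step size ("layer")

for _ in range(40):                              # 40 layers = integrate to t=20
    X = X + tau * (A_hat @ X - X)

print(X.ravel())  # diffusion smooths the spike toward a constant vector
```

Run long enough, explicit diffusion converges to a constant signal over the graph, which is exactly the oversmoothing phenomenon the PDE view makes easy to analyse and counteract.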


Beltrami Flow and Neural Diffusion on Graphs. NeurIPS21

link Arxiv NeurIPS21 Github

We propose a novel class of graph neural networks based on the discretised Beltrami flow, a non-Euclidean diffusion PDE. In our model, node features are supplemented with positional encodings derived from the graph topology and jointly evolved by the Beltrami flow, producing simultaneously continuous feature learning and topology evolution. The resulting model generalises many popular graph neural networks and achieves state-of-the-art results on several benchmarks.


Physics System Simulation

Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction. ICML20.

link Arxiv ICML20 Github

In this paper, we explore a hybrid approach that combines the benefits of (graph) neural networks for fast predictions, with the physical realism of an industry-grade CFD simulator.
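
The hybrid pattern, a differentiable coarse solver plus a learned correction on top, can be sketched generically. Everything here is a stand-in: the "solver" is explicit 1-D diffusion on a coarse periodic grid, and a fixed random linear map plays the role of the graph network correction.

```python
import numpy as np

rng = np.random.default_rng(4)
u = rng.uniform(size=16)                    # field on a 1-D coarse grid

def coarse_solver(u, steps=10, nu=0.25):
    """Explicit diffusion with periodic boundaries: the cheap,
    physically-grounded (and differentiable) part of the pipeline."""
    for _ in range(steps):
        u = u + nu * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
    return u

W = 0.01 * rng.normal(size=(16, 16))        # stand-in for the learned GNN

# Physics prediction plus learned residual correction.
pred = coarse_solver(u) + coarse_solver(u) @ W

print(pred.shape)
```

Because the solver is differentiable, gradients flow through it during training, so the learned part only has to correct what the coarse physics misses rather than learn the dynamics from scratch.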


Learning to Simulate Complex Physics with Graph Networks. ICML20.

link Arxiv ICML20 Github

We present a powerful machine learning framework for learning to simulate complex systems from data, called "Graph Network-based Simulators" (GNS). Our framework imposes strong inductive biases, where rich physical states are represented by graphs of interacting particles, and complex dynamics are approximated by learned message passing among nodes.
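
One GNS-style rollout step has three parts: build a graph from particles within a connectivity radius, compute per-particle accelerations from neighbour messages, and integrate. In the toy below the hand-written repulsive "message" stands in for the learned network, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
pos = rng.uniform(size=(20, 2))     # particle positions
vel = np.zeros((20, 2))             # particle velocities
radius, dt = 0.3, 0.01

def step(pos, vel):
    diff = pos[:, None] - pos[None, :]                 # pairwise offsets
    dist = np.linalg.norm(diff, axis=-1)
    edge = (dist < radius) & (dist > 0)                # radius graph
    # stand-in for the learned message: soft repulsion along each edge
    msg = np.where(edge[..., None], diff / (dist[..., None] ** 2 + 1e-3), 0.0)
    acc = msg.sum(axis=1)                              # aggregate per particle
    vel = vel + dt * acc                               # semi-implicit Euler
    pos = pos + dt * vel
    return pos, vel

for _ in range(5):                                     # short rollout
    pos, vel = step(pos, vel)
print(pos.shape, vel.shape)
```

The inductive bias is visible even here: the same message function is shared across all particle pairs, so the learned simulator generalises to particle counts and configurations unseen in training.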


Learning Mesh-Based Simulation with Graph Networks. ICLR21.

link Arxiv ICLR21 Github

We introduce a method for predicting dynamics of physical systems, which capitalizes on the advantages of adaptive mesh representations. Our method works by encoding the simulation state into a graph, and performing computations in two separate spaces: the mesh-space, spanned by the simulation mesh, and the Euclidean world-space in which the simulation manifold is embedded. By passing messages in mesh-space, we can approximate differential operators that underpin the internal dynamics of most physical systems.
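
The two-space idea can be shown with a tiny example (positions and thresholds are made up): each node carries a mesh-space coordinate u fixed by the mesh topology and a world-space position x, and world edges are recomputed from spatial proximity so the model can see contacts the mesh does not encode.

```python
import numpy as np

u = np.array([[0.0], [1.0], [2.0], [3.0]])    # mesh-space coords (a line)
x = np.array([[0.0], [1.0], [1.05], [3.0]])   # world positions (folded: 1 and 2 nearly touch)

mesh_edges = [(0, 1), (1, 2), (2, 3)]         # fixed by mesh topology
world_edges = [(i, j) for i in range(4) for j in range(4)
               if i != j and abs(x[i, 0] - x[j, 0]) < 0.2]

# Edge features in each space: relative displacement sender -> receiver.
mesh_feats = [u[j] - u[i] for i, j in mesh_edges]
world_feats = [x[j] - x[i] for i, j in world_edges]

print(sorted(world_edges))  # proximity recovers the (1, 2) contact pair
```

Messages over mesh edges approximate differential operators on the simulation manifold, while messages over world edges handle effects like collision and contact that only exist in the embedding space.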


Learning continuous-time PDEs from sparse data with graph neural networks. ICLR21.

link Arxiv ICLR21 Github

In this paper we propose to learn a free-form, continuous-time, a priori fully unknown PDE model F from sparse data measured at arbitrary timepoints and locations of the coordinate domain Ω with graph neural networks (GNNs).


Learning the Dynamics of Physical Systems from Sparse Observations with Finite Element Networks. ICLR22

link Arxiv ICLR22 Github

Assuming that the observed system follows an unknown partial differential equation, this work derives a continuous-time model for the system dynamics from the finite element method. The resulting finite element networks forecast the behaviour of physical systems from sparse observations at arbitrarily distributed locations.

📬 Feedback

If you have ideas or any other suggestions, feel free to raise them via Issues or Pull Requests.

📜 License

awesome-graph-pde
