Learning in infinite dimension with neural operators.
This repository is the official implementation of the paper "Convolutional Neural Operators for robust and accurate learning of PDEs".
A library for the Koopman Neural Operator with PyTorch.
No need to train, he's a smooth operator
Datasets and code for results presented in the BOON paper
A multiphase multiphysics dataset and benchmarks for scientific machine learning
Efficient, Accurate, and Streamlined Training of Physics-Informed Neural Networks
Codomain attention neural operator for single to multi-physics PDE adaptation.
Official implementation of the NeurIPS 2023 spotlight paper ♾️InfGCN♾️.
Neural Operators with Applications to the Helmholtz Equation
Automatic Functional Differentiation in JAX
This repository contains the code for the paper "Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation".
Official implementation of Scalable Transformer for PDE surrogate modelling
The first global synthetic dataset for physics-ML seismic wavefield modeling and full-waveform inversion
Code for ICML 24 paper "Implicit Representations via Operator Learning"
Positron energy loss in the Milky Way using operator learning.
Implementations of neural-operator papers in PyTorch for easier use, achieving SOTA in PDE prediction.
Using FNO to learn the elasticity model of composite materials.
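Several of the repositories above build on the Fourier Neural Operator (FNO). As a minimal sketch (not taken from any repository listed here, with random stand-ins for learned weights), the core FNO spectral convolution can be illustrated in NumPy:

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """One FNO-style spectral layer: FFT -> keep the lowest n_modes
    frequencies -> multiply by learned complex weights -> inverse FFT."""
    u_hat = np.fft.rfft(u)                         # complex spectrum, len n//2+1
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights  # truncate and weight modes
    return np.fft.irfft(out_hat, n=len(u))         # back to physical space

# Hypothetical usage on a 1-D signal; in a real FNO the weights are trained.
rng = np.random.default_rng(0)
n, n_modes = 64, 8
u = np.sin(2 * np.pi * np.arange(n) / n)
weights = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)
v = spectral_conv_1d(u, weights, n_modes)
print(v.shape)  # (64,)
```

Because the layer acts on Fourier modes rather than grid points, the same learned weights can be applied to inputs sampled at any resolution, which is what makes these models operators between function spaces rather than fixed-size networks.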