DeepXDE

DeepXDE is a library for scientific machine learning and physics-informed learning. Its algorithms include physics-informed neural networks (PINNs) for solving forward and inverse problems of differential equations, and deep operator networks (DeepONets) for learning operators.

DeepXDE supports five tensor libraries as backends: TensorFlow 1.x (tensorflow.compat.v1 in TensorFlow 2.x), TensorFlow 2.x, PyTorch, JAX, and PaddlePaddle. For how to select one, see Working with different backends.
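
For example, one way to select a backend is to set the DDE_BACKEND environment variable before DeepXDE is imported. A minimal sketch, assuming the PyTorch backend is installed ("pytorch" is only an example value; the page above also describes the other selection methods)::

    import os

    # Must be set before deepxde is imported; other values include
    # "tensorflow.compat.v1", "tensorflow", "jax", and "paddle".
    os.environ["DDE_BACKEND"] = "pytorch"

    import deepxde as dde

    print(dde.backend.backend_name)  # shows which backend was loaded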

Documentation: ReadTheDocs

Features

In addition to the algorithms above, DeepXDE supports many features:

  • enables the user code to be compact, closely resembling the mathematical formulation.
  • complex domain geometries without the tyranny of mesh generation (see the sketch after this list). The primitive geometries are interval, triangle, rectangle, polygon, disk, ellipse, star-shaped, cuboid, sphere, hypercube, and hypersphere. Other geometries can be constructed with constructive solid geometry (CSG) using three boolean operations: union, difference, and intersection. DeepXDE also supports geometries represented by a point cloud.
  • 5 types of boundary conditions (BCs): Dirichlet, Neumann, Robin, periodic, and a general BC, which can be defined on an arbitrary domain or on a point set; and approximate distance functions for hard constraints.
  • different neural networks: fully connected neural network (FNN), stacked FNN, residual neural network, (spatio-temporal) multi-scale Fourier feature networks, etc.
  • many sampling methods: uniform, pseudorandom, Latin hypercube sampling, Halton sequence, Hammersley sequence, and Sobol sequence. The training points can be kept fixed during training or be resampled (adaptively) every certain number of iterations.
  • 4 function spaces: power series, Chebyshev polynomials, and Gaussian random fields (1D and 2D).
  • data-parallel training on multiple GPUs.
  • different optimizers: Adam, L-BFGS, etc.
  • conveniently save the model during training and load a trained model.
  • callbacks to monitor the internal states and statistics of the model during training: early stopping, etc.
  • uncertainty quantification using dropout.
  • float16, float32, and float64.
  • many other useful features: different (weighted) losses, learning rate schedules, metrics, etc.
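
As a minimal illustration (not one of the official demos; the PDE, geometry sizes, and hyperparameters below are arbitrary choices), the following sketch combines several of the listed features: a CSG geometry, a Dirichlet BC, a fully connected network, Adam followed by L-BFGS training with an early-stopping callback, and saving the trained model::

    import deepxde as dde
    import numpy as np

    # Geometry: a unit square with a circular hole, built by CSG difference.
    rectangle = dde.geometry.Rectangle([0, 0], [1, 1])
    disk = dde.geometry.Disk([0.5, 0.5], 0.2)
    geom = dde.geometry.CSGDifference(rectangle, disk)

    # PDE residual of the Poisson equation -u_xx - u_yy = 1.
    def pde(x, u):
        u_xx = dde.grad.hessian(u, x, i=0, j=0)
        u_yy = dde.grad.hessian(u, x, i=1, j=1)
        return -u_xx - u_yy - 1

    # Dirichlet BC: u = 0 on the whole boundary.
    bc = dde.icbc.DirichletBC(
        geom, lambda x: np.zeros((len(x), 1)), lambda x, on_boundary: on_boundary
    )

    data = dde.data.PDE(geom, pde, bc, num_domain=2000, num_boundary=200)
    net = dde.nn.FNN([2] + [50] * 3 + [1], "tanh", "Glorot uniform")
    model = dde.Model(data, net)

    # Train with Adam first, then fine-tune with L-BFGS.
    early_stopping = dde.callbacks.EarlyStopping(patience=2000)
    model.compile("adam", lr=1e-3)
    model.train(iterations=10000, callbacks=[early_stopping])
    model.compile("L-BFGS")
    losshistory, train_state = model.train()

    # Save the trained model (the checkpoint format depends on the backend).
    model.save("poisson_model")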

All the components of DeepXDE are loosely coupled, and thus DeepXDE is well-structured and highly configurable. It is easy to customize DeepXDE to meet new demands.

User guide

  • user/installation

  • demos/function
  • demos/pinn_forward
  • demos/pinn_inverse
  • demos/operator
  • user/parallel
  • user/faq

  • user/research
  • user/cite_deepxde
  • user/team

API reference

If you are looking for information on a specific function, class or method, this part of the documentation is for you.

  • modules/deepxde
  • modules/deepxde.data
  • modules/deepxde.geometry
  • modules/deepxde.gradients
  • modules/deepxde.icbc
  • modules/deepxde.nn
  • modules/deepxde.nn.jax
  • modules/deepxde.nn.paddle
  • modules/deepxde.nn.pytorch
  • modules/deepxde.nn.tensorflow
  • modules/deepxde.nn.tensorflow_compat_v1
  • modules/deepxde.optimizers
  • modules/deepxde.utils

Indices and tables

  • genindex
  • modindex
  • search