ProbDiffEq implements adaptive probabilistic numerical solvers for initial value problems.
It inherits automatic differentiation, vectorisation, and GPU capability from JAX.
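To illustrate what "inheriting from JAX" means in practice, here is a minimal sketch: a JAX-traceable vector field of the kind one passes to an IVP solver, differentiated and vectorised with plain JAX transforms. The logistic-growth function `f` is an illustrative stand-in, not part of ProbDiffEq's API.

```python
import jax
import jax.numpy as jnp

# Logistic-growth vector field dy/dt = f(y) = y * (1 - y).
# Any JAX-traceable function like this can serve as the
# right-hand side of an initial value problem.
def f(y):
    return y * (1.0 - y)

# Automatic differentiation: the Jacobian of the vector field,
# which probabilistic solvers rely on internally.
df = jax.jacfwd(f)

# Vectorisation: evaluate the vector field over a batch of states.
batch = jnp.array([0.1, 0.5, 0.9])
values = jax.vmap(f)(batch)

print(df(jnp.array(0.5)))  # d/dy [y - y^2] = 1 - 2y, i.e. 0.0 at y = 0.5
```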
Features include:
- ⚡ Calibration and step-size adaptation
- ⚡ Stable implementation of filtering, smoothing, and other estimation strategies
- ⚡ Custom information operators, dense output, and posterior sampling
- ⚡ State-space model factorisations
- ⚡ Physics-enhanced regression
- ⚡ Taylor-series estimation with and without Jets
- ⚡ Compatibility with other JAX-based libraries such as Optax or BlackJAX.
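As a taste of the Taylor-series feature above, JAX's experimental `jet` transform propagates truncated Taylor series through a function. This sketch is generic JAX, not ProbDiffEq's own interface: given the input path x(t) = x0 + t (Taylor coefficients 1, 0, 0), `jet` returns the Taylor coefficients of sin(x(t)) around t = 0.

```python
import jax.numpy as jnp
from jax.experimental import jet

# Taylor-mode differentiation: push the series x(t) = x0 + t
# through jnp.sin and read off the output Taylor coefficients.
x0 = 0.5
primal_out, series_out = jet.jet(jnp.sin, (x0,), ([1.0, 0.0, 0.0],))

# primal_out is sin(x0); series_out[0] is the first-order
# Taylor coefficient of sin(x0 + t), i.e. cos(x0).
```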
Links:
- AN EASY EXAMPLE: LINK
- EXAMPLES: LINK
- CHOOSING A SOLVER: LINK
- API DOCUMENTATION: LINK
- ISSUE TRACKER: LINK
- BENCHMARKS: LINK
Get the most recent stable version from PyPI:
pip install probdiffeq
This installation assumes that JAX is already available.
To install ProbDiffEq with jax[cpu], run
pip install probdiffeq[cpu]
WARNING: This is a research project. Expect rough edges and sudden API changes.
VERSIONING: As long as ProbDiffEq is in its initial development phase (version 0.MINOR.PATCH), version numbers increase as follows:
- Bugfixes and new features increase the PATCH version.
- Breaking changes increase the MINOR version.
See also: semantic versioning.
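Under this scheme, breaking changes bump the MINOR version, so pinning the minor version in a requirements file guards against surprises. The version numbers below are placeholders; use whichever minor version you tested against.

```
# requirements.txt -- pin the MINOR version during 0.x development;
# "0.4" is a placeholder, not a recommendation.
probdiffeq>=0.4,<0.5
```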
Start with the quickstart, continue with the Solvers & Solutions examples, and only then move on to the Parameter estimation examples and the API documentation.
The examples show how to interact with the API and explain some valuable facts about probabilistic numerical solvers. They may be more instructive than the API docs.
The advanced examples show applications of probabilistic numerical solvers, often in conjunction with external libraries. For example, this notebook shows how to combine ProbDiffEq with Optax, and this notebook does the same with BlackJAX.
If you find ProbDiffEq helpful for your research, please consider citing:
@phdthesis{kramer2024implementing,
title={Implementing probabilistic numerical solvers for differential equations},
author={Kr{\"a}mer, Peter Nicholas},
year={2024},
school={Universit{\"a}t T{\"u}bingen}
}
This thesis contains detailed information about the maths and algorithms behind what is implemented here. A PDF is available at this link.
If you use the solve-and-save-at functionality, please cite
@article{krämer2024adaptive,
title={Adaptive Probabilistic {ODE} Solvers Without Adaptive Memory Requirements},
author={Kr{\"a}mer, Nicholas},
year={2024},
eprint={2410.10530},
archivePrefix={arXiv},
url={https://arxiv.org/abs/2410.10530},
}
This article introduced the algorithm we use. The implementation is slightly different from what we would do for non-probabilistic solvers; see the paper. A PDF is available here and the paper's experiments are here.
ProbDiffEq's algorithms have been developed over many years and in multiple research papers. Linking concrete citation information for specific algorithms is a work in progress. Feel free to reach out if you need help determining which works to cite!
Contributions are welcome! Check the existing issues for a "good first issue" and consult the developer documentation.
If you have a feature that you would like to see implemented, create an issue!
ProbDiffEq curates a range of benchmarks that compares various library-internal configurations as well as other packages such as SciPy, JAX, and Diffrax. To run the benchmarks locally, install all dependencies via
pip install .[example,test]
and then either open Jupyter and go to docs/benchmarks
or execute all benchmarks via
make benchmarks-run
Be patient; it might take a while. Afterwards, open Jupyter to look at the result or build the documentation via
mkdocs serve
What do you find?
Here's how to transition from those packages: link.
Is anything missing from this list? Please open an issue or make a pull request.
- diffeqzoo: A library of example implementations of differential equations in NumPy and JAX.
- probfindiff: Probabilistic numerical finite differences in JAX.