The Aesara Project is a collection of Python projects, built around Aesara (a fork of Theano), that aid in the construction and use of domain-specific compilers for tensor-related computations.
Currently, the Aesara Project primarily serves the probabilistic modeling community through its ability to generate Probabilistic Programming Languages and to perform model-specific optimizations; its scope, however, is not limited to that. See the discussion here for an explanation of the Aesara Project's goals and history. The project currently comprises the following libraries:
- Aesara is a hackable "meta" tensor library that enables the manipulation of mathematical expressions involving multidimensional arrays;
- AePPL provides an intermediate representation for Probabilistic Programming Languages and automatically derives log-densities for probabilistic models (see the sketch after this list);
- AeHMC provides symbolic implementations of the Hamiltonian Monte Carlo (HMC) and No-U-Turn (NUTS) samplers;
- AeMCMC automatically builds custom samplers for probabilistic models and performs model-specific optimizations such as automatic Rao-Blackwellization, Bayesian conjugation, and marginalization.
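As a rough illustration of how these libraries fit together, the sketch below defines a toy model as an Aesara graph of random variables and asks AePPL for its joint log-density. It is only a sketch: the `joint_logprob` signature has varied between AePPL releases, and the variable names are illustrative.

```python
import aesara.tensor as at
from aeppl import joint_logprob

# A toy model: an unknown mean with a normal observation model,
# written as an ordinary Aesara graph of random variables.
mu_rv = at.random.normal(0.0, 10.0, name="mu")
Y_rv = at.random.normal(mu_rv, 1.0, name="Y")

# AePPL traverses the graph and derives the joint log-density symbolically.
# Recent releases return the log-density expression together with the "value"
# variables that stand in for sampled or observed values.
logprob, value_vars = joint_logprob(Y_rv, mu_rv)
```

The resulting `logprob` is itself an Aesara graph, so it can be rewritten, differentiated, and compiled like any other expression.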
In the context of statistical modeling, Aesara allows you to focus on the model itself: it provides a single, easily customizable representation of a model that encodes all the information needed to produce numerical optimizers and samplers.
In other words, users build a symbolic graph that represents the mathematical operations performed by the model. Thanks to a fully featured and extensible rewrite system, this graph can be inspected and modified at runtime without ever leaving Python. Graphs can then be transpiled to a number of supported backends (e.g. C, Numba, JAX).
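A minimal sketch of that workflow is shown below, assuming the optional Numba and JAX backends are installed; the function names are only illustrative.

```python
import aesara
import aesara.tensor as at

# Build a symbolic graph: a numerically naive softplus.
x = at.vector("x")
y = at.log(1 + at.exp(x))

# Inspect the symbolic graph before any rewrites are applied.
aesara.dprint(y)

# Compile ("transpile") the same graph to different targets.
f_c = aesara.function([x], y)                    # default C/Python backend
f_numba = aesara.function([x], y, mode="NUMBA")  # Numba backend
f_jax = aesara.function([x], y, mode="JAX")      # JAX backend

print(f_c([0.0, 1.0, 2.0]))
```

Printing a compiled function with `aesara.dprint(f_c)` shows the graph after the rewrite system has run, which makes it easy to see exactly what compilation changed.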
Aesara is modular by design: it is straightforward to add new mathematical operators, new rewrite rules (e.g. numerical stabilizations and optimizations), and even new backends.
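As a sketch of what a new rewrite rule might look like, the example below replaces `log(exp(x))` with `x`. It assumes the names used by recent Aesara releases (`node_rewriter` and `in2out` in `aesara.graph.rewriting.basic`); older versions expose the same machinery under `aesara.graph.opt`.

```python
import aesara
import aesara.tensor as at
from aesara.graph.fg import FunctionGraph
from aesara.graph.rewriting.basic import in2out, node_rewriter


# A toy node rewriter: replace log(exp(x)) with x wherever it appears.
@node_rewriter([at.log])
def local_log_exp(fgraph, node):
    (arg,) = node.inputs
    if arg.owner is not None and arg.owner.op == at.exp:
        return [arg.owner.inputs[0]]
    return None


x = at.vector("x")
y = at.log(at.exp(x))

# Wrap the expression in a FunctionGraph and apply the rewrite in place.
fg = FunctionGraph([x], [y], clone=False)
in2out(local_log_exp).rewrite(fg)

aesara.dprint(fg.outputs[0])  # the output is now just `x`
```

In practice, such a rewrite would typically be registered with one of Aesara's rewrite databases so that it runs automatically during compilation.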
Aesara opens new possibilities in Machine Learning, exemplified by the AePPL and AeMCMC libraries.
Chat with us via the Gitter link above. It's also compatible with Matrix!