21st century AD
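A minimal sketch, assuming this tagline is Zygote.jl's (the test function is illustrative):

```julia
using Zygote

f(x) = 3x^2 + 2x
gradient(f, 2.0)   # (14.0,): source-to-source reverse-mode gradient
f'(2.0)            # Zygote also defines adjoint ' for scalar functions
```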
Forward Mode Automatic Differentiation for Julia
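A minimal sketch, assuming this is ForwardDiff.jl (the function and inputs are illustrative):

```julia
using ForwardDiff

f(x) = sin(x[1]) + x[2] * x[3]             # arbitrary scalar-valued test function
ForwardDiff.gradient(f, [1.0, 2.0, 3.0])   # dual-number forward-mode gradient
ForwardDiff.derivative(sin, 1.0)           # scalar case; equals cos(1.0)
```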
Forward and reverse mode automatic differentiation primitives for Julia Base + StdLibs
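A minimal sketch, assuming this is ChainRules.jl: the rules it ships can be called directly, each returning the primal value plus a pullback closure:

```julia
using ChainRules, ChainRulesCore

y, pullback = rrule(sin, 1.0)   # reverse rule for sin at x = 1.0
_, x̄ = pullback(1.0)            # propagate a unit cotangent; x̄ == cos(1.0)
```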
Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
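A minimal sketch, assuming this is Optimization.jl; the Rosenbrock setup follows its standard tutorial, with the AD backend chosen per problem:

```julia
using Optimization, OptimizationOptimJL

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, LBFGS())   # gradient-based local solver from Optim.jl
```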
Surrogate modeling and optimization for scientific machine learning (SciML)
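A minimal sketch, assuming this is Surrogates.jl (the expensive function is illustrative):

```julia
using Surrogates

f(x) = exp(x) * x^2 + x^3            # stand-in for an expensive simulation
lb, ub = 0.0, 10.0
xs = sample(40, lb, ub, SobolSample())
ys = f.(xs)
krig = Kriging(xs, ys, lb, ub)       # fit a Kriging surrogate to the samples
krig(5.2)                            # cheap approximate evaluation
```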
AD-backend agnostic system defining custom forward and reverse mode rules. This is the lightweight core that lets you define rules for your functions in your own packages without depending on any particular AD system.
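A minimal sketch, assuming this is ChainRulesCore.jl; `mysquare` is a hypothetical user function given a custom reverse rule:

```julia
using ChainRulesCore

mysquare(x) = x^2

# Any ChainRules-aware AD backend will use this rule instead of tracing mysquare
function ChainRulesCore.rrule(::typeof(mysquare), x)
    y = mysquare(x)
    mysquare_pullback(ȳ) = (NoTangent(), 2x * ȳ)
    return y, mysquare_pullback
end
```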
Julia bindings for the Enzyme automatic differentiator
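A minimal sketch, assuming this is Enzyme.jl; `Active` marks the differentiable scalar arguments:

```julia
using Enzyme

square(x) = x * x
autodiff(Reverse, square, Active, Active(3.0))   # ((6.0,),): d/dx at x = 3
```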
Automatic Differentiation Library for Computational and Mathematical Engineering
Reverse Mode Automatic Differentiation for Julia
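A minimal sketch, assuming this is ReverseDiff.jl (the function is illustrative):

```julia
using ReverseDiff

f(x) = sum(abs2, x)                        # scalar function of a vector
ReverseDiff.gradient(f, [1.0, 2.0, 3.0])   # tape-based reverse-mode gradient
```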
Taylor polynomial expansions in one and several independent variables.
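A minimal sketch, assuming this is TaylorSeries.jl:

```julia
using TaylorSeries

t = Taylor1(Float64, 6)    # independent variable, truncated at order 6
exp(t)                     # 1 + t + t²/2 + t³/6 + ...
sin(t)^2 + cos(t)^2        # ≈ 1 up to the truncation order
```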
⟨Grassmann-Clifford-Hodge⟩ multilinear differential geometric algebra
DeepONets, (Fourier) Neural Operators, Physics-Informed Neural Operators, and more in Julia
Efficient computations with symmetric and non-symmetric tensors with support for automatic differentiation.
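A minimal sketch, assuming this is Tensors.jl; the scalar potential ψ is a hypothetical example:

```julia
using Tensors, LinearAlgebra

C = rand(SymmetricTensor{2, 3})   # random symmetric second-order tensor in 3D
ψ(C) = tr(C) + det(C)             # hypothetical scalar potential
gradient(ψ, C)                    # ∂ψ/∂C via automatic differentiation
```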
A common interface for quadrature and numerical integration for the SciML scientific machine learning organization
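A minimal sketch, assuming this is Integrals.jl; recent versions take the domain as a (lb, ub) tuple:

```julia
using Integrals

f(u, p) = sin(p * u)                        # integrand with parameter p
prob = IntegralProblem(f, (0.0, 1.0), 2.0)  # ∫₀¹ sin(2u) du
solve(prob, QuadGKJL()).u                   # ≈ (1 - cos(2)) / 2
```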
A package for binary and continuous, single and multi-material, truss and continuum, 2D and 3D topology optimization on unstructured meshes using automatic differentiation in Julia.
Julia port of the Python autograd package.
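A minimal sketch, assuming this is AutoGrad.jl, which mirrors the Python autograd API:

```julia
using AutoGrad

f(x) = sin(x) + x^2
df = grad(f)    # grad returns a new function computing the derivative
df(1.0)         # cos(1.0) + 2.0
```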
Automatic Options Hedging and Backtesting
One More Einsum for Julia! With runtime order-specification and high-level adjoints for AD
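A minimal sketch, assuming this is OMEinsum.jl and its `ein` string macro:

```julia
using OMEinsum

A, B = rand(2, 3), rand(3, 4)
ein"ij,jk -> ik"(A, B)     # matrix product written as an einsum
ein"ii -> "(rand(3, 3))    # trace: contract the repeated index
```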
ODE integration using Taylor's method, and more, in Julia
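A minimal sketch, assuming this is TaylorIntegration.jl with signature taylorinteg(f, x0, t0, tmax, order, abstol):

```julia
using TaylorIntegration

xdot(x, p, t) = x                                     # ẋ = x, exact solution eᵗ
tv, xv = taylorinteg(xdot, 1.0, 0.0, 1.0, 25, 1e-20)  # Taylor-method integration
xv[end]                                               # ≈ ℯ
```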