Updated May 9, 2024 - Julia
Forward Mode Automatic Differentiation for Julia
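Forward mode is usually built on dual numbers: each value carries its derivative alongside it, and every arithmetic operation propagates both. A minimal sketch of that idea (illustrative only, not the package's actual implementation):

```julia
# A value paired with its derivative; arithmetic propagates both.
struct Dual
    val::Float64   # primal value
    der::Float64   # tangent (derivative)
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

# Differentiate f(x) = x*sin(x) at x = 2 by seeding der = 1.
f(x) = x * sin(x)
d = f(Dual(2.0, 1.0))
# d.val is f(2); d.der is f'(2) = sin(2) + 2cos(2)
```

Because the derivative rides along with the primal computation, one forward pass yields both the function value and its directional derivative.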
Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
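The simplest gradient-based method such an interface dispatches to is plain gradient descent. A hedged sketch with a hand-coded gradient (the names below are illustrative, not the package's API):

```julia
# Minimize f(x) = (x - 3)^2, whose minimum is at x = 3.
f(x) = (x - 3.0)^2
grad_f(x) = 2.0 * (x - 3.0)   # hand-coded gradient; AD would supply this

function gradient_descent(x; eta = 0.1, steps = 200)
    for _ in 1:steps
        x -= eta * grad_f(x)   # step downhill along the gradient
    end
    x
end

xstar = gradient_descent(0.0)   # converges to ≈ 3.0
```

In a differentiable optimization interface, the gradient is obtained automatically from one of the AD backends listed here rather than written by hand.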
⟨Grassmann-Clifford-Hodge⟩ multilinear differential geometric algebra
forward and reverse mode automatic differentiation primitives for Julia Base + StdLibs
Julia bindings for the Enzyme automatic differentiator
Reverse Mode Automatic Differentiation for Julia
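Reverse mode records the forward computation and then sweeps it backwards, accumulating adjoints. A toy sketch of that tape idea (not any package's real internals; the backward sweep is invoked manually here instead of via topological ordering):

```julia
# A tracked value: primal, accumulated adjoint, and a closure that
# pushes this node's adjoint back to its parents.
mutable struct Var
    val::Float64
    grad::Float64
    back::Function
end
Var(v::Float64) = Var(v, 0.0, () -> nothing)

function Base.:*(a::Var, b::Var)
    out = Var(a.val * b.val)
    out.back = () -> (a.grad += out.grad * b.val; b.grad += out.grad * a.val)
    out
end

function Base.:+(a::Var, b::Var)
    out = Var(a.val + b.val)
    out.back = () -> (a.grad += out.grad; b.grad += out.grad)
    out
end

# z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
x, y = Var(3.0), Var(4.0)
t = x * y
z = t + x
z.grad = 1.0         # seed the output adjoint
z.back(); t.back()   # reverse sweep, in reverse order of construction
# x.grad == 5.0, y.grad == 3.0
```

One backward sweep yields the gradient with respect to every input at once, which is why reverse mode suits functions with many inputs and few outputs.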
Taylor polynomial expansions in one and several independent variables.
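The core operation behind truncated Taylor arithmetic is the Cauchy product on coefficient vectors. A hedged sketch of that idea (not the package's API):

```julia
# Represent a truncated series by its coefficients c[k] for x^(k-1);
# multiplication is the Cauchy product, truncated at the chosen order.
function mul_taylor(a::Vector{Float64}, b::Vector{Float64})
    n = length(a)
    c = zeros(n)
    for k in 1:n, j in 1:k
        c[k] += a[j] * b[k - j + 1]
    end
    c
end

# (1 + x) * (1 + x) = 1 + 2x + x^2, truncated to order 3.
p = [1.0, 1.0, 0.0, 0.0]   # coefficients of 1 + x
mul_taylor(p, p)           # → [1.0, 2.0, 1.0, 0.0]
```

Overloading elementary functions on such coefficient vectors yields Taylor expansions of whole programs, which is also how higher-order derivatives are extracted.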
Surrogate modeling and optimization for scientific machine learning (SciML)
Automatic Differentiation Library for Computational and Mathematical Engineering
AD-backend agnostic system defining custom forward and reverse mode rules. This is the lightweight core that lets you define rules for your functions in your packages without depending on any particular AD system.
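The reverse-rule pattern, sketched without depending on the package (the name `rrule_mysquare` is illustrative; the real interface dispatches on the function being differentiated): a rule returns the primal result together with a pullback that maps an output cotangent to an input cotangent.

```julia
mysquare(x) = x^2

# Custom reverse rule: primal value plus a pullback closure.
function rrule_mysquare(x)
    y = mysquare(x)
    pullback(ȳ) = 2x * ȳ   # d(x^2)/dx = 2x, scaled by the output cotangent
    return y, pullback
end

y, pb = rrule_mysquare(3.0)
# y == 9.0; pb(1.0) == 6.0
```

Because the rule lives with the function definition rather than with any one AD engine, every backend that understands the rule format can reuse it.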
DeepONets, (Fourier) Neural Operators, Physics-Informed Neural Operators, and more in Julia
A common interface for quadrature and numerical integration for the SciML scientific machine learning organization
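A minimal quadrature kernel illustrates the kind of problem such an interface abstracts over; this composite trapezoidal rule is one of many algorithms a common front end can dispatch to (illustrative code, not the package's API):

```julia
# Composite trapezoidal rule on [a, b] with n panels.
function trapezoid(f, a, b; n = 1000)
    h = (b - a) / n
    s = (f(a) + f(b)) / 2
    for i in 1:n-1
        s += f(a + i * h)
    end
    h * s
end

result = trapezoid(sin, 0.0, π)   # ∫₀^π sin(x) dx ≈ 2.0
```

A shared interface lets the same problem statement be solved by adaptive, fixed-order, or Monte Carlo integrators without rewriting the integrand.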
A package for topology optimization on unstructured meshes using automatic differentiation in Julia: binary and continuous, single- and multi-material, truss and continuum, in 2D and 3D.
One More Einsum for Julia! With runtime order-specification and high-level adjoints for AD
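What an einsum specification like `"ij,jk->ik"` means, written out as the explicit loops it compiles to (here a plain matrix product; runtime order-specification amounts to choosing which index loop runs where):

```julia
# Contract the shared index j: C[i,k] = Σ_j A[i,j] * B[j,k].
function einsum_ij_jk(A, B)
    I, J = size(A)
    J2, K = size(B)
    @assert J == J2
    C = zeros(I, K)
    for i in 1:I, k in 1:K, j in 1:J
        C[i, k] += A[i, j] * B[j, k]
    end
    C
end

A = [1.0 2.0; 3.0 4.0]
einsum_ij_jk(A, A) ≈ A * A   # true
```

Indices named in both inputs but absent from the output are summed over; the same notation scales to higher-rank tensor contractions.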
Efficient computations with symmetric and non-symmetric tensors with support for automatic differentiation.
Julia port of the Python autograd package.
Reverse-mode automatic differentiation in Julia
An interface to various automatic differentiation backends in Julia.