Source-to-Source Debuggable Derivatives in Pure Python
Updated Sep 29, 2022 - Python
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
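This entry describes first-order (linear) error propagation: independent uncertainties add in quadrature. A minimal pure-Python sketch of the idea (the class and names here are illustrative, not that library's actual API):

```python
import math

class UFloat:
    """Toy value-with-uncertainty pair; propagates errors to first order."""
    def __init__(self, value, std_dev):
        self.value = value
        self.std_dev = std_dev

    def __add__(self, other):
        # For sums, independent absolute errors add in quadrature.
        return UFloat(self.value + other.value,
                      math.hypot(self.std_dev, other.std_dev))

    def __mul__(self, other):
        # For products, independent *relative* errors add in quadrature
        # (assumes both values are nonzero).
        v = self.value * other.value
        rel = math.hypot(self.std_dev / self.value,
                         other.std_dev / other.value)
        return UFloat(v, abs(v) * rel)

x = UFloat(2.0, 0.1)
y = UFloat(3.0, 0.2)
z = x * y   # value 6.0, std_dev = 6 * hypot(0.1/2, 0.2/3) = 0.5
```

A real implementation tracks correlations between variables as well; this sketch treats all inputs as independent.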
AutoBound automatically computes upper and lower bounds on functions.
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
Drop-in autodiff for NumPy.
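Drop-in autodiff libraries like this compute exact derivatives of ordinary numeric code. The simplest mechanism to illustrate the idea is forward-mode AD with dual numbers, sketched below in pure Python (class and function names are hypothetical, not the library's API):

```python
class Dual:
    """Forward-mode AD: carry a value and its derivative ('dot') together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with dot = 1 and read the derivative off the output.
    return f(Dual(x, 1.0)).dot

print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # d/dx = 6x + 2 -> 26.0
```

Forward mode is cheap per input variable; the NumPy-oriented libraries in this list instead favor reverse mode, which computes gradients with respect to many inputs in one backward pass.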
Minimal deep learning library written from scratch in Python, using NumPy/CuPy.
A JIT compiler for hybrid quantum programs in PennyLane
Geometry processing utilities compatible with jax for autodifferentiation.
Custom Torch-style machine learning framework with automatic differentiation implemented on NumPy; allows building GANs, VAEs, etc.
Differentiable optical models as parameterised neural networks in Jax using Zodiax
JAX-DIPS is a differentiable interfacial PDE solver.
A new lightweight auto-differentiation library that builds directly on NumPy. Used as homework for CMU 11785/11685/11485.
A toy deep learning framework implemented in pure NumPy from scratch. Aka homemade PyTorch lol.
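The from-scratch frameworks above all center on the same core: reverse-mode autodiff over a recorded computation graph. A minimal sketch of that core in pure Python (illustrative only, not any listed project's actual code):

```python
class Node:
    """Minimal reverse-mode autodiff value: records how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (parent node, local gradient)

    def __add__(self, other):
        return Node(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Node(self.value * other.value,
                    ((self, other.value), (other, self.value)))

    def backward(self, grad=1.0):
        # Recursively apply the chain rule; gradients from multiple uses
        # of a node accumulate. (Real frameworks topologically sort the
        # graph instead of recursing, to visit each node once.)
        self.grad += grad
        for parent, local in self._parents:
            parent.backward(grad * local)

x, y = Node(3.0), Node(4.0)
z = x * y + x     # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

Adding broadcasting, more operators, and an optimizer loop on top of this skeleton is essentially what these NumPy/CuPy frameworks do.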
Fuzzing Automatic Differentiation in Deep-Learning Libraries (ICSE'23)
NotImplementedError: VJP of gammainc wrt argnum 0 not defined
My solutions to the assignments of dlsys course (CSE599G1: Deep Learning System Spring 2017)