Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals.
An interface to various automatic differentiation backends in Julia.
A JIT compiler for hybrid quantum programs in PennyLane
A Julia package for differentiating through expectations with Monte-Carlo estimates
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
Differentiable optical models as parameterised neural networks in Jax using Zodiax
Automatic differentiation of implicit functions
Repository for automatic differentiation backend types
Small autodiff library, with a simple working feedforward neural net built on top of it, in Haskell, from scratch with zero dependencies.
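Several entries above describe building autodiff "from scratch." As a hedged illustration of the core idea (not any listed repository's actual API), here is a minimal forward-mode automatic differentiation sketch using dual numbers, where each value carries its derivative alongside it:

```python
# Minimal forward-mode autodiff via dual numbers: a value `val`
# paired with its derivative `eps`, propagated by operator overloading.
class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.eps * other.val + self.val * other.eps)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with derivative 1.0 and read off f'(x).
    return f(Dual(x, 1.0)).eps

# Example: f(x) = x^2 + 3x, so f(2) = 10 and f'(2) = 2*2 + 3 = 7.
f = lambda x: x * x + 3 * x
```

Reverse-mode libraries (as used for neural nets) instead record a computation graph and propagate derivatives backward, but the dual-number trick above is the standard minimal starting point.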
RNN in Julia for MNIST digit recognition implemented with automatic differentiation. Over 96% accuracy.
A probabilistic programming language that combines automatic differentiation, automatic marginalization, and automatic conditioning within Monte Carlo methods.
Custom Torch-style machine learning framework with automatic differentiation implemented on NumPy; supports building GANs, VAEs, etc.
Compressible Euler equations solved with a finite-volume method implemented in JAX, plugged into an optimization loop.
[Experimental] Graph and Tensor Abstraction for Deep Learning all in Common Lisp
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
Automatic differentiation made easier for C++.