A framework for a more flexible structure of neural networks with auto-differentiation. (MATLAB; updated Feb 10, 2018)
autoD is a lightweight, flexible automatic differentiation library for Python 3, based on NumPy.
MinimalGrad is an open-source automatic differentiation library for deep learning, designed to be lightweight and efficient with a focus on simplicity and ease of use. It was developed by a team of students at Eötvös Loránd University as part of an Advanced Software Technology class.
A fast auto differentiation engine implemented in C++ 🔥
This repository contains code corresponding to my blogging site.
Automatic derivative calculation of scalar functions.
Calculates partial derivatives of an input function.
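Several of the projects above compute derivatives of scalar functions automatically. None of their actual APIs are shown here, but the core forward-mode idea they share can be sketched with dual numbers: each value carries a second component that tracks its derivative, and arithmetic propagates both via the usual calculus rules. A minimal illustration (all names hypothetical, not taken from any listed library):

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0; `dot` carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x * sin(x) at x = 2.0
x = Dual(2.0, 1.0)   # seed the input's derivative with 1
y = x * sin(x)       # y.dot is f'(2) = sin(2) + 2*cos(2)
```

Forward mode like this computes one directional derivative per pass, which is why libraries aimed at many-input functions (neural networks) use reverse mode instead.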
C++20 machine learning library with no external dependencies (nanorange used temporarily for C++20 ranges)
MicrogradPlus is an educational project aiming to provide a simple, yet extensible, NumPy-based automatic differentiation library.
Library for auto differentiation based purely on NumPy
Neural network library built with NumPy.
A deep learning framework built to help newcomers to the field learn faster.
A work-in-progress Rust library for finding multiple roots of one-dimensional transcendental equations using auto-differentiation.
A simple C++ auto-differentiation library.
MetaAutoDiff is a C++ template library for automatic differentiation in reverse mode.
Differentiable tensor renormalization group
Lagrangian mechanics implemented 3 ways: manually, with auto-diff, and symbolically.
Reverse-mode second-order automatic differentiation for Python (WIP)
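Reverse mode, mentioned by several entries above, records a computation graph on the forward pass and then sweeps it backward, accumulating adjoints with the chain rule, so all input gradients come from a single backward pass. A minimal scalar sketch in the style of the NumPy-based educational projects listed here (all names hypothetical, not any listed library's API):

```python
class Var:
    """Scalar node in a computation graph; backward() fills in .grad."""
    def __init__(self, val, parents=()):
        self.val, self.grad, self.parents = val, 0.0, parents

    def __add__(self, other):
        # Store (parent, local derivative d_out/d_parent) pairs.
        return Var(self.val + other.val, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.val * other.val,
                   ((self, other.val), (other, self.val)))

    def backward(self):
        # Topologically order the graph so each node's adjoint is
        # fully accumulated before it is propagated to its parents.
        topo, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for parent, _ in v.parents:
                    build(parent)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            for parent, local in node.parents:
                parent.grad += local * node.grad  # chain rule

x = Var(3.0)
y = Var(4.0)
z = x * y + x   # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

Second-order reverse mode, as in the WIP project above, extends this idea by differentiating the backward pass itself.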