A framework for more flexible neural network structures with auto-differentiation.
Updated Feb 10, 2018 - MATLAB
Differentiable Gaussian Process implementation for PyTorch
This repository is an attempt to create a deep learning framework that helps newcomers to the deep learning field learn faster.
A simple C++ auto-differentiation library.
C++20 machine learning library with no external dependencies (nanorange used temporarily for C++20 ranges)
Reversed mode second order automatic differentiation for python (WIP)
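Reverse-mode automatic differentiation, as several of the repositories above implement, records a computation graph on the forward pass and propagates gradients backwards through it. A minimal first-order sketch in Python (the `Var`, `sin`, and `backward` names are illustrative, not any repository's actual API):

```python
import math

class Var:
    """A node in the computation graph; parents holds (parent, local_gradient) pairs."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value, [(self, other.value), (other, self.value)])

def sin(x):
    # d/dx sin(x) = cos(x)
    return Var(math.sin(x.value), [(x, math.cos(x.value))])

def backward(out):
    # Topologically order the graph, then accumulate gradients from output to inputs.
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                visit(parent)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for parent, local in v.parents:
            parent.grad += v.grad * local

x, y = Var(2.0), Var(3.0)
z = x * y + sin(x)   # dz/dx = y + cos(x), dz/dy = x
backward(z)
```

Second-order reverse mode, which that repository targets, would additionally differentiate this backward pass itself, so the gradient accumulation must be built from the same differentiable operations.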
Differentiable tensor renormalization group
A variational quantum circuit simulator in Julia, licensed under GPLv3.
AutoDiff DAG constructor, built on numpy and Cython. A Neural Turing Machine and a Deep Q agent run on it. Clean code for educational purposes.
🎈 A C++ code generator for the automatic differentiation of tensors with linear indices. Implemented for the course Compiling Technology (Spring 2020, advised by Yun Liang) at Peking University.
MetaAutoDiff is a C++ template library for automatic differentiation in reverse mode.
Automatic derivative calculation of scalar functions.
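For scalar functions, forward-mode automatic differentiation via dual numbers is the simplest approach: carry a value and its derivative together, and overload arithmetic so the chain rule is applied automatically. A minimal sketch (the `Dual` class and `derivative` helper are hypothetical, not from any listed repository):

```python
class Dual:
    """Dual number a + b*eps with eps**2 = 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + b1*a2)*eps
        other = self._wrap(other)
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the derivative slot with 1.0 and read it back off the result.
    return f(Dual(x, 1.0)).b

f = lambda x: x * x + 3 * x   # f'(x) = 2x + 3
```

Forward mode costs one extra pass per input variable, which is why it suits scalar or few-input functions, while reverse mode suits many-input, few-output functions.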
autoD is a lightweight, flexible automatic differentiation library for Python 3, based on numpy.
A neural network library built with numpy.
This repository contains code accompanying my blog.
Tiny calculation graph library
Lagrangian mechanics implemented 3 ways: manually, with auto-diff, and symbolically.
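The "manual" route means deriving the Euler-Lagrange equation by hand and integrating the resulting ODE. A minimal sketch for a simple pendulum (the constants, function names, and RK4 integrator are illustrative choices, not the repository's code):

```python
import math

# Simple pendulum, per unit mass: L = 0.5*l^2*theta_dot^2 + g*l*cos(theta).
# The Euler-Lagrange equation, derived by hand, gives theta_ddot = -(g/l)*sin(theta).
g, l = 9.81, 1.0

def accel(theta):
    return -(g / l) * math.sin(theta)

def rk4_step(theta, omega, dt):
    # Classical 4th-order Runge-Kutta on the first-order system (theta, omega).
    def deriv(th, om):
        return om, accel(th)
    k1 = deriv(theta, omega)
    k2 = deriv(theta + 0.5 * dt * k1[0], omega + 0.5 * dt * k1[1])
    k3 = deriv(theta + 0.5 * dt * k2[0], omega + 0.5 * dt * k2[1])
    k4 = deriv(theta + dt * k3[0], omega + dt * k3[1])
    theta += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
    omega += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return theta, omega

def energy(theta, omega):
    # Total mechanical energy per unit mass; conserved by the exact dynamics.
    return 0.5 * l * l * omega * omega - g * l * math.cos(theta)

theta, omega = 0.5, 0.0
e0 = energy(theta, omega)
for _ in range(10000):
    theta, omega = rk4_step(theta, omega, 0.001)
```

The auto-diff and symbolic routes replace only the hand derivation of `accel`: the acceleration is obtained by differentiating the Lagrangian mechanically instead of on paper, while the integration step stays the same.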
Auto-differentiation library for C++