YAAD.jl

Build Status

Yet Another Automatic Differentiation package in Julia.

Installation

Press ] to enter pkg mode in the Julia REPL, then type:

pkg> add YAAD

Introduction

You may want to check my blog post about it: Implement AD with Julia in ONE day

This project aims to provide an interface similar to PyTorch's autograd while keeping things simple. The core implementation is only about 200 straightforward lines of Julia. It is highly inspired by AutoGrad.jl and PyTorch.

Every operation will directly return a CachedNode, which constructs a computation graph dynamically without using a global tape.
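For example, multiplying two Variables (as in the usage example below) already produces such a node. A minimal sketch, assuming CachedNode is reachable as YAAD.CachedNode:

using YAAD

x = Variable(rand(2, 2))
y = x * x               # returns a CachedNode instead of a plain matrix
y isa YAAD.CachedNode   # true; the node records the call and its inputs, forming the graph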

NOTE: This project is for self-use at the moment; it is a place for me to do AD-related experimental coding, and I don't guarantee consistency or stability between versions (different versions can be incompatible). For practical usage, I suggest you try Flux.Tracker or Zygote. They may have better performance and aim to be non-experimental projects.

Usage

It is simple. Mark what you want to differentiate as a Variable, which contains both a value and a grad. Each time you run backward evaluation, the gradient is accumulated into grad.

using YAAD, LinearAlgebra

x1, x2 = Variable(rand(30, 30)), Variable(rand(30, 30))
y = tr(x1 * x2)  # you get a tracked value (a CachedNode) here
backward(y)      # backward propagation
print(x1.grad)   # this is where the gradient goes
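
Since gradients are accumulated into grad rather than overwritten, a second backward pass over a freshly built graph adds to the existing values. A minimal sketch of this behavior, reusing the setup above:

using YAAD, LinearAlgebra

x1, x2 = Variable(rand(30, 30)), Variable(rand(30, 30))
backward(tr(x1 * x2))   # first pass fills x1.grad and x2.grad
g1 = copy(x1.grad)
backward(tr(x1 * x2))   # second pass accumulates into the same buffers
x1.grad ≈ 2 .* g1       # true: gradients add up instead of being reset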

Or you can register your own operators:

using YAAD
import Base: sin       # so the method below extends Base.sin rather than shadowing it
import YAAD: gradient  # so the method below extends YAAD's gradient generic

# first define how you want to create a node in the computation graph
sin(x::AbstractNode) = register(sin, x)

# then define the gradient rule
gradient(::typeof(sin), grad, output, x) = grad * cos(x)
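
With those two definitions in place, the new operator behaves like any built-in one. A minimal usage sketch, assuming Variable also accepts scalar values (not shown in the examples above):

using YAAD

x = Variable(0.5)   # assumption: Variable wraps scalars as well as arrays
y = sin(x)          # returns a CachedNode recording sin and its input
backward(y)         # propagate gradients through the graph
x.grad              # ≈ cos(0.5), computed by the gradient rule above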

License

Apache License Version 2.0