minigrad.rs

An auto-differentiation engine written in Rust. The engine implements reverse-mode automatic differentiation using a tape-based tracking mechanism: every operation records its local partial derivatives on a tape, and a reverse sweep over the tape accumulates the gradients. Much of the code is heavily inspired by Rufflewind's excellent tutorial [1].

// Record the computation z = relu(x * y) on the tape.
let t = Tape::new();
let x = t.var(0.5);
let y = t.var(4.2);
let z = (x * y).relu();
// Run the reverse pass to get the gradients of z with respect to each input.
let grad = z.grad();

// Check that the calculated value is correct
assert!((z.value - 2.1).abs() <= 1e-15);
// Assert that the gradients calculated are correct as well.
assert!((grad.wrt(&x) - y.value).abs() <= 1e-15);
assert!((grad.wrt(&y) - x.value).abs() <= 1e-15);
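
For intuition about what the tape records and how the reverse sweep works, below is a minimal, self-contained sketch of a tape-based reverse pass. The SketchTape and Node types, their field names, and the grad signature are illustrative assumptions in the spirit of Rufflewind's tutorial [1], not the crate's actual internals.

// Illustrative sketch only (hypothetical names, not the crate's real internals):
// each tape entry stores the local partial derivatives with respect to its parents.
struct Node {
    weights: [f64; 2], // d(this node's output) / d(each parent)
    deps: [usize; 2],  // tape indices of the parent nodes
}

struct SketchTape {
    nodes: Vec<Node>,
}

impl SketchTape {
    // Reverse pass: seed the output's adjoint with 1.0 and propagate it
    // backwards through the tape using the chain rule.
    fn grad(&self, output: usize) -> Vec<f64> {
        let mut adjoints = vec![0.0; self.nodes.len()];
        adjoints[output] = 1.0;
        for i in (0..self.nodes.len()).rev() {
            let node = &self.nodes[i];
            let adj = adjoints[i];
            for k in 0..2 {
                adjoints[node.deps[k]] += node.weights[k] * adj;
            }
        }
        adjoints
    }
}

fn main() {
    // Tape for z = x * y with x = 0.5 and y = 4.2. Leaf variables point at
    // themselves with zero weight, so they contribute nothing further.
    let tape = SketchTape {
        nodes: vec![
            Node { weights: [0.0, 0.0], deps: [0, 0] }, // x
            Node { weights: [0.0, 0.0], deps: [1, 1] }, // y
            Node { weights: [4.2, 0.5], deps: [0, 1] }, // z = x * y
        ],
    };
    let grads = tape.grad(2);
    assert!((grads[0] - 4.2).abs() <= 1e-15); // dz/dx = y
    assert!((grads[1] - 0.5).abs() <= 1e-15); // dz/dy = x
}

Each tape entry only stores its parents and the local partial derivatives toward them; the reverse sweep applies the chain rule once per entry, which is why a single backward pass yields the gradient with respect to every input.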

References

[1] "Reverse-mode automatic differentiation: a tutorial" by Rufflewind
[2] "A Gentle Introduction to torch.autograd" by PyTorch
[3] tiberiusferraira/Autograd-Experiments
[4] Mostafa Samir's blog
