Releases: team-boomeraang/cs107-FinalProject

Version 2.0

11 Dec 15:56
93d854e

This release contains a complete forward-mode implementation of automatic differentiation (AD), together with the boomdiff optimization library. Our AD class supports scalar and vector operations for functions of many variables. Optimization methods include gradient descent, momentum, and Adam. New to this release: tutorials for getting started with boomdiff and AD objects, an introduction to optimization with boomdiff, and tutorials for three common applications of optimization methods: linear regression, logistic regression, and neural networks.
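To give a flavor of the optimizers listed above, here is a minimal, self-contained sketch of the standard Adam update rule applied to a one-dimensional quadratic. This is illustrative only and does not use boomdiff's actual optimizer API; the function names and hyperparameter defaults are assumptions for the example.

```python
import math

def adam(grad, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Sketch of the Adam update rule; not boomdiff's actual API."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize f(x) = (x - 3)^2, with gradient 2(x - 3); iterates converge toward 3.
x_min = adam(lambda x: 2 * (x - 3.0), x0=0.0)
```

The same loop structure covers plain gradient descent (drop the moment estimates) and momentum (keep only `m`), which is how the three methods are usually related.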

Version 1.3

10 Dec 21:20
de6cd98

Version 1.3 of boomdiff adds support for multiple functions and vector operations, enabling automatic differentiation and optimization of functions of many variables.

Version 0

29 Nov 17:52
001a702

This is version 0 of our boomdiff package, supporting forward-mode automatic differentiation of scalar functions.
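Forward-mode autodiff of a scalar function can be sketched with dual numbers, which carry a value and its derivative through each arithmetic operation. This is a minimal illustration of the technique, not boomdiff's actual AD class.

```python
class Dual:
    """Carries (value, derivative) through arithmetic operations."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        o = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, other):
        o = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

# f(x) = 3x^2 + 2x, so f(2) = 16 and f'(2) = 6*2 + 2 = 14.
x = Dual(2.0, 1.0)  # seed derivative 1.0 to differentiate with respect to x
y = 3 * x * x + 2 * x
```

Evaluating `y` yields both the function value (`y.val`) and the derivative (`y.der`) in a single forward pass, which is the core idea behind forward-mode AD.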