Commit

Merge pull request #124 from JuliaDiff/ChrisRackauckas-patch-2
Remove non-maintained warning
ChrisRackauckas committed Apr 4, 2020
2 parents 2041beb + ed68c8e commit ab4c02c
Showing 1 changed file with 3 additions and 9 deletions.
12 changes: 3 additions & 9 deletions README.md
@@ -7,15 +7,9 @@
 [**See ReverseDiff Usage Examples**](https://github.com/JuliaDiff/ReverseDiff.jl/tree/master/examples)

-**Note: While ReverseDiff technically supports Julia v0.7/v1.0 and is somewhat maintained, it
-is currently not actively developed. Instead, ForwardDiff/ReverseDiff's maintainers are
-focused on the development of a new AD package built on top of [Cassette](https://github.com/jrevels/Cassette.jl).
-In the meantime, it might be worth checking out other reverse-mode AD implementations in Nabla.jl,
-AutoGrad.jl, Flux.jl, or XGrad.jl.**
-
-ReverseDiff implements methods to take **gradients**, **Jacobians**, **Hessians**, and
-higher-order derivatives of native Julia functions (or any callable object, really) using
-**reverse mode automatic differentiation (AD)**.
+ReverseDiff is a fast, compilable, tape-based **reverse mode automatic differentiation (AD)** package that
+implements methods to take **gradients**, **Jacobians**, **Hessians**, and
+higher-order derivatives of native Julia functions (or any callable object, really).

 While performance can vary depending on the functions you evaluate, the algorithms
 implemented by ReverseDiff **generally outperform non-AD algorithms in both speed and
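The "tape-based" API that the updated README description refers to can be sketched as follows. This is a minimal illustration, assuming ReverseDiff.jl is installed; the target function `f` is hypothetical, but `ReverseDiff.gradient`, `GradientTape`, `compile`, and `gradient!` are part of the package's documented API.

```julia
using ReverseDiff

# Illustrative target function; any Julia callable works.
f(x) = sum(abs2, x)          # f(x) = Σ xᵢ²

x = [1.0, 2.0, 3.0]

# One-shot gradient: records and evaluates a tape internally.
g = ReverseDiff.gradient(f, x)            # ∇f(x) = 2x

# Tape-based workflow: record the tape once, compile it, then
# reuse it for cheap repeated gradient evaluations.
tape  = ReverseDiff.GradientTape(f, similar(x))
ctape = ReverseDiff.compile(tape)
out   = similar(x)
ReverseDiff.gradient!(out, ctape, x)      # writes ∇f(x) into `out`
```

Compiling the tape is what makes repeated evaluation fast: the recorded operations are turned into a reusable program, so subsequent calls avoid re-tracing `f`.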
