Relationship to AbstractDifferentiation.jl? #8
AbstractDifferentiation aims to provide APIs for implementing common derivative functions, whereas this repo exists only to house some structs used in SciML for dispatching methods across different AD implementations. The scope of AbstractDifferentiation is far larger; this package only aims to define and export the structs and nothing more. The way these could be made to work together is if AbstractDifferentiation switched from its AD.ForwardDiffBackend() etc. to using the ADTypes here, like AutoForwardDiff().
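To make the dispatch pattern concrete, here is a minimal self-contained sketch. The struct and function names are defined locally for illustration (they mirror the spirit of ADTypes.jl's `AutoForwardDiff`/`AutoZygote`, but this is not the actual package code):

```julia
# Lightweight "AD choice" structs carry no functionality of their own;
# downstream packages dispatch on them to pick an AD implementation.
abstract type AbstractADType end
struct AutoForwardDiff <: AbstractADType end
struct AutoZygote <: AbstractADType end

# A downstream package adds one method per supported backend.
backend_name(::AutoForwardDiff) = "ForwardDiff"
backend_name(::AutoZygote) = "Zygote"

println(backend_name(AutoForwardDiff()))  # prints "ForwardDiff"
```

Because the structs are empty, a package like this one can be a near-zero-cost dependency while still giving the whole ecosystem a shared vocabulary to dispatch on.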
Just curious, since ADTypes is more recent than AbstractDifferentiation: why did SciML choose not to dispatch on the existing backend types in AbstractDifferentiation? I guess something is missing there? (I'm not involved in AbstractDifferentiation, I just wonder what to use, and how, in packages that need AD but aim to give users control over which AD to use.)
We were already using these structs before this package was created to house them, by defining them in Optimization.jl, which was started before AbstractDifferentiation.jl.
Oh, of course, sorry!
Yes, I guess type aliases also won't work, because some of the corresponding …
Well, speaking from the user side, it would be nice to have a single mechanism for selecting the AD type in an application that mixes SciML with non-SciML packages. :-) Cohesion in the ecosystem and common types to dispatch on are always nice. ADTypes.jl is extremely lightweight though and has no dependencies - I'll open an issue on AbstractDifferentiation.jl and see if there's potential to offer bi-directional conversion between the two.
@Vaibhavdixit02, I've put together a package, AutoDiffOperators, that supports both ADTypes and AbstractDifferentiation backends, but currently only for ForwardDiff, Zygote, and Enzyme. And it's all still very non-optimized. But @ChrisRackauckas mentioned that you might be interested. It only supports functions that map real vectors to scalars or real vectors (at least so far, but my applications will all be like that). It's also intended for applications that may need to do both JVP and VJP (like MGVI.jl), so it allows specifying different backends for each direction as an option (not that I have many backends in, yet ;-) ) and comes with multiplicative Jacobian operator objects.
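The "different backends per direction" idea could look something like the following sketch. All names here are hypothetical and chosen for illustration; this is not the actual AutoDiffOperators.jl API:

```julia
# Illustrative stand-ins for real backend selectors
# (e.g. AutoForwardDiff() and AutoZygote() from ADTypes.jl).
abstract type ADBackend end
struct FwdModeBackend <: ADBackend end   # efficient for JVPs
struct RevModeBackend <: ADBackend end   # efficient for VJPs

# A wrapper pairing one backend for Jacobian-vector products
# with another for vector-Jacobian products.
struct DirectionalAD{F<:ADBackend,R<:ADBackend}
    jvp_backend::F
    vjp_backend::R
end

ad = DirectionalAD(FwdModeBackend(), RevModeBackend())
```

An operator built on such a wrapper can route `J * v` through the forward-mode backend and `J' * w` through the reverse-mode one, matching each product to the mode that computes it cheaply.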
That looks like a great project. I'll follow it, and we can see if it makes sense to switch Optimization.jl over to using it down the line.
Thanks @Vaibhavdixit02 ! Also, I'm not a backend expert, so while I'll try to improve the package over time, I'll also be more than happy to include suggestions/contributions that add more backends and advanced backend options (re-using tape, etc.) to the backend extensions in AutoDiffOperators. |
How does ADTypes.jl relate to AbstractDifferentiation.jl? Are there any plans to merge the two packages eventually?