
Relationship to AbstractDifferentiation.jl? #8

Closed
oschulz opened this issue Jun 3, 2023 · 7 comments

Comments

@oschulz

oschulz commented Jun 3, 2023

How does ADTypes.jl relate to AbstractDifferentiation.jl? Are there any plans to merge the two packages eventually?

@Vaibhavdixit02
Member

AbstractDifferentiation aims to provide APIs for implementing common derivative functions, whereas this repo only houses some structs used in SciML for dispatching methods across different AD implementations. The scope of AbstractDifferentiation is far larger; this package only aims to define and export the structs and nothing more. The way these could be made to work together is if AbstractDifferentiation switched from its AD.ForwardDiffBackend() etc. to using the ADTypes here, like AutoForwardDiff().
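The pattern described above can be sketched in a few lines. This is a minimal self-contained toy, not the real ADTypes.jl package: the type names mirror ADTypes' `AutoForwardDiff`/`AutoZygote`, and `gradient_impl` is a hypothetical downstream function, but the point stands that the structs carry no behavior and exist purely so packages can dispatch on them.

```julia
# Toy stand-ins for the ADTypes.jl backend structs (names assumed from the
# discussion above; this does not load the real package).
abstract type AbstractADType end
struct AutoForwardDiff <: AbstractADType end
struct AutoZygote <: AbstractADType end

# A downstream package selects an implementation purely by dispatch on the
# backend struct; `gradient_impl` is a hypothetical example function.
gradient_impl(::AutoForwardDiff) = "forward-mode path"
gradient_impl(::AutoZygote) = "reverse-mode path"

println(gradient_impl(AutoForwardDiff()))
```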

@oschulz
Author

oschulz commented Jun 4, 2023

Just curious, since ADTypes is more recent than AbstractDifferentiation - why did SciML choose not to dispatch on the existing backend types in AbstractDifferentiation? I guess something is missing there?

(I'm not involved in AbstractDifferentiation, I just wonder what to use and how in packages that need to use AD but aim to give users control over which AD to use.)

@Vaibhavdixit02
Member

We were already using these structs, defined in Optimization.jl, before this package was created to house them. Optimization.jl was started before AbstractDifferentiation, so switching to its structs would be a breaking change with not much of an advantage.

@oschulz
Author

oschulz commented Jun 5, 2023

We were using these structs already ... before AbstractDifferentiation ...

Oh, of course, sorry!

so it would be a breaking change

Yes, I guess type aliases also won't work, because some of the corresponding ADTypes and AbstractDifferentiation types have different contents.

with not much of an advantage to switch to using its structs.

Well, speaking from the user side, it would be nice to have a single mechanism for selecting the AD type in an application that mixes SciML with non-SciML packages. :-) Cohesion in the ecosystem and common types to dispatch on are always nice.

ADTypes.jl is extremely lightweight though and has no dependencies - I'll open an issue on AbstractDifferentiation.jl and see if there's potential to offer bi-directional conversion between ADTypes.AbstractADType and AbstractDifferentiation.AbstractBackend on their side.
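A bi-directional conversion like the one proposed above could look roughly like this. This is a hypothetical sketch using toy stand-ins for both type hierarchies; the stand-in names mirror `ADTypes.AbstractADType`/`AutoForwardDiff` and `AbstractDifferentiation.AbstractBackend`/`ForwardDiffBackend`, and `convert_ad` is an assumed function name, not an API of either package.

```julia
# Toy stand-ins for the two type hierarchies (not the real packages).
abstract type AbstractADType end          # mirrors ADTypes.AbstractADType
struct AutoForwardDiff <: AbstractADType end

abstract type AbstractBackend end         # mirrors AbstractDifferentiation.AbstractBackend
struct ForwardDiffBackend <: AbstractBackend end

# Hypothetical conversion in both directions, dispatching on the target type.
convert_ad(::Type{AbstractBackend}, ::AutoForwardDiff) = ForwardDiffBackend()
convert_ad(::Type{AbstractADType}, ::ForwardDiffBackend) = AutoForwardDiff()

# Round-tripping should get back the original backend selector.
roundtrip = convert_ad(AbstractADType, convert_ad(AbstractBackend, AutoForwardDiff()))
println(roundtrip isa AutoForwardDiff)
```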

@oschulz
Author

oschulz commented Jun 26, 2023

@Vaibhavdixit02, I've put together a package, AutoDiffOperators, that supports both ADTypes and AbstractDifferentiation backends, but currently only for ForwardDiff, Zygote and Enzyme. And it's all still very unoptimized. But @ChrisRackauckas mentioned that you might be interested.

It only supports functions that map real vectors to scalars or real vectors (at least so far, but my applications will all be like that). It's also intended for applications that may need to do both JVPs and VJPs (like MGVI.jl), so it optionally allows specifying a different backend for each direction (not that I have many backends in yet ;-) ) and comes with multiplicative Jacobian operator objects.
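The "different backend per direction" idea described above can be illustrated with a toy selector struct. This is an assumed sketch, not the actual AutoDiffOperators API: the struct and field names are hypothetical, and the backend structs are stand-ins for the ADTypes ones.

```julia
# Toy stand-ins for ADTypes backend structs (not the real package).
abstract type AbstractADType end
struct AutoForwardDiff <: AbstractADType end  # a natural fit for JVPs (pushforwards)
struct AutoZygote <: AbstractADType end       # a natural fit for VJPs (pullbacks)

# Hypothetical selector pairing one backend per direction; names assumed.
struct ADSelector{Fwd<:AbstractADType,Rev<:AbstractADType}
    fwd::Fwd   # backend used for J * v
    rev::Rev   # backend used for v' * J
end

sel = ADSelector(AutoForwardDiff(), AutoZygote())
println(typeof(sel.fwd), " / ", typeof(sel.rev))
```

An operator-producing function could then dispatch on `sel.fwd` for pushforwards and `sel.rev` for pullbacks, so a forward-mode tool handles JVPs while a reverse-mode tool handles VJPs.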

@Vaibhavdixit02
Member

That looks like a great project. I'll follow it, and we can see if it makes sense to switch over to using it in Optimization.jl down the line.

@oschulz
Author

oschulz commented Jun 27, 2023

Thanks @Vaibhavdixit02 ! Also, I'm not a backend expert, so while I'll try to improve the package over time, I'll also be more than happy to include suggestions/contributions that add more backends and advanced backend options (re-using tape, etc.) to the backend extensions in AutoDiffOperators.

@oschulz oschulz closed this as completed Jun 27, 2023