Oh yeah, the issue here is precisely mixed activity. Marking it Duplicated won't compute the derivative of any in-place values: since we can't update the float in `db` in place, passing it as Duplicated gives us no way to propagate its derivative.
Active is used for immutable variables (like Float64), whereas Duplicated is used for mutable variables (like Vector{Float64}). Specifically, since Active variables are immutable, functions with Active inputs will return the adjoint of that variable. In contrast, Duplicated variables will have their derivatives `+=`'d into the shadow in place.
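To illustrate the two conventions side by side, here is a minimal sketch (the function names `square` and `sumsq` are made up for illustration):

```julia
using Enzyme

# Active: immutable scalar input, the adjoint is returned
square(x) = x^2
autodiff(Reverse, square, Active, Active(3.0))[1][1]  # 6.0

# Duplicated: mutable input, the derivative is accumulated into the shadow
sumsq(v) = sum(abs2, v)
v, dv = [1.0, 2.0], zeros(2)
autodiff(Reverse, sumsq, Active, Duplicated(v, dv))
dv  # [2.0, 4.0]
```

Note the asymmetry: the scalar's gradient comes back in the return value, while the vector's gradient lands in `dv`.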
This error indicates that you have a type, like `Tuple{Float, Vector{Float64}}`, that has both immutable and mutable components. As a result, neither Active nor Duplicated can be used for this type.
Please feel free to open a PR if you think the docs need more here (or perhaps we could issue a warning in reverse mode?).
wsmoses changed the title from "Getting the wrong derivative when differentiating w.r.t a NamedTuple" to "Mixed Activity of Float, Vector in reverse mode" on Mar 5, 2024.
Ok, excellent, I was hoping I had understood the docs. Is there a way Enzyme could potentially alert the user to this during compilation? The docs suggest I should see an error `Type T has mixed internal activity type`, which I don't see on the main branch.
On Enzyme 0.11.7 (it seems main is broken right now?), I am having some gradients drop when I try to take a derivative w.r.t. a NamedTuple.
An MWE is:
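The original snippet was not preserved in this thread; a plausible reconstruction, with a made-up `loss` function over a NamedTuple that mixes an immutable Float64 field with a mutable Vector{Float64} field, would look something like:

```julia
using Enzyme

# Hypothetical loss depending on both a scalar and a vector field
loss(p) = p.a * sum(p.x)

p  = (a = 2.0, x = [1.0, 2.0, 3.0])
dp = (a = 0.0, x = [0.0, 0.0, 0.0])

# Passing the NamedTuple as Duplicated: the vector shadow dp.x is
# accumulated in place, but the immutable Float64 field dp.a cannot
# be updated, so the scalar's gradient is silently dropped.
autodiff(Reverse, loss, Active, Duplicated(p, dp))
```

Here `dp.x` ends up with the gradient w.r.t. `x`, while `dp.a` stays at `0.0`.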
My guess is that this is because the NamedTuple has Mixed activity. If I do the following
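A workaround sketch, assuming the fix was to split the NamedTuple into separate arguments so each gets the activity matching its mutability:

```julia
using Enzyme

# Hypothetical split version: scalar and vector passed separately
loss(a, x) = a * sum(x)

a  = 2.0
x  = [1.0, 2.0, 3.0]
dx = zeros(3)

# Scalar as Active (adjoint returned), vector as Duplicated (shadow accumulated)
res = autodiff(Reverse, loss, Active, Active(a), Duplicated(x, dx))
da = res[1][1]  # adjoint of `a`; the entry for the Duplicated arg is `nothing`
```

With no mixed-activity container involved, both `da` and `dx` are populated.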
the gradients are correct.