Enzyme over ForwardDiff.jl #1331
Comments
You may be able to just mark the gradient config type as non-differentiable. Part of this issue is that the call is type-unstable, which forces Enzyme to use either active or duplicated in isolation. Maybe see if it can be type-stabilized?
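One possible reading of the suggestion above, as a hedged sketch rather than a confirmed fix: Enzyme's `EnzymeRules` interface allows declaring an entire type non-differentiable via `inactive_type`. Whether this overload is the right treatment for `ForwardDiff.GradientConfig` here is an assumption, not something established in this thread:

```julia
# Untested sketch: declare ForwardDiff's GradientConfig inactive so Enzyme
# treats it as constant data rather than a mixed-activity value.
# Assumption: inactive_type is the intended hook for this; unverified here.
using Enzyme
import ForwardDiff

Enzyme.EnzymeRules.inactive_type(::Type{<:ForwardDiff.GradientConfig}) = true
```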
On Wed, Mar 6, 2024, 8:57 AM Joe Greener wrote:
It would be useful to be able to use Enzyme reverse mode over ForwardDiff.jl. Currently the following errors on Enzyme main (5e4e2ef), ForwardDiff 0.10.36 and Julia 1.10.2:

```julia
using Enzyme
import ForwardDiff

Enzyme.API.runtimeActivity!(true)

function f(x)
    grads = ForwardDiff.gradient([x]) do xs
        xs[1]^3
    end
    grads[1]
end

f(2.0) # 12.0
grads = autodiff(Reverse, f, Active, Active(2.0))
```
```
ERROR: LoadError: AssertionError: ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1}}} has mixed internal activity types. See https://enzyme.mit.edu/julia/dev/#Mixed-Activity for more information
Stacktrace:
 [1] active_reg
   @ ~/.julia/dev/Enzyme/src/compiler.jl:516 [inlined]
 [2] active_reg
   @ ~/.julia/dev/Enzyme/src/compiler.jl:507 [inlined]
 [3] runtime_generic_augfwd(activity::Type{Val{(false, false, true, true)}}, width::Val{1}, ModifiedBetween::Val{(true, true, true, true)}, ***@***.***{1, 2, 3}}, f::typeof(ForwardDiff.gradient), df::Nothing, primal_1::var"#1#2", shadow_1_1::Nothing, primal_2::Vector{Float64}, shadow_2_1::Vector{Float64}, primal_3::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1}}}, shadow_3_1::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1}}})
   @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/rules/jitrules.jl:66
 [4] gradient
   @ ~/.julia/packages/ForwardDiff/PcZ48/src/gradient.jl:17 [inlined]
 [5] augmented_julia_gradient_2930wrap
   @ ~/.julia/packages/ForwardDiff/PcZ48/src/gradient.jl:0
 [6] macro expansion
   @ ~/.julia/dev/Enzyme/src/compiler.jl:5440 [inlined]
 [7] enzyme_call
   @ ~/.julia/dev/Enzyme/src/compiler.jl:5118 [inlined]
 [8] AugmentedForwardThunk
   @ ~/.julia/dev/Enzyme/src/compiler.jl:5011 [inlined]
 [9] runtime_generic_augfwd(activity::Type{Val{(false, false, true)}}, width::Val{1}, ModifiedBetween::Val{(true, true, true)}, ***@***.***{1, 2, 3}}, f::typeof(ForwardDiff.gradient), df::Nothing, primal_1::var"#1#2", shadow_1_1::Nothing, primal_2::Vector{Float64}, shadow_2_1::Vector{Float64})
   @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/rules/jitrules.jl:179
 [10] f
   @ ~/dms/molly_dev/enzyme_err34.jl:9 [inlined]
 [11] augmented_julia_f_1471wrap
   @ ~/dms/molly_dev/enzyme_err34.jl:0
 [12] macro expansion
   @ ~/.julia/dev/Enzyme/src/compiler.jl:5440 [inlined]
 [13] enzyme_call
   @ ~/.julia/dev/Enzyme/src/compiler.jl:5118 [inlined]
 [14] (::Enzyme.Compiler.AugmentedForwardThunk{Ptr{Nothing}, Const{typeof(f)}, Duplicated{Any}, Tuple{Active{Float64}}, Val{1}, Val{false}(), @NamedTuple{1, 2, 3, 4, 5, 6, 7}})(fn::Const{typeof(f)}, args::Active{Float64})
   @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/compiler.jl:5011
 [15] autodiff
   @ ~/.julia/dev/Enzyme/src/Enzyme.jl:185 [inlined]
 [16] autodiff(mode::ReverseMode{false, FFIABI, false}, f::typeof(f), ::Type{Active}, args::Active{Float64})
   @ Enzyme ~/.julia/dev/Enzyme/src/Enzyme.jl:287
```
The problem seems to be the default third argument to ForwardDiff.gradient (https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/gradient.jl#L16). It is defined at https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/config.jl#L97-L100. I tried and failed to get a version of it working without mixed activity types. Is there another way to get around this?
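As a hedged aside, not something verified in this thread: for the single-input reproducer above, one way to sidestep `GradientConfig` entirely is `ForwardDiff.derivative`, which takes no config argument. This is only a sketch of a possible workaround for the minimal example, not a general fix:

```julia
# Sketch: avoid GradientConfig by using ForwardDiff.derivative for the
# scalar case. Whether Enzyme differentiates through this cleanly is an
# assumption, not something confirmed here.
using Enzyme
import ForwardDiff

f(x) = ForwardDiff.derivative(y -> y^3, x)  # inner derivative: 3x^2

f(2.0) # 12.0
autodiff(Reverse, f, Active, Active(2.0))   # outer derivative of 3x^2, i.e. 6x
```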
Okay, thanks. I tried this but without much luck:

```julia
using Enzyme
import ForwardDiff

Enzyme.API.runtimeActivity!(true)

f2(xs) = xs[1]^3
f(x, config) = ForwardDiff.gradient(f2, [x], config)[1]

config = ForwardDiff.GradientConfig(f2, [2.0])
f(2.0, config) # 12.0
autodiff(Reverse, f, Active, Active(2.0), Const(config))
```
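For comparison, a hedged sketch of another route that drops ForwardDiff altogether and nests Enzyme's own forward mode inside reverse mode. The exact call pattern here is an assumption based on Enzyme's Forward/Duplicated API, not something tested in this thread:

```julia
# Sketch: replace the inner ForwardDiff.gradient with Enzyme forward mode.
# Assumption: forward-over-reverse nesting works for this toy function.
using Enzyme

inner(xs) = xs[1]^3

# A forward-mode directional derivative along [1.0] recovers d(inner)/dxs[1].
f(x) = first(autodiff(Forward, inner, Duplicated([x], [1.0])))

f(2.0)  # expected 12.0 if the sketch is right
autodiff(Reverse, f, Active, Active(2.0))
```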