
Enzyme over ForwardDiff.jl #1331

Open
jgreener64 opened this issue Mar 6, 2024 · 2 comments

Comments

@jgreener64
Contributor

It would be useful to be able to use Enzyme reverse mode over ForwardDiff.jl. Currently, the following code errors on Enzyme main (5e4e2ef), ForwardDiff 0.10.36, and Julia 1.10.2:

```julia
using Enzyme
import ForwardDiff

Enzyme.API.runtimeActivity!(true)

function f(x)
    grads = ForwardDiff.gradient([x]) do xs
        xs[1]^3
    end
    grads[1]
end

f(2.0) # 12.0

grads = autodiff(Reverse, f, Active, Active(2.0))
```
```
ERROR: LoadError: AssertionError: ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1}}} has mixed internal activity types. See https://enzyme.mit.edu/julia/dev/#Mixed-Activity for more information
Stacktrace:
  [1] active_reg
    @ ~/.julia/dev/Enzyme/src/compiler.jl:516 [inlined]
  [2] active_reg
    @ ~/.julia/dev/Enzyme/src/compiler.jl:507 [inlined]
  [3] runtime_generic_augfwd(activity::Type{Val{(false, false, true, true)}}, width::Val{1}, ModifiedBetween::Val{(true, true, true, true)}, RT::Val{@NamedTuple{1, 2, 3}}, f::typeof(ForwardDiff.gradient), df::Nothing, primal_1::var"#1#2", shadow_1_1::Nothing, primal_2::Vector{Float64}, shadow_2_1::Vector{Float64}, primal_3::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1}}}, shadow_3_1::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1}}})
    @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/rules/jitrules.jl:66
  [4] gradient
    @ ~/.julia/packages/ForwardDiff/PcZ48/src/gradient.jl:17 [inlined]
  [5] augmented_julia_gradient_2930wrap
    @ ~/.julia/packages/ForwardDiff/PcZ48/src/gradient.jl:0
  [6] macro expansion
    @ ~/.julia/dev/Enzyme/src/compiler.jl:5440 [inlined]
  [7] enzyme_call
    @ ~/.julia/dev/Enzyme/src/compiler.jl:5118 [inlined]
  [8] AugmentedForwardThunk
    @ ~/.julia/dev/Enzyme/src/compiler.jl:5011 [inlined]
  [9] runtime_generic_augfwd(activity::Type{Val{(false, false, true)}}, width::Val{1}, ModifiedBetween::Val{(true, true, true)}, RT::Val{@NamedTuple{1, 2, 3}}, f::typeof(ForwardDiff.gradient), df::Nothing, primal_1::var"#1#2", shadow_1_1::Nothing, primal_2::Vector{Float64}, shadow_2_1::Vector{Float64})
    @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/rules/jitrules.jl:179
 [10] f
    @ ~/dms/molly_dev/enzyme_err34.jl:9 [inlined]
 [11] augmented_julia_f_1471wrap
    @ ~/dms/molly_dev/enzyme_err34.jl:0
 [12] macro expansion
    @ ~/.julia/dev/Enzyme/src/compiler.jl:5440 [inlined]
 [13] enzyme_call
    @ ~/.julia/dev/Enzyme/src/compiler.jl:5118 [inlined]
 [14] (::Enzyme.Compiler.AugmentedForwardThunk{Ptr{Nothing}, Const{typeof(f)}, Duplicated{Any}, Tuple{Active{Float64}}, Val{1}, Val{false}(), @NamedTuple{1, 2, 3, 4, 5, 6, 7}})(fn::Const{typeof(f)}, args::Active{Float64})
    @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/compiler.jl:5011
 [15] autodiff
    @ ~/.julia/dev/Enzyme/src/Enzyme.jl:185 [inlined]
 [16] autodiff(mode::ReverseMode{false, FFIABI, false}, f::typeof(f), ::Type{Active}, args::Active{Float64})
    @ Enzyme ~/.julia/dev/Enzyme/src/Enzyme.jl:287
```

The problem seems to be the default third argument to ForwardDiff.gradient (https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/gradient.jl#L16), which is defined at https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/config.jl#L97-L100. I tried and failed to get a version of it working without mixed activity types. Is there another way to get around this?
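For reference, the expected value can be cross-checked without Enzyme by nesting ForwardDiff over itself (ForwardDiff handles the tag bookkeeping for nested duals internally). This is just a sanity check of the 12.0 above, not a fix for the Enzyme error:

```julia
import ForwardDiff

# Same inner computation as above, but as a scalar derivative:
# f(x) = d/dx (x^3) = 3x^2
f(x) = ForwardDiff.derivative(y -> y^3, x)

f(2.0) # 12.0

# ForwardDiff over ForwardDiff works, giving d/dx (3x^2) = 6x:
ForwardDiff.derivative(f, 2.0) # 12.0
```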

@wsmoses
Member

wsmoses commented Mar 6, 2024 via email

@jgreener64
Contributor Author

Okay thanks.

> You may be able to just mark the gradient config type as non differentiable.

Tried this, but not much luck:

```julia
using Enzyme
import ForwardDiff

Enzyme.API.runtimeActivity!(true)

f2(xs) = xs[1]^3
f(x, config) = ForwardDiff.gradient(f2, [x], config)[1]

config = ForwardDiff.GradientConfig(f2, [2.0])
f(2.0, config) # 12.0

autodiff(Reverse, f, Active, Active(2.0), Const(config))
```
```
ERROR: LoadError: AssertionError: ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}} has mixed internal activity types. See https://enzyme.mit.edu/julia/dev/#Mixed-Activity for more information
Stacktrace:
  [1] active_reg
    @ ~/.julia/dev/Enzyme/src/compiler.jl:516 [inlined]
  [2] active_reg
    @ ~/.julia/dev/Enzyme/src/compiler.jl:507 [inlined]
  [3] runtime_generic_augfwd(activity::Type{Val{(false, false, true, true, false)}}, width::Val{1}, ModifiedBetween::Val{(true, true, true, true, true)}, RT::Val{@NamedTuple{1, 2, 3}}, f::typeof(ForwardDiff.gradient), df::Nothing, primal_1::typeof(f2), shadow_1_1::Nothing, primal_2::Vector{Float64}, shadow_2_1::Vector{Float64}, primal_3::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}}, shadow_3_1::Base.RefValue{ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}}}, primal_4::Val{true}, shadow_4_1::Nothing)
    @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/rules/jitrules.jl:66
  [4] gradient
    @ ~/.julia/packages/ForwardDiff/PcZ48/src/gradient.jl:17 [inlined]
  [5] f
    @ ~/dms/molly_dev/enzyme_err34.jl:12 [inlined]
  [6] f
    @ ~/dms/molly_dev/enzyme_err34.jl:0 [inlined]
  [7] diffejulia_f_1455_inner_1wrap
    @ ~/dms/molly_dev/enzyme_err34.jl:0
  [8] macro expansion
    @ ~/.julia/dev/Enzyme/src/compiler.jl:5440 [inlined]
  [9] enzyme_call(::Val{false}, ::Ptr{Nothing}, ::Type{Enzyme.Compiler.CombinedAdjointThunk}, ::Type{Val{1}}, ::Val{false}, ::Type{Tuple{Active{Float64}, Const{ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}}}}}, ::Type{Active{Float64}}, ::Const{typeof(f)}, ::Type{Nothing}, ::Active{Float64}, ::Const{ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}}}, ::Float64)
    @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/compiler.jl:5118
 [10] (::Enzyme.Compiler.CombinedAdjointThunk{Ptr{Nothing}, Const{typeof(f)}, Active{Float64}, Tuple{Active{Float64}, Const{ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}}}}, Val{1}, Val{false}()})(::Const{typeof(f)}, ::Active{Float64}, ::Vararg{Any})
    @ Enzyme.Compiler ~/.julia/dev/Enzyme/src/compiler.jl:5000
 [11] autodiff
    @ ~/.julia/dev/Enzyme/src/Enzyme.jl:275 [inlined]
 [12] autodiff(::ReverseMode{false, FFIABI, false}, ::typeof(f), ::Type{Active}, ::Active{Float64}, ::Const{ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f2), Float64}, Float64, 1}}}})
    @ Enzyme ~/.julia/dev/Enzyme/src/Enzyme.jl:287
```
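One possible direction (a sketch only; not verified to compose with Enzyme reverse mode) is to sidestep `GradientConfig` entirely for the scalar case by seeding a `ForwardDiff.Dual` by hand, so no config object with mixed internal activity is ever captured:

```julia
import ForwardDiff
using ForwardDiff: Dual, partials

# Hypothetical workaround sketch: compute the inner forward-mode
# derivative manually instead of calling ForwardDiff.gradient,
# avoiding the GradientConfig object that triggers the assertion.
function f(x)
    d = Dual(x, one(x))   # seed with dx/dx = 1
    y = d^3               # same inner function as above, xs[1]^3
    partials(y, 1)        # extract the forward-mode derivative, 3x^2
end

f(2.0) # 12.0
```

Whether Enzyme then differentiates through the manual `Dual` arithmetic is untested here, but it at least removes the mixed-activity config from the call.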
