I went through the documentation and examples and could not find the reason why the MWE below does not work. I would expect that it is possible to pre-allocate the input and result arrays used in autodiff and then run autodiff on different values. However, on the second run the results are not updated.
using Enzyme, Test

struct Gradient
    f::Function
    bx::Vector{Float64}
    y::Vector{Float64}
    by::Vector{Float64}
end

function (G::Gradient)(x::Vector{Float64})
    Enzyme.autodiff(Reverse, G.f, Duplicated(x, G.bx), Duplicated(G.y, G.by));
    return G.bx
end

function f(x::Array{Float64}, y::Array{Float64})
    y[1] = x[1] * x[1] + x[2] * x[1]
    return nothing
end;
bx = [0.0, 0.0]
y = [0.0]
by = [1.0];
G = Gradient(f, bx, y, by)
x = [2.0, 2.0];
@test G(x) == [6.0, 2.0] # test passes
x = [3.0, 2.0];
@test G(x) == [8.0, 3.0] # test fails. Evaluated: [6.0, 2.0] == [8.0, 3.0]
You haven't reset dy = 1 and dx = 0 between invocations.
Reverse mode will += the derivative results into the shadow input, and similarly zero out the derivative of the output as it propagates. (This is required to get correct behavior inside a loop: if you were to run the same y[1] = ... assignment twice in the function being differentiated, the derivative would be the same as running it once, rather than adding the result twice.)
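Here is a minimal sketch of that fix, assuming the Gradient struct from the MWE above: re-seed both shadow buffers at the start of every call, so each invocation begins with bx = 0 and by = 1.

function (G::Gradient)(x::Vector{Float64})
    fill!(G.bx, 0.0)  # reverse mode += into the shadow input, so clear stale gradients
    fill!(G.by, 1.0)  # the output adjoint is zeroed during propagation, so re-seed it
    Enzyme.autodiff(Reverse, G.f, Duplicated(x, G.bx), Duplicated(G.y, G.by))
    return G.bx
end

With both shadows reset on every call, the second test should then evaluate to [8.0, 3.0] as expected.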