
running autodiff twice leads to bad result when result vector is preallocated #699

Closed
hiemstar opened this issue Apr 6, 2023 · 2 comments

Comments


hiemstar commented Apr 6, 2023

I went through the documentation and examples but could not find why the MWE below does not work.

I would expect it to be possible to pre-allocate the input and result arrays used by autodiff and then run autodiff on different input values. However, on the second run the results are not updated.

using Enzyme, Test

struct Gradient
    f::Function
    bx::Vector{Float64}
    y::Vector{Float64}
    by::Vector{Float64}
end

function (G::Gradient)(x::Vector{Float64})
    Enzyme.autodiff(Reverse, G.f, Duplicated(x, G.bx), Duplicated(G.y, G.by));
    return G.bx
end

function f(x::Array{Float64}, y::Array{Float64})
    y[1] = x[1] * x[1] + x[2] * x[1]
    return nothing
end;

bx = [0.0, 0.0]
y  = [0.0]
by = [1.0];

G = Gradient(f, bx, y, by)

x  = [2.0, 2.0];
@test G(x) == [6.0, 2.0] # test passes

x  = [3.0, 2.0];
@test G(x) == [8.0, 3.0] # test fails. Evaluated: [6.0, 2.0] == [8.0, 3.0]
wsmoses (Member) commented Apr 6, 2023

You haven't reset by back to 1 and bx back to 0 between invocations.

Reverse mode will += the derivative results into the input shadow, and similarly zero out the output shadow as it propagates. This is required for correct behavior inside a loop: if you ran the same y[1] = ... code twice in the same function being differentiated, you would get the same result rather than adding the result twice.
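A minimal sketch of the fix implied by this explanation, re-seeding the shadow buffers before every call (it assumes the Gradient struct from the MWE above and its single-output seed by = [1.0]):

# Sketch of the fix: reset the shadows before each autodiff call.
function (G::Gradient)(x::Vector{Float64})
    fill!(G.bx, 0.0)  # bx accumulates (+=) derivatives, so zero it first
    fill!(G.by, 1.0)  # by is zeroed during propagation, so re-seed it
    Enzyme.autodiff(Reverse, G.f, Duplicated(x, G.bx), Duplicated(G.y, G.by))
    return G.bx
end

With this reset in place, the second test above should pass as well, returning the gradient [8.0, 3.0] for x = [3.0, 2.0].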


hiemstar commented Apr 6, 2023

Got it. Thanks for the quick help!

hiemstar closed this as completed Apr 6, 2023