Use make_zero and Ref inside gradient #1356

Merged · 8 commits into EnzymeAD:main · Mar 25, 2024

Conversation

@mcabbott (Contributor) commented Mar 22, 2024

On Enzyme v0.11.20:

julia> using Enzyme, SparseArrays, StaticArrays

julia> gradient(Reverse, prod, [3.0 4.0])  # ok!
1×2 Matrix{Float64}:
 4.0  3.0

julia> gradient(Reverse, prod, @SArray [3.0 4.0])
1×2 SMatrix{1, 2, Float64, 2} with indices SOneTo(1)×SOneTo(2):
 0.0  0.0

julia> gradient(Reverse, sum, sparse([5.0 0.0 6.0]))
1×3 SparseMatrixCSC{Float64, Int64} with 0 stored entries:
  ⋅    ⋅    ⋅

julia> gradient(Reverse, abs2, 5.0)
0.0

With this PR:

julia> gradient(Reverse, prod, @SArray [3.0 4.0])
1×2 SMatrix{1, 2, Float64, 2} with indices SOneTo(1)×SOneTo(2):
 4.0  3.0

julia> gradient(Reverse, sum, sparse([5.0 0.0 6.0]))
1×3 SparseMatrixCSC{Float64, Int64} with 2 stored entries:
 1.0   ⋅   1.0

julia> gradient(Reverse, abs2, 5.0)
10.0

and also:

julia> xy = (x = [1.0, 2.0], y = [3.0, 4.0]);

julia> gradient(Reverse, z -> sum(z.x .* z.y), xy)
(x = [3.0, 4.0], y = [1.0, 2.0])

julia> xp1 = (x = [1.0, 2.0], p = 3);

julia> gradient(Reverse, z -> sum(z.x .^ z.p), xp1)
(x = [3.0, 12.0], p = 3)

julia> xp2 = (x = [1.0, 2.0], p = 3.0);

julia> gradient(Reverse, z -> sum(z.x .^ z.p), xp2)
ERROR: AssertionError: @NamedTuple{x::Vector{Float64}, p::Float64} has mixed internal activity types. See https://enzyme.mit.edu/julia/dev/#Mixed-Activity for more information

Edit: latest version changes the last example to:

julia> gradient(Reverse, z -> sum(z.x .^ z.p), xp2)
(x = [3.0, 12.0], p = 5.545177444479562)

julia> Zygote.gradient(z -> sum(z.x .^ z.p), xp2)[1]
(x = [3.0, 12.0], p = 5.545177444479562)
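
Roughly, the idea behind the title: build the shadow with make_zero, so the zeroed copy keeps the structure of x (sparsity pattern, static size, field names), and go through a Ref so that immutable inputs like SArray and NamedTuple have a mutable cell Enzyme can write into. A simplified sketch of that idea, not the exact code in this PR (it skips the mixed-activity handling shown in the Edit above; gradient_sketch is just an illustrative name):

using Enzyme

function gradient_sketch(f, x)
    if x isa Number
        # Plain scalars go through as Active; autodiff returns the derivative directly.
        return autodiff(Reverse, f, Active, Active(x))[1][1]
    else
        # make_zero builds a recursively zeroed shadow with the same structure as x.
        # Wrapping primal and shadow in Refs gives Enzyme mutable cells to accumulate
        # into, which is what makes immutable inputs (SArray, NamedTuple) work.
        xref, dxref = Ref(x), Ref(Enzyme.make_zero(x))
        autodiff(Reverse, r -> f(r[]), Active, Duplicated(xref, dxref))
        return dxref[]
    end
end

The Matrix, SMatrix, sparse, and scalar examples above all go through one of these two paths; the mixed-activity NamedTuple case from the Edit needs the extra handling in the actual PR.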

Not sure the test is in the right place.

Should probably add the sparse and SVector cases as tests too, will try to figure out where.
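
Something along these lines might do for those tests (a rough sketch; the testset name and file placement are placeholders, and the expected values are taken from the REPL output above):

using Test, Enzyme, SparseArrays, StaticArrays

@testset "gradient(Reverse) keeps input structure" begin
    # Static array: result is an SMatrix holding the prod cotangents.
    @test gradient(Reverse, prod, @SArray [3.0 4.0]) == @SArray([4.0 3.0])

    # Sparse input: only stored entries get a gradient; the structural zero stays 0.
    @test gradient(Reverse, sum, sparse([5.0 0.0 6.0])) == [1.0 0.0 1.0]

    # Plain scalar: gradient of abs2 at 5.0.
    @test gradient(Reverse, abs2, 5.0) == 10.0
end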

@codecov-commenter commented Mar 22, 2024

Codecov Report

Attention: Patch coverage is 92.85714%, with 1 line in your changes missing coverage. Please review.

Project coverage is 75.34%. Comparing base (f0c5a4e) to head (b7f83f4).

Files            Patch %   Lines
src/Enzyme.jl    91.66%    1 Missing ⚠️


Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1356      +/-   ##
==========================================
+ Coverage   75.30%   75.34%   +0.04%     
==========================================
  Files          35       35              
  Lines       10653    10648       -5     
==========================================
+ Hits         8022     8023       +1     
+ Misses       2631     2625       -6     


@gdalle (Contributor) commented Mar 25, 2024

May I give this PR a gentle bump, perhaps followed by a new release? It would be useful for the DifferentiationInterface.jl test suite with non-array objects.

@mcabbott (Contributor, Author):

Re tagging a release, I have a draft of what I suggest here: #1334 (comment)

Since gradient(Reverse, f, x) currently errors for complicated structs, any changes to what types it returns should probably be considered before tagging.

@wsmoses merged commit 6fac38b into EnzymeAD:main on Mar 25, 2024 (37 of 46 checks passed).
@mcabbott deleted the gradient_make_zero branch on March 25, 2024 at 18:54.