`destructure`'s `back` (pullback) returns unexpected size if some parameters are not used #1601
Comments
I think I am seeing a similar issue:
The second gradient should be the same as the first, a length-1 vector with a zero component, not the `nothing` object. Likely this issue is in the initialization of the cache in the pullback function (interface.jl).
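The snippet this comment refers to isn't preserved in the thread. A minimal sketch of the symptom being described (my own reconstruction, not the commenter's code; it assumes a single length-1 parameter array):

```julia
using Flux, Zygote

# One flat parameter vector of length 1 and its reconstructor.
v, re = Flux.destructure(([0.0],))

# Loss that uses the parameter: gradient is a length-1 vector, as expected.
Zygote.gradient(p -> sum(re(p)[1]), v)   # ([1.0],)

# Loss that ignores the parameter: Zygote returns `nothing`
# instead of a zero vector of the same length.
Zygote.gradient(p -> 0.0, v)             # (nothing,)
```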
With Flux's `destructure`, Zygote returns:

```julia
julia> v, re = Flux.destructure(([1,2], [3,4,5]))
([1, 2, 3, 4, 5], Flux.var"#61#63"{Tuple{Vector{Int64}, Vector{Int64}}}(([1, 2], [3, 4, 5])))

julia> Zygote.gradient(v -> sum(re(v)[1]), rand(5))
([1.0, 1.0],)

julia> Zygote.gradient(v -> sum(re(v)[2]), rand(500))
([1.0, 1.0, 1.0],)
```

The returned gradients here have lengths 2 and 3, matching only the sub-arrays each loss actually uses, rather than the length of the flat vector passed to `re` (5 and 500).
#1616: Warn on reconstruct length mismatch (r=CarloLucibello, a=ToucheSir)

Ref. #1601. This is kept as a plain warning for backwards compat, but perhaps we want to consider it a bugfix and error/depwarn instead?

### PR Checklist
- [x] Tests are added
- [ ] Entry in NEWS.md
- [ ] Documentation, if applicable
- [ ] API changes require approval from a committer (different from the author, if applicable)

Co-authored-by: Brian Chen <ToucheSir@users.noreply.github.com>
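For reference, a rough sketch of the kind of length check such a warning implies (this is not the actual Flux implementation; the function name and message below are made up for illustration):

```julia
# Hypothetical sketch, not Flux's code: warn when the flat vector handed to the
# reconstructor does not have the same total length as the original parameters.
function check_restructure_length(original_params, v::AbstractVector)
    n = sum(length, original_params)
    if length(v) != n
        @warn "Expected a vector of length $n, got one of length $(length(v))."
    end
    return v
end

# Example: the tuple from the issue has 5 parameters in total,
# so passing a length-500 vector would trigger the warning.
check_restructure_length(([1, 2], [3, 4, 5]), rand(500))
```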
This behavior can lead to a `DimensionMismatch` error in DiffEqSensitivity. I don't know if this is intended behaviour, but I feel like there should be a warning. Minimal example:
Returns:
Edit:
To be clear, I would have expected:
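The "Minimal example", "Returns", and expected-output blocks weren't captured above. Judging from the title and the `destructure` transcript earlier in the thread, the expectation was presumably a gradient of the same length as the flat input, with zeros for the unused parameters; a sketch of that (assumed output, not copied from the original post):

```julia
# Assumed expectation (not from the original post): a gradient with the same
# length as the flat input vector, with zeros in the unused positions.
Zygote.gradient(v -> sum(re(v)[1]), rand(5))
# actual:   ([1.0, 1.0],)
# expected: ([1.0, 1.0, 0.0, 0.0, 0.0],)
```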