
Fix numpy v2 breaking changes #618

Open · wants to merge 3 commits into master
Conversation

@fjosw (Collaborator) commented Apr 1, 2024

The upcoming numpy v2.0.0 release introduces a few breaking changes for autograd. This PR fixes these issues:

  • msort was removed from the numpy API. I added conditional imports. The corresponding VJPs are now only defined if the numpy version is <2.
  • np.array(value, copy=False) was removed. The same effect can now be achieved via np.asarray(value).
  • The broadcasting rules for np.linalg.solve changed.
  • np.linalg.linalg was renamed to np.linalg._linalg.
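A minimal sketch of the two simpler fixes above (illustrative names, not the PR's actual code):

```python
import numpy as np

# Version gate: numpy.msort only exists under numpy < 2, so any VJP
# registration for it must be guarded the same way.
NUMPY_LT_2 = np.lib.NumpyVersion(np.__version__) < "2.0.0"

if NUMPY_LT_2:
    from numpy import msort  # removed from the public API in numpy 2.0
    # ... register the msort VJP here, as before ...

# np.array(value, copy=False) raises in numpy 2 whenever a copy would be
# required; np.asarray keeps the old "copy only if necessary" semantics
# under both major versions.
def to_array(value):
    return np.asarray(value)
```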

One thing I did not address in this PR is the changed behavior of np.sign for complex arguments: instead of the sign of the real part, numpy now returns $z/|z|$.
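The difference is easy to demonstrate (a hedged illustration, not part of the PR):

```python
import numpy as np

z = np.complex128(3 + 4j)

if np.lib.NumpyVersion(np.__version__) < "2.0.0":
    # numpy < 2: sign of the real part (sign of the imaginary part if
    # the real part is zero), returned as a complex number.
    expected = np.complex128(np.sign(z.real))  # 1+0j here
else:
    # numpy >= 2: z / |z|, a point on the unit circle.
    expected = z / abs(z)                      # 0.6+0.8j here

assert np.isclose(np.sign(z), expected)
```

Either way the result has unit modulus for nonzero input, but the complex phase is new in numpy 2.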

On my machine all tests pass with the latest release versions (numpy==1.26.4, scipy==1.12.0) and the current release candidates (numpy==2.0.0rc1, scipy==1.13.0rc1), except for tests/test_systematic.py::test_sign, which fails under numpy v2 with

tests/numpy_utils.py:37: in unary_ufunc_check
    check([comp, matc])
autograd/test_util.py:77: in _combo_check
    _check_grads(fun)(*_args, **dict(_kwargs))
autograd/wrap_util.py:20: in nary_f
    return unary_operator(unary_f, x, *nary_op_args, **nary_op_kwargs)
autograd/test_util.py:56: in check_grads
    check_jvp(f, x)
autograd/test_util.py:43: in check_jvp
    check_equivalent(jvp(x_v)[1], jvp_numeric(x_v))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

x = array(0.+0.j), y = np.complex128(-0.2141399934041388+1.070699966299049j)

because of the aforementioned changes to the behavior of the sign function.

@j-towns (Collaborator) commented Apr 2, 2024

Thanks for this @fjosw. I will try to take a look at some point in the next couple of weeks.

@agriyakhetarpal (Collaborator) commented

Hi there, @j-towns! I saw #620 and I am aware of the announcement that autograd will no longer be maintained. Still, this PR would be a great service to autograd's dependents and to the community (8.2k dependent repositories by GitHub's numbers alone), especially given the scale of the NumPy v2 release. If anything can be done as a one-off effort/release (v1.6.3), even after the project's newly announced status, it would be great to hear from you. NumPy v2 is scheduled for release on June 16, 2024.

@j-towns (Collaborator) commented Jun 19, 2024

Hi @agriyakhetarpal, I emailed you with a proposal, have you seen it?

@agriyakhetarpal (Collaborator) commented

Hi @j-towns, thanks for sending it out – just responded there. It went to my spam folder, sadly :( I could have responded much earlier otherwise.

@j-towns (Collaborator) commented Jul 6, 2024

@agriyakhetarpal I emailed you again (11 days ago), did you see that? I have also now emailed @fjosw with a similar proposal.

@fjosw (Collaborator, Author) commented Jul 7, 2024

Thanks @j-towns, I just sent you a reply.

@agriyakhetarpal (Collaborator) commented

Thank you for the ping, @j-towns – just responded!

@fjosw (Collaborator, Author) commented Jul 26, 2024

I looked a bit into the changes related to the complex sign function today. It is my understanding that the test fails because the numerical approximation of the jvp via finite differences does not work when the sign function is defined as $\mathrm{sgn}(z)=z/|z|$. For this reason I disabled the complex checks for the sign function.
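Why finite differencing breaks down there can be sketched as follows (a toy check, not autograd's actual test harness): the failing check perturbs at the origin, where $\mathrm{sgn}(z)=z/|z|$ is discontinuous, so a central-difference estimate grows like $1/\varepsilon$ instead of converging to a derivative.

```python
import numpy as np

def numeric_jvp(f, z0, v, eps):
    # Central-difference directional derivative of f at z0 along v.
    return (f(z0 + eps * v) - f(z0 - eps * v)) / (2 * eps)

v = np.complex128(0.3 + 0.4j)
z0 = np.complex128(0)

# Shrinking eps makes the estimate larger, not more accurate: across a
# discontinuity the central difference scales like 1/eps.
est_coarse = numeric_jvp(np.sign, z0, v, eps=1e-4)
est_fine = numeric_jvp(np.sign, z0, v, eps=1e-8)
assert abs(est_fine) > abs(est_coarse)
```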

However, I think autograd still does the correct thing under the new definition of the sign function. I verified explicitly that autograd.holomorphic_grad and jax.grad with holomorphic=True give the same results when differentiating functions containing np.sign (jax also uses $\mathrm{sgn}(z)=z/|z|$).
