ENH: Add an op corresponding to scipy.integrate.quad #290

Open
jessegrabowski opened this issue May 10, 2023 · 0 comments

This issue is based on a conversation I had with @ricardoV94 about whether it would be possible to add an integration Op. Based on that conversation, a Quad Op with gradients should be possible to implement in PyTensor and JAX by directly wrapping scipy.integrate.quad (disclaimer: I am still not clear on the nomenclature around vjp, jvp, push-forward, pull-back, gradient, etc., but everything needed seems to be in this thread).
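To make the gradient claim concrete, here is a minimal sketch (not PyTensor Op code, just plain scipy) of the rule such an Op's grad would implement: differentiating under the integral sign (the Leibniz rule), so the derivative of the integral with respect to a parameter is just another quad call on the integrand's parameter derivative. The example integral and function names are mine, chosen for illustration:

```python
# Sketch: gradient of a quadrature result w.r.t. a parameter via the
# Leibniz rule, d/dtheta ∫ f(x, theta) dx = ∫ df/dtheta dx.
import numpy as np
from scipy.integrate import quad

def integral(theta):
    # I(theta) = ∫_0^1 exp(theta * x) dx
    val, _ = quad(lambda x: np.exp(theta * x), 0.0, 1.0)
    return val

def integral_grad(theta):
    # dI/dtheta = ∫_0^1 x * exp(theta * x) dx  (Leibniz rule)
    val, _ = quad(lambda x: x * np.exp(theta * x), 0.0, 1.0)
    return val

theta = 0.5
# Analytic checks: I(theta) = (exp(theta) - 1) / theta and
# dI/dtheta = (exp(theta) * (theta - 1) + 1) / theta**2
assert abs(integral(theta) - (np.exp(theta) - 1) / theta) < 1e-10
assert abs(integral_grad(theta) - (np.exp(theta) * (theta - 1) + 1) / theta**2) < 1e-10
```

An actual Op would return the second quadrature (contracted with the output cotangent) from its `grad`/vjp, which is why the whole thing stays expressible as nested quad calls.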

Numba will be trickier, as usual, because of Numba's spotty coverage of scipy. SciPy uses QUADPACK, written in Fortran, to do the actual computation. I'm fairly sure this can be overloaded, but it would take a bit of tinkering.
