Request for feedback: Unifying on JAX backend for future pyhf development #2470
Replies: 2 comments 1 reply
-
Hi @matthewfeickert, thanks for bringing this discussion up! From our experience in zfit, I would also tend towards JAX; it's our planned backend for "zfit2", for all the reasons that you mentioned. A single backend makes it possible to max out the capabilities of that backend. Is dropping NumPy also needed, though? I think it's a good backup backend that should be easy to maintain and keep compatible: since it doesn't use any kind of magic, everything can be easily emulated in pure Python. The price is that differentiation would have to be done numerically, but that can easily be added. The advantage is that pyhf could be easier to install without JAX (I remember Jim mentioned he had difficulties in the past with both JAX and TensorFlow, though that was a while ago) and it could serve as a simple cross-check. Maybe it can be a "let's try it and drop it if it becomes a burden"? With the array API, I have the feeling that these ecosystems may come closer together over time. Overall, though, I welcome the change for pyhf: from my own impression of pyhf's internals, I second the expectation of possible gains in speed and internal improvements. (Besides, with some planned refactoring on the pyhf side and zfit2, we can also revisit how to better interoperate and what belongs where between the fitting packages.)
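To illustrate the point above that a pure-NumPy backend stays differentiable at the price of doing it numerically, here is a minimal sketch using a toy single-bin Poisson negative log-likelihood. The model and the `numerical_grad` helper are illustrative assumptions, not pyhf's actual likelihood or API:

```python
import numpy as np

def poisson_nll(mu, n_obs, s, b):
    """Toy single-bin counting-experiment NLL (up to a constant):
    expected rate lam = mu * s + b. Not pyhf's real model."""
    lam = mu * s + b
    return lam - n_obs * np.log(lam)

def numerical_grad(f, x, eps=1e-6):
    """Central finite-difference derivative: the kind of fallback a
    plain-NumPy backend could provide without autodiff."""
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

mu, n_obs, s, b = 1.0, 55.0, 10.0, 50.0
g_num = numerical_grad(lambda m: poisson_nll(m, n_obs, s, b), mu)
# Analytic derivative for comparison: s - n_obs * s / (mu * s + b)
g_exact = s - n_obs * s / (mu * s + b)
print(abs(g_num - g_exact) < 1e-4)
```

The accuracy of central differences is typically fine for cross-checks, which supports the idea of keeping NumPy around as a verification backend.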
-
Thank you for reaching out and considering our opinions about the future direction of pyhf. I have discussed this topic with @dvandyk and @nikoladze and we have collected our opinions here. Overall, we believe transitioning to JAX is a positive step, and we support the decision to drop the TensorFlow and PyTorch backends in favour of JAX. However, we do have reservations about completely dropping support for the NumPy backend. While we understand the rationale behind this proposed change, as it could streamline development and improve performance, the ability to work with plain NumPy arrays has been one of the appealing aspects of pyhf, particularly for experimenting with the code. Additionally, it's worth noting that transitioning exclusively to a JAX backend may exclude existing codebases built on pyhf that rely on the NumPy backend and may not be autodiffable. In conclusion, while we support the proposed shift towards JAX and the benefits it offers, we advocate for maintaining support for the NumPy backend rather than discontinuing it entirely.
-
Following the 2023 pyhf Users and Developers Workshop we (the dev team) have been thinking hard about how to improve the future of pyhf. Given the current landscape of scientific Python open source tools, the community use of the existing pyhf backends, and the choices of other tools in the broader Scikit-HEP, PyHEP, and Scientific Python ecosystems, we are strongly considering unifying on JAX as a computational backend. This would mean that in a future release pyhf would drop support for the NumPy, TensorFlow, and PyTorch backends and become a JAX-based library moving forward. The primary motivation for unifying on JAX as a backend is that it would allow us to refactor the pyhf internals for increased performance and differentiability, enable more customization of the modifiers, and let us separate the inference code from the workspace construction. Automatic differentiation would also become a default.
Before we start work on what would be a drastic design change to the internals we want to solicit your feedback as users and contributors. Our goal is for the user experience to improve, with more flexible models, faster fits, and small public API/UX changes. Please respond with your thoughts and input, both positive and negative.