remove pin_run_as_build #44
Conversation
Hi! This is the friendly automated conda-forge-linting service. I just wanted to let you know that I linted all conda-recipes in your PR (…)
You told me that …
You also don't want run_exports in numpy for exactly the same reason. You don't always want to impose its pin on things that don't need to be pinned.
PS: numpy should essentially never be in the build section. It may be in the host section. Both of these are valid solutions for numpy: either (run_exports in numpy, or pin_run_as_build for numpy plus ignore_run_exports where it need not be pinned) or (no global pinning for numpy, with explicit pin_compatible in recipes that need it). We do the latter in defaults.
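As a hedged sketch of the second option (no global pinning, explicit pin_compatible in recipes that need it), a recipe that uses the numpy C API might look roughly like this; the package name and versions here are hypothetical, not from this thread:

```yaml
# meta.yaml (illustrative sketch; "mypkg" is a hypothetical package)
package:
  name: mypkg
  version: 1.0.0

requirements:
  host:
    - python
    - numpy            # numpy belongs in host, not build, per the note above
  run:
    - python
    - {{ pin_compatible('numpy') }}   # run constraint derived from the numpy used at build time
```

Recipes that only use numpy's Python interface would simply list `numpy` in run with no pin.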
I think …
I think, if I understand your rationale, you want to let accidental pins happen instead of not pinning when it is not needed, right? If so, I understand the logic, but we currently have docs that tell users how to pin numpy. Wouldn't it be a matter of updating those docs?
Thanks. I'm still not used to the cb3 naming. 😄
Exactly. We are planning to do this for boost, right? Boost need not be pinned when using the header-only parts. (This is what defaults does for boost.)
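For the header-only case, a consuming recipe can opt out of an upstream run_exports pin with ignore_run_exports; a minimal sketch, assuming a package named `boost` that ships a run_exports entry:

```yaml
# meta.yaml fragment (sketch): drop the run pin when only boost headers are used
build:
  ignore_run_exports:
    - boost            # assumed package name; no ABI linkage, so no run pin needed

requirements:
  host:
    - boost            # headers consumed at build time only
```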
We make mistakes in the reviews. For example, the brand new package https://github.com/conda-forge/pyprism-feedstock/blob/master/recipe/meta.yaml …
We already expect users to do something like … If you choose the former option here, you are imposing a large cost on us, since all of our recipes already use the latter option. If you want to make that switch, I would ask that you help submit PRs to all of our recipes that use the latter option.
Sorry Mike, I didn't follow that last part; which option are you preferring?
Explicit pinning in recipes: not using pin_run_as_build or run_exports in cases where there is dual use for a given package. {{ pin_compatible('numpy') }} is roughly equivalent to the older …
Indeed we do, but like @msarahan said, we chose that kind of error in lieu of the high CI cost that …
So, Mike, is that opening the door to NumPy matrix builds again, or does it behave differently with cb3?
Recipes not in conda-forge, but in defaults are
No, I'm not suggesting we go to …
I did not mean to say you want us to go there, but that is what would happen in that case if I understand it correctly. |
No. What would happen is that if the package was built with numpy 1.14, then it will only be usable with numpy >=1.14. If you want it usable with numpy 1.11, fix the recipe if the package uses the C API, or add …
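Under that scheme, building against the oldest numpy you want to support is what widens the resulting constraint. A minimal sketch of the variant config, assuming you target 1.11 (the version here is just illustrative):

```yaml
# conda_build_config.yaml (sketch): build against numpy 1.11 so that
# {{ pin_compatible('numpy') }} renders a run constraint of roughly >=1.11
numpy:
  - 1.11
```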
No, what I meant with …
TL;DR what @msarahan is proposing maps to our current pinning in https://conda-forge.org/docs/meta.html#building-against-numpyv then, right?
Yes. It replaces
with meta.yaml:
and conda_build_config.yaml:
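The snippets referenced above did not survive; based on the linked docs, a hedged sketch of the general shape of the new style might be (all names and versions here are assumptions, not the original snippets):

```yaml
# meta.yaml (sketch): the new cb3-style pinning
requirements:
  host:
    - numpy                              # resolved per variant at build time
  run:
    - {{ pin_compatible('numpy') }}      # replaces the older "numpy x.x" style

# conda_build_config.yaml (sketch): the numpy versions to build variants against
# numpy:
#   - 1.11
#   - 1.14
```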
What happens is exactly the same in both proposals. The difference is what happens by default: pin by default, or don't pin by default. @msarahan's proposal special-cases numpy by making it not pinned by default, whereas others like openblas, zlib, etc. are pinned by default.
No, it is not pinned by default. Your examples are poor; boost is the only comparable package. Sometimes you want a pin, sometimes you don't. These packages are special, and a special case is appropriate.
Could I ask that we add this as an agenda item for the next meeting (if we haven't already)?
It seems to me the crux of this is that the C-level libs need different pinning than the Python modules, so why not use split packages?
Even though that makes sense in the packaging world, we would face huge resistance from the Python folks, because they want to declare the dependencies as close as possible to their …
This would be ideal; unfortunately, NumPy does not offer a method to easily split itself into the C and Python parts. For better or worse, NumPy (and the …

My feeling is that there are far more packages that depend on the NumPy Python API and therefore do not need any pinning. Therefore an unpinned numpy should be the default. That said, when the C ABI is used and NumPy is not pinned, bad things can happen, and so I can understand the desire to pin by default.
@jjhelmus good summary! Let's pause this discussion here and move this to the meeting. At the end of the day the choices are:
The pros and cons are:
@isuruf and others, please edit my comment here at your leisure to add/remove whatever you want. We can take this message as a topic for the meeting. @msarahan edited to add the downside that pinning by default will require changes to Anaconda's recipes. This need not be a blocking change; it can be done over time. However, having both styles in recipes may confuse people.
I cannot see this as a valid complaint. Split packages are implementation details. No one should really care so long as the top-level, user-facing package isn't something they strongly dislike, e.g. "numerical-python" instead of "numpy". I believe not doing split packages in this case adds technical debt to our ecosystem.
@mingwandroid would that require people to add …
Moving to using …
Splitting … Back to @ocefpaf's point though, I think we have a lot to mull over before our next meeting, when we can dig into this a bit more.
Some statistics (list of feedstocks might be a few hours old):

Recipes with numpy: 667

Other pinnings: except for the first line, the others are broken if they use the numpy C API, because even with the …
Incorrect pins (start with …
These are either abandoned or they have not had a new release in ages. Not sure it is worth fixing those.
Some of these may be worth fixing.
(I should fix …
I'll fix up …
Not all of these are incorrect. If a package does not use the NumPy C API but depends on a feature or bug fix from a particular NumPy version, then having a lower bound on the version is perfectly reasonable. Such a requirement is typically not needed in the build section, although there are odd …
You mean the host section here, I think?
Add script to check proposed changes to repodata
It is not good to do this for numpy here. Better to do it at the recipe level, not the variant level, as documented at https://conda.io/docs/user-guide/tasks/build-packages/variants.html#pinning-at-the-recipe-level
The reason is that when numpy is used purely for its Python interface, not its C interface, we don't want to pin it in run depends at all. If we define the pin_run_as_build value, rather than using pin_compatible, it will always have a run constraint, even when we don't want it (i.e. with numpy's Python interface only).
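Concretely, the variant-level entry being argued against would look roughly like this (a sketch of the documented pin_run_as_build syntax); with such an entry in place, every recipe with numpy in host gets a run pin, whether or not it touches the C interface:

```yaml
# conda_build_config.yaml (sketch): the variant-level pin to avoid here.
# pin_run_as_build turns the host numpy dep into a run constraint for every
# consumer, even recipes that only use numpy's Python interface.
pin_run_as_build:
  numpy:
    max_pin: x.x
```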