
pypy next steps #867

Closed
39 of 41 tasks
ericdill opened this issue Sep 18, 2019 · 68 comments

@ericdill
Member

ericdill commented Sep 18, 2019

TODO:

Then, start building packages

@mattip
Contributor

mattip commented Oct 18, 2019

Hi. I believe this is a follow-up to conda-forge/conda-forge-pinning-feedstock#225, which proposed (as far as I understand it; I may be wrong):

  • modify recipe/conda_build_config.yaml to add pypy tags (see the sketch after this list)
  • fix any breakage for parsing the longer tag
  • add required pypy versions somewhere
  • rebuild all packages with this tag.
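
For context, a rough sketch of what such entries in recipe/conda_build_config.yaml might look like is below. The key names (python, python_impl) and the tag format are illustrative assumptions here, not the final conda-forge pinning; zip_keys ties each python entry to its implementation.

# filename: recipe/conda_build_config.yaml (hypothetical sketch)
python:
  - 3.6.* *_cpython    # existing CPython builds keep working
  - 3.6.* *_73_pypy    # new builds tagged with the PyPy ABI
python_impl:
  - cpython
  - pypy
zip_keys:
  -
    - python
    - python_impl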

@CJ-Wright
Member

I think there are some things on the migration/bot side that need to be done (not that this is blocking the other steps except for the rebuild).

  • A way to have the bot update the conda_forge_config.yaml
  • A way to scope migrations (in case we don't want all the packages to be rebuilt)

Aside from these, the migration will look like the arch rebuild.

@mattip
Contributor

mattip commented Nov 2, 2019

Once the Python 3.8 rollout activity dies down, I would love to move this issue forward.

@mattip
Contributor

mattip commented Nov 18, 2019

@CJ-Wright @ocefpaf ping

@CJ-Wright
Member

@mariusvniekerk do you think it better to enhance the current cfep9 migration spec in the bot or make a bespoke migrator class?

@CJ-Wright
Member

@mattip do you have a list of packages that you want to build this out for? You don't need to list everything; we can get the needed deps for the packages you want to rebuild.

@mattip
Contributor

mattip commented Nov 18, 2019

numpy, scipy, pandas, matplotlib would be a great start

@mbargull
Member

Say yes / no to the spec, point out potential pitfalls

Anyone got a link to the mentioned "spec"?

@mattip
Contributor

mattip commented Nov 18, 2019

Perhaps the reference is to this meeting's minutes which reference this plan

@CJ-Wright
Member

Also: dask, streamz, and xonsh. Are there issues with the library search order and noarch support?

@msarahan
Member

I put together this slightly more fleshed-out plan: https://github.com/msarahan/pypy-python-structure

Compared to the earlier plan, it is much less disruptive in terms of hotfixes. However, renaming the python package is rather dramatic (though completely reasonable in my opinion) and this probably needs a lot of community review.

I hope it's enough to help you push forward.

@mattip
Contributor

mattip commented Nov 20, 2019

I commented there. How does this affect the user experience? Would a user ever get into a situation where they have two interpreters in the same conda environment?

@isuruf
Member

isuruf commented Nov 20, 2019

A couple of questions.

  1. Can we get rid of _python_impl (which enforces that two interpreters are not in the same environment) and use run_exports to do the same? This might reduce confusion, as it results in only 2 packages instead of 3.
  2. Do we want to hotfix existing packages to enforce cpython, or allow existing packages to work with cpyext?

@mattip
Contributor

mattip commented Nov 20, 2019

allow existing packages to work with cpyext

I am not sure what this means. Pure Python packages will work as-is. C-extension packages will need to be rebuilt with the relevant PyPy interpreter each time PyPy releases a new major.minor version, since C-extension modules built with CPython cannot be loaded into PyPy and vice versa. Those changes should happen less frequently now that cpyext is beginning to stabilize. Note that PyPy versions have two components: the Python side, which is compatible with CPython and reflected in sys.version_info, and the cpyext ABI side, which is reflected in sys.pypy_version_info. PyPy's last release was pypy_version_info == 7.2.0 with version_info 2.7.13 and 3.6.9 (matching CPython 2.7.13 and CPython 3.6.9). The next release, which should happen by the end of the year, will increment pypy_version_info to 7.3.0.

@isuruf
Member

isuruf commented Nov 20, 2019

Ah, I must have read about cpyext wrong. Thanks for the info.

@mbargull
Member

@mattip, thanks for the link to the Google Doc.
@msarahan, thanks for the refined version. That one is much more in line with what I had in mind (cpython and pypy).

I commented there.

xref: msarahan/pypy-python-structure@7b71aed#r36041947 (just so that those comments don't get lost)


When I have further questions, I'll post them at https://github.com/msarahan/pypy-python-structure/issues

@msarahan
Member

Can we get rid of _python_impl (which enforces that two interpreters are not in the same environment) and use run_exports to do the same? This might reduce confusion, as it results in only 2 packages instead of 3.

I don't think so, unfortunately. There must be a metapackage above to give users the choice of implementations and to stand in for the existing references to "python". That can't serve as the mutex, because then you'd have a cyclical dependency. So you also need a layer below that converges on one name (that's the mutex part). If you see a better way, I'm all for it, but I think 3 packages is the minimum.
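
To make the layering concrete, here is a hypothetical sketch of the three names involved (package names and specs are illustrative only, not the scheme that was eventually adopted):

# Hypothetical meta.yaml fragment for the top-level metapackage
package:
  name: python          # what users and existing recipes request
  version: "3.6"
requirements:
  run:
    - cpython 3.6.*            # the actual interpreter package (or pypy3.6 instead)
    - _python_impl * cpython   # the mutex layer every implementation converges on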

@jjhelmus
Contributor

Compared to the earlier plan, it is much less disruptive in terms of hotfixes

If I understand @msarahan's plan correctly, all the existing python packages would need to be hotfixed to add a dependency on a cpython mutex metapackage. I believe that when this is deployed, all existing conda environments containing python would suddenly become inconsistent and the solver would be free to install just about anything. I believe that adding the cpython metapackage as a constrains entry instead would accomplish much of the same behavior without creating inconsistent environments.

Similarly, adding the cpython metapackage to the constrains of packages with compiled extensions would prevent them from co-existing with pypy, assuming pypy required a conflicting metapackage.

Along these lines, I think it is possible to retain the name of the python package while allowing a lower-priority python metapackage which requires pypy. The only hotfixing needed would be adding constrains entries. I'll put together a proposal.

@jjhelmus
Contributor

Noticed that @mbargull made the same observation about inconsistent environments in msarahan/pypy-python-structure#3.

@mbargull
Member

Yes, adding hard dependencies via hotfixing is too disruptive, especially when we are talking about one of the most common packages, python.
Generally, I would prefer a solution that wouldn't require changing the metadata of nearly every single python-dependent package. But if we want to keep the current dependency scheme, I agree that adding constrains on the Python implementation to the python-dependent packages seems like the best working compromise.
Depending on the naming/topology of the Python implementation packages, that constraint would either need to be preclusive, e.g., "constrains": [ "pypy <0a0" ], or make things more explicit, e.g., "constrains": [ "python_impl=*=*_cpython" ].
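
For reference, the recipe-level analogue of such a repodata "constrains" entry is run_constrained in meta.yaml (the hotfix itself edits repodata directly; this fragment is only an illustrative sketch):

# Hypothetical meta.yaml fragment showing the preclusive variant
requirements:
  run:
    - python
  run_constrained:
    - pypy <0a0    # no real version satisfies <0a0, so pypy is excluded from the environment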

@isuruf
Member

isuruf commented Feb 2, 2020

Here's a less intrusive plan,

  1. Add "constrains": [ "pypy <0a0" ] to all existing python packages and python-dependent packages.
  2. Change the build strings of new python packages to end with _cpython. (Keep the old python packages as is.)
  3. Add a run_exports of python * *_cpython to the python packages.
  4. Add a python 2.7 *_7_1_pypy package which has a run_exports of python * *_7_1_pypy and depends on the pypy package.

With this, no existing recipes need to change.
noarch: python packages will also depend on the cpython packages. This is needed until conda-build adds support for pypy as well.
When conda-build adds support, we can rebuild the noarch: python packages with build: ignore_run_exports: - python.
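
A minimal sketch of how steps 2 and 3 might look in the python recipe (the build string and pin below are illustrative; the linked python-feedstock PR is the authoritative change):

# Hypothetical meta.yaml fragment for the CPython-based python package
build:
  string: 0_cpython        # step 2: build string ends in _cpython (real strings also carry a hash)
  run_exports:
    - python * *_cpython   # step 3: downstream packages get pinned to CPython builds of python

Step 4 would be the analogous recipe whose build string ends in _7_1_pypy, exporting python * *_7_1_pypy and depending on the pypy package.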

Steps 2 and 3: conda-forge/python-feedstock#309
Step 1: conda-forge/conda-forge-repodata-patches-feedstock#27

The above two PRs can be merged as-is, if you agree.

@isuruf
Member

isuruf commented Feb 2, 2020

Also, defaults need to do step 1 as well, but they don't need to do steps 2, 3, and 4 for conda-forge to go ahead.

@isuruf
Member

isuruf commented Feb 3, 2020

Building for pypy would look like https://github.com/conda-forge/conda-forge-pinning-feedstock/pull/225/files

@isuruf
Member

isuruf commented Feb 3, 2020

@mbargull, can you have a look at #867 (comment)?
I basically took your and @msarahan's proposals and merged python_impl and python into one package, as there isn't any reason to have two.

@jjhelmus
Contributor

jjhelmus commented Feb 3, 2020

Depending on _cpython in the build strings would create a split in packages. Those built after the new package would not work with the older python packages because of the build string requirement.

Rather than using the build string for the mutex, why not add a constrains on the python implementation type via a selector package?

Specifically:

  1. Hotfix the existing python packages and all Python-using packages with a "constrains": [ "python_impl * cpython" ]
  2. Hotfix pypy to add a python_impl *_7_1_pypy to depends.
  3. Add a run_exports of python_impl * *_cpython to the python packages.
  4. Add a python 2.7 package which has a run_exports of python_impl * *_7_1_pypy and it would depend on the pypy package.

This allows hotfixing in the future and the python_impl package can be used to select which Python implementation to use.
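
As a rough sketch of the selector-package idea (an entirely hypothetical recipe fragment; naming follows the comment above):

# Hypothetical meta.yaml for the python_impl selector metapackage (CPython variant)
package:
  name: python_impl
  version: "1.0"
build:
  string: cpython    # illustrative; the PyPy variant would use a string like 7_1_pypy
# No dependencies: the package exists only so that interpreters and extensions
# can depend on or constrain a specific build string of python_impl.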

@jjhelmus
Contributor

jjhelmus commented Feb 3, 2020

I suppose there is also:
0. Create python_impl * cpython and python_impl * *_7_1_pypy metapackages.

@jakirkham
Member

jakirkham commented Mar 5, 2020

Hey all, we are seeing some issues where PyPy is being pulled in for Python 3.6 where CPython was pulled in before. Am a little unclear if this is expected or if we should be doing something differently. Thoughts? 🙂

Edit: Here's a no-op PR ( conda-forge/distributed-feedstock#118 ) with a build demonstrating this issue.

@jakirkham
Member

Proposing pulling the new Python 3.6 packages until someone has time to debug this further. ( conda-forge/admin-requests#15 )

@jakirkham
Member

FWIW, at a package level this seems to be a viable workaround: pinning python_abi to the CPython 3.6 ABI (the *_cp36m build string) keeps the solver from pulling in PyPy builds. Not the prettiest thing though.

# filename: recipe/meta.yaml

package:
  name: blah
  version: 1

requirements:
  host:
    - python
    - python_abi * *_cp36m  # [py36]
  run:
    - python
    - python_abi * *_cp36m  # [py36]

@jakirkham
Member

jakirkham commented Mar 5, 2020

Maybe this is the missing piece ( conda-forge/python-feedstock#321 )?

Edit: Looks like Python 2.7 may need this too ( conda-forge/python-feedstock#322 ).

@jakirkham
Member

Also, it seems like these changes were not applied to Python 3.7. Is that expected?

@isuruf
Member

isuruf commented Mar 9, 2020

First PR to add a pypy build conda-forge/certifi-feedstock#53

@h-vetinari
Member

Thanks for all the tireless work on this, @isuruf!

@isuruf
Member

isuruf commented Mar 10, 2020

Closing now. See https://conda-forge.org/docs/maintainer/knowledge_base.html#pypy-builds for docs

The only remaining issue is patching defaults. To work around that, use strict channel priority as indicated in the docs.
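
For reference, strict channel priority can be enabled in .condarc roughly like this (a minimal sketch; the linked documentation is authoritative):

# filename: ~/.condarc (minimal sketch)
channel_priority: strict
channels:
  - conda-forge    # listed above defaults so conda-forge's patched metadata takes precedence
  - defaults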

isuruf closed this as completed Mar 10, 2020
@jakirkham
Member

Thanks @isuruf! 😄

@mattip
Contributor

mattip commented Sep 26, 2020

Thanks again to the conda-forge team for the buildout of PyPy packages.

PyPy has released 7.3.2, which is binary-compatible with 7.3.1.
Can/should we move forward with

  • updating the base package to 7.3.2
  • Windows 32-bit PyPy builds. PyPy does not yet offer 64-bit Windows. Should this wait for 64-bit Windows?
  • PyPy now offers Python 3.7: could we roll out a 3.7 offering in parallel to the 3.6?

Is there a process flow for doing any of the above?

@jakirkham
Member

That makes sense. Yeah we dropped Windows 32-bit support a long time ago. So I think we would need to wait for 64-bit support for Windows. Adding Python 3.7 PyPy would be great!

Updating the existing Python 3.6 PyPy should just be a matter of adding a PR to this feedstock. Adding Python 3.7 would be done through a new recipe in staged-recipes, which could be based off the pypy3.6 recipe. We would then want to either update or add a new migration based on the existing one for pypy.

If anyone else sees something we would need to do here, please hop in :)

@CJ-Wright
Member

What is the support plan for pypy3.6? Should we build it with pypy3.7?

@jakirkham
Member

For the time being that sounds like a good idea. It's the only PyPy version that we have a decent number of packages built for.

@ghost

ghost commented Jan 9, 2021

This is the ultimate proof of readiness for PyPy, and it's time that we could simply use it; they have put so much work into it over the last 15 years. Can you guys automate kicking off builds of everything for PyPy?

@mattip
Contributor

mattip commented Jan 9, 2021

@brianmingus2 my reply may miss your point; please clarify your intention if I have misunderstood.

This issue describes the extensive work conda-forge completed to make PyPy3.6 available. On Sept 26 I used the closed issue to bring up the new release of PyPy3.7; I probably should have opened a new issue instead. In the meantime, the migration for PyPy3.7 has begun. Some 3.7 packages are already available, and more are coming.

@ghost

ghost commented Jan 9, 2021

Thank you for the reply. What is the hangup: lack of integration tests requiring manual review?

@beckermr
Member

beckermr commented Jan 9, 2021

See the conda-forge status page:

conda-forge.org/status

It lists the packages that are done, in PR, etc. Help fixing failed builds is appreciated!

@mattip
Contributor

mattip commented Jan 9, 2021

The PyPy3.7 migration only began 4 days ago. It takes a while for the bot to churn through all the packages.

@asteppke

The pypy guys just released an updated 3.7 version that now includes Windows 64-bit support. Is there anything that needs help to get this integrated?

@mattip
Contributor

mattip commented Apr 16, 2021

@asteppke see conda-forge/pypy3.6-feedstock#39. We need to expand the recipe to download the windows64 pypy2.7 in order to build pypy3.7.
