One virtualenv to rule them all #1070
Comments
Naturally I'd welcome input from anyone either to cheer me on or to point out how completely insane the idea is for all the reasons I hadn't thought about...
Just had a quick catch-up with @ntoll to talk over this approach. He's agreed that it's worth trying. One of the issues I've faced early on is the IPython kernel handoff. We both agreed that using the IPython kernel for the REPL may well be more trouble than it's worth: it brings layers of complexity and we basically just use it as a simple REPL. For now I'll kick that decision down the road, but I'll almost certainly be looking for a simpler REPL solution for the Python 3 runners.
Would this require internet access during installation? That can be a problem at some schools. I think it should be possible to install from USB keys.
No -- my plan is for us to ship wheels with the redistributables. Obviously there's a certain tension here: people who want to install 3rd party packages (who presumably have internet access); and schools without easy internet access (who presumably won't be able to install 3rd party packages).
The installer should just include everything for the default modes; the third-party stuff will not be as problematic even for schools with a limited connection. Most places have at least some internet access, but a Wi-Fi network or shared broadband connection can break down if all pupils try to download the installer at the same time.
Thanks @dybber. Yes, that's basically my plan. We ship the installer as we do now. The key difference is that the "installed" Python only includes the libraries for the core editor functionality. On startup it creates a virtualenv and pip-installs the mode libraries from wheels shipped within the installer. Any third-party stuff will be installed into the same venv. I'm trying to separate out, essentially, our edit-time dependencies from our run-time dependencies. At the moment -- even ignoring the IPython kernel issues -- we're sometimes involved in an awkward hand-off between "built-in" packages (ie those installed directly as shipped) and our run-time environment, where "run-time" includes, as I see it, things like REPLs for external devices. This is still a big experiment on a separate branch for now; I'm hoping it will simplify both our codebase and our runtime, but it might end up being more trouble than it's worth.
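To make the offline part of the startup flow concrete, here's a minimal sketch of building the pip invocation that installs mode libraries from wheels bundled with the installer. The function name and paths are illustrative assumptions, not Mu's actual code:

```python
from pathlib import Path


def baseline_install_command(venv_python, wheels_dir, packages):
    """Build the pip command that installs mode libraries from wheels
    shipped inside the installer, without touching the network:
    --no-index stops pip from contacting PyPI, and --find-links points
    it at the bundled wheels directory instead.
    """
    return [
        str(venv_python), "-m", "pip", "install",
        "--no-index", "--find-links", str(wheels_dir),
        *packages,
    ]


# Hypothetical venv interpreter and wheels directory, for illustration
cmd = baseline_install_command(
    Path("mu_venv/bin/python"), Path("wheels"), ["pgzero", "flask"]
)
```

The same venv then accepts user-requested third-party packages via an ordinary (online) `pip install`.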
I've just pushed a version which does just enough for Python 3 & Pgzero programs to run. The key difference is that virtual environment functionality is now handled by its own class in a separate module. This will obviously evolve, but for now it's a central point for venv logic.
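As a rough illustration of what centralising venv logic in one class might look like -- the class layout, method names, and paths here are assumptions, not the actual API on the branch:

```python
import subprocess
import sys
from pathlib import Path


class VirtualEnvironment:
    """Central point for venv logic: knows where the environment
    lives, can create it, and can run pip inside it."""

    def __init__(self, dirpath):
        self.dirpath = Path(dirpath)
        # Windows puts the interpreter under Scripts/, POSIX under bin/
        bindir = "Scripts" if sys.platform == "win32" else "bin"
        self.interpreter = self.dirpath / bindir / "python"

    def exists(self):
        return self.interpreter.exists()

    def create(self):
        import venv
        # with_pip=True bootstraps pip so wheels can be installed next
        venv.create(self.dirpath, with_pip=True)

    def pip(self, *args):
        # Run pip with the venv's own interpreter so installs land there
        subprocess.run(
            [str(self.interpreter), "-m", "pip", *args], check=True
        )
```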
Latest commit seems to be 4 days old, or perhaps I'm looking in the wrong branch. |
Cough Sorry; I didn't actually say which branch I was working on: |
I've started to pull the wheels across in the Windows installer, but I'm not clear where they should go when they arrive. At present I'm putting them in But I don't know what the layout looks like on macOS (I'm ignoring Linux for now). And I can't see what configuration the macOS installer builder uses, even if that were to help. In any case, the code as it stands in virtual_environment.py will try to create a baseline environment from shipped wheels, but will fall back to PyPI -- so that would work in any case. But, obviously, the point is to pre-ship with a nearly-built environment rather than, as @dybber points out, hitting the school's network all at one go. So... can someone enlighten me both as to: how the macOS builder works; and what the layout is when it gets there, please? Thanks
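The wheels-first-then-PyPI fallback can be sketched like this; the `run` parameter is injected purely so the logic is testable, and all names are illustrative rather than the actual code in virtual_environment.py:

```python
import subprocess


def install_baseline(python, wheels_dir, packages, run=subprocess.check_call):
    """Try an offline install from the shipped wheels first; if that
    fails (wheels missing, wrong platform, ...), fall back to a normal
    online install from PyPI."""
    base = [python, "-m", "pip", "install"]
    try:
        # Offline attempt: resolve only against the bundled wheels
        run(base + ["--no-index", "--find-links", wheels_dir] + list(packages))
        return "wheels"
    except Exception:
        # Fallback: let pip hit PyPI as usual
        run(base + list(packages))
        return "pypi"
```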
Per our conversation last night, I've pushed everything as it currently stands. It's still rough but nearly ready. Key points:

Approach:
Code changes:
Still to do:
|
(If you're reading the previous long comment by email, you might want to go to GitHub, as I've updated it several times since first adding it.)
I've now raised PR #1072 against Mu itself |
The debugger is right on the line between needing Mu's own functionality (which needs whatever environment Mu itself is running in) and running the code itself (which needs the mode's virtual environment). In principle this could be as simple as adding a PYTHONPATH entry to the runner process which includes the directory Mu is in. Unfortunately it's not quite so simple: merely making the Mu package importable doesn't give us PyQt and the other core dependencies. In short, for the debugger to function as it does, we need to be running two virtual environments simultaneously: the Mu core venv and the runtime (modal) venv. It might be possible to build a PYTHONPATH from those two environments, basically simulating one virtual environment within another. But that's just the kind of hack we were trying to move away from here. However, since it's very localised (only the debugger) it might not be the worst thing in the world.
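The PYTHONPATH hack described here might look something like the following -- a sketch under the assumption that we know both site-packages directories; the function and parameter names are invented for illustration:

```python
import os


def runner_environment(mode_site_packages, mu_site_packages):
    """Build the environment for the debugger's runner process so that
    both the mode venv's packages and Mu's own packages are importable,
    in effect simulating one virtual environment inside another.

    The mode venv comes first so its packages win on name clashes;
    any pre-existing PYTHONPATH is preserved at the end."""
    env = os.environ.copy()
    parts = [mode_site_packages, mu_site_packages]
    if env.get("PYTHONPATH"):
        parts.append(env["PYTHONPATH"])
    env["PYTHONPATH"] = os.pathsep.join(parts)
    return env
```

The resulting mapping would then be passed as the `env` argument when spawning the runner subprocess.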
Refactored the debugger code so it doesn't need to rely on any PyQt imports (ie so it can be run from within the runtime virtualenv). However, that meant moving the QLocale logic out of `__init__.py`, where we'd put it so that gettext was always installed whenever a module needed it. As things stand, it's not enough to place it in, eg,
|
Closed by #1072 |
With the work in the use-venv branch we're setting up a runtime environment which will include all the dependencies needed by various modes (pygame, flask etc.) as well as allowing users to `pip install` their own. But I've realised that we can go a step further. We can distinguish between dependencies which must be there for the Mu editor to run at all (basically stdlib, PyQt and a few others) and those which are needed for any particular mode to run (pygame, flask etc.).
In other words, Mu ships with its own deps and -- probably -- some wheels for instant install. On first startup it creates a venv and pip-installs from the local wheels. We keep track of which packages were pre-installed so that the [3rd Party Packages] tab stays clean and only shows the ones the user has added.
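One way the "only show user-added packages" bookkeeping could work, as a sketch: record the pre-installed set once at first startup, then subtract it from whatever is installed now. The JSON format and file layout here are assumptions for illustration:

```python
import json
from pathlib import Path


def user_installed_packages(installed_now, baseline_path):
    """Return only the packages the user added themselves.

    `baseline_path` points at a (hypothetical) JSON list written at
    first startup, recording the packages pre-installed from the
    shipped wheels; subtracting it from the current package list
    leaves just the user's own additions for the 3rd Party tab."""
    baseline = set(json.loads(Path(baseline_path).read_text()))
    return sorted(set(installed_now) - baseline)
```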
Why? Mostly as a separation of concerns when we're trying to run / package. If we keep the core install-time dependencies tight it should simplify the issues we're having now with getting installer packages to jump through hoops. Maybe.
I'm going to try for a proof-of-concept branch, starting from the current use-venv work, to see if the idea has legs.