
PySCeS releases #51

Closed
bgoli opened this issue Aug 16, 2021 · 44 comments

@bgoli

bgoli commented Aug 16, 2021

I've assigned a bunch of issues that I think could be addressed for an 0.9.9 release this week (https://github.com/PySCeS/pysces/milestone/1).

Feel free to add or assign yourself issues, whatever can't be done now will be moved to the 1.0 release - as will any new issues (SBML3 support, CI integration, etc.)

@bgoli bgoli added this to the 0.9.9 milestone Aug 16, 2021
@jmrohwer

Refer to my comments on the various issues. I'll be able to help as indicated. We need some general consensus on the way forward for some of this.

#53 is ready to be merged if you agree with my final comments.

More generally:
Do you think it is necessary to do two separate releases, 0.9.9 and 1.0? How much work is the outstanding stuff? I agree that there is a lot of new functionality that needs to be released, especially CVODE via Assimulo, and the ton of recent bug fixes. SBML3 support would be nice, but IMO it would not have to be complete, even if we only start with SBML3 Core. CI integration is something that could in principle also be added at a later stage (e.g. 1.0.1 or 1.1, depending on how we go forward).

Point is, I am really keen to get a 1.0 out and I think the program is at a stage where this is warranted. My concern, if we do 0.9.9 now (and this is judging from past experience for both of us), is that it is going to be at least another 6 months to 1.0.

@bgoli

bgoli commented Aug 17, 2021

In principle I don't really mind if we make this release 1.0 and the next one 1.1; I just won't have time to do the current 1.0 milestones (but if it's just a matter of renaming the milestones, that's okay).

@bgoli bgoli modified the milestones: 0.9.9, 1.0, 1.1 Aug 17, 2021
@bgoli

bgoli commented Aug 18, 2021

@jmrohwer I've renamed the milestones to 1.0 and 1.1 and set the 1.0 release date for next Friday :-)

@bgoli

bgoli commented Aug 18, 2021

By the way, what was the post2 release that appeared on PyPI last week? There is no source bundle or corresponding GitHub release?

@jmrohwer

I posted some wheels for Linux and Windows, which we urgently needed for the students in my lab. Basically this bug fix: fe69b0d

When running a simulation with mod.Simulate(userinit=1), the first simulated time point was treated as zero even if mod.sim_time did not contain a zero, resulting in an offset. This is relevant for us when fitting kinetics to NMR timecourses. To let the students continue with their work, I needed a quick bugfix release 😄
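
For illustration, a minimal sketch of the failure mode (the model file name is hypothetical; mod.sim_time and mod.Simulate(userinit=1) are the actual PySCeS calls involved):

import numpy
import pysces

mod = pysces.model('nmr_timecourse.psc')  # hypothetical model file

# time points deliberately NOT containing zero
mod.sim_time = numpy.linspace(5.0, 50.0, 91)
mod.Simulate(userinit=1)

# before fe69b0d the first point was treated as t=0, shifting the
# whole simulated timecourse by an offset of 5 time units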

@bgoli

bgoli commented Aug 18, 2021

That's okay, just always upload a source bundle, otherwise the binary and source installs get out of sync. Also, it's easy to create a non-production (pre-release) release on GitHub, so everyone knows what is going on :-)

@jmrohwer

Okay will keep this in mind 😃

@bgoli

bgoli commented Aug 25, 2021

For any future releases, I propose the following general sequence of events.

  • Finalise source (merge "development" branch, update version numbers, dates, dependencies etc.)
  • Finalise documentation (update module)
  • Announce codebase freeze (no new code, only release bugfixes)
  • Test builds on different OSs, Python releases, etc.
  • Freeze codebase
    • Create GitHub release with final version tag (new releases will be x.y.0, with interim releases x.y.z; no post1, a, etc.)
    • Create PyPI source bundle
    • Upload PyPI source bundle (create PyPI release)
    • Create/upload PyPI binaries
    • Create/upload Anaconda binaries
    • Create "development" branch for future development
  • Announce codebase unfreeze
  • Update RTD
  • Update website

I've also created a wiki page as a reminder note: https://github.com/PySCeS/pysces/wiki/PySCeS-releases
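
In concrete terms, the freeze/release steps could map to something like the following (setuptools and twine are used here as an example toolchain, not a final decision):

git tag -a 1.0.0 -m "PySCeS 1.0.0"
git push origin --tags
python setup.py sdist        # create the PyPI source bundle
pip wheel . -w dist/         # build a binary wheel per platform
twine upload dist/*          # create/upload the PyPI release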

@bgoli bgoli closed this as completed Aug 25, 2021
@bgoli bgoli reopened this Aug 25, 2021
@bgoli bgoli changed the title Release 0.9.9 PySCeS releases Aug 25, 2021
@bgoli bgoli modified the milestones: 1.0, 1.1 Aug 25, 2021
@jmrohwer

👍 on the sequence of events. I suppose this means that after 1.0 is out, development will occur on a separate "development" branch. We might have to find a way to manage bugfix releases. I am happy with the x.y.z numbering scheme you propose. The question is how often these bugfix releases should happen. Put differently, if we find a bug that impacts the students in my lab, I would need to push out a fix ASAP. Not that it happens all that often, but it happened recently.

@jmrohwer

Issues to be sorted out before the 1.0 release.

@bgoli please comment, so we are on the same page.

  1. I've had a look at the setup.py changes. Making assimulo a hard dependency is, I'm afraid, not going to work for people using pip installs. There are no binaries on PyPI for Linux for recent Python versions, and the pip install from source errors out. The PyPI releases are outdated (unmaintained?): the latest there is 3.0, while the current assimulo version is 3.2.5. If I can't even get it to work, an average user definitely won't. It's fine for conda but not for PyPI, and I would like to retain the possibility to install from PyPI. We could consider an optional dependency as in #62 (Make ipyparallel installation optional through setup.py). In general I've been struggling to get Assimulo to compile properly from source on Linux, with mixed success. Currently I'm running Manjaro (an Arch derivative) and there is an AUR recipe (build script) which works perfectly, but that's only for Arch-based distros.
  2. Are we going to implement #62 (Make ipyparallel installation optional through setup.py), i.e. are you happy with this approach?
  3. How are we going to manage the sequence of events for the 1.0 release, since there is no development branch yet? Basically, agree on a point when we are both done with any changes we want to make?
  4. I see you updated the copyright year to 2022 everywhere. Is this intentional? Not quite there yet 😃
  5. What is your take on #60 (Ignore case of matplotlib backends), and are we going to implement it (maybe later)?
  6. Who is going to do which builds? I don't mind doing the builds, since I've got the build system sorted out and set up for all 3 OSs so that the packages are truly portable. As mentioned, unfortunately the Anaconda build with fortran-compiler does not produce portable binary packages. Either way, it would be good to independently cross-check the built binary packages.

That's about it, feel free to add.

@bgoli

bgoli commented Aug 25, 2021

  1. That is a pity, as PyPI is my preferred source to avoid an Anaconda dependency
  2. I need to look at exactly how this optional thing works; I was thinking 1.1
  3. yes
  4. yes
  5. just done
  6. if you could that would be cool :-)

@jmrohwer

Re 2: I was going to play around with #62 and do some local testing; if it works, we might as well do it for 1.0. I guess it would just have to be documented in README.md. I'm also not sure how Anaconda deals with optional dependencies, though.

@bgoli

bgoli commented Aug 25, 2021

The idea behind the development branch is that it becomes the default branch, so that all development is forked from there; it should always be in a state to merge back into main and create a release on the main branch (which is in sync with your platform releases). A "bugfix" release would then mean creating a point release and building the binaries; unfortunately there is no other way to keep all platforms in sync.

There are ways to simplify the build process but using a development branch is just one way of working - we can use a different flow.
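
For concreteness, a point/bugfix release under this flow would roughly be (branch names as above, version number illustrative):

git checkout main
git merge development          # main is always in a releasable state
git tag -a 1.0.1 -m "bugfix release"
git push origin main --tags    # then build and upload the binaries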

@jmrohwer

Maybe investigate continuous integration and automatic builds? I've seen a lot of this around but have never messed with it myself.

@bgoli

bgoli commented Aug 25, 2021

Precisely, I do remember an issue for 1.1 :-) (it's some work to set up but once it is going it works like a charm!)

@bgoli

bgoli commented Aug 25, 2021

Correction: I just pushed a fix for #58. Not sure about #60; I would say that's something to look into for 1.1.

@jmrohwer

👍 on #60 for 1.1

@jmrohwer

What's your take on distributing a PDF of the documentation together with the release packages (both source and binary)? I'm all set up to do it.

@bgoli

bgoli commented Aug 25, 2021

Feel free to look into #62; here is my attempt from another project:

try:
    from setuptools import setup

    # core dependencies, always installed
    install_requires_src = ['numpy', 'packaging', 'nose']
    # optional dependencies, pulled in via "pip install pkg[extra]"
    extras_requires_src = {
        'sympy': ['sympy'],
        'glpk': ['swiglpk'],
        'sbml': ['python_libsbml', 'lxml'],
        'all': ['sympy', 'swiglpk', 'python_libsbml', 'lxml'],
    }
except ImportError:
    # distutils has no dependency handling
    from distutils.core import setup

    install_requires_src = []
    extras_requires_src = {}

My question is: if you run setup with no extra argument, does it install 'all' or none? As far as I can tell the latter, but it would be good to confirm.

@bgoli

bgoli commented Aug 25, 2021

What's your take on distributing a PDF of the documentation together with the release packages (both source and binary)? I'm all set up to do it.

I would not do the source, but PDFs can be dumped into pysces/docs and should end up in the released packages.

@jmrohwer

What's your take on distributing a PDF of the documentation together with the release packages (both source and binary)? I'm all set up to do it.

I would not do the source, but PDFs can be dumped into pysces/docs and should end up in the released packages.

Yes that was my idea.

@jmrohwer

Feel free to look into #62; here is my attempt from another project: […]

My question is: if you run setup with no extra argument, does it install 'all' or none? As far as I can tell the latter, but it would be good to confirm.

I would also say none, but I will test and confirm.

@bgoli

bgoli commented Aug 25, 2021

In that case, make assimulo, libsbml and ipyparallel the individual options, and "all" with all of them 👍 Leave the current required options (currently defined in the setup.py arguments) as install_requires_src.

Of course you also need:

setup(
    ...
    install_requires=install_requires_src,
    extras_require=extras_requires_src,  # note: the setuptools keyword is extras_require
    ...
)
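
Applied to PySCeS, that would be something like (a sketch only; PyPI package names are assumed here):

install_requires_src = ['numpy', 'scipy', 'matplotlib', 'nose']
extras_require_src = {
    'cvode': ['assimulo'],
    'parscan': ['ipyparallel'],
    'sbml': ['python-libsbml'],
    'all': ['assimulo', 'ipyparallel', 'python-libsbml'],
}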

@jmrohwer

I've removed assimulo from the requirements in the meantime. It also causes the RTD build to fail, as RTD needs to install the modules in requirements.txt for the documentation build.

@jmrohwer

I tried this (just with the parscan option) and it seems to work. @bgoli, I would appreciate some testing 😄

Create a new clean virtual environment, say test (it must have access to your compilers) and activate it.
Inside the environment:

(test)$ pip install numpy
(test)$ pip install --no-cache-dir -i https://test.pypi.org/simple/ --extra-index-url=https://pypi.org/simple/ 'pysces[parscan]'

(The initial numpy install is needed, otherwise the source build bombs out because there is no numpy on test.pypi; this happens even though --extra-index-url is specified.) The other packages are then found on pypi.org. I have only uploaded a source bundle to test.pypi, but if it works (it does on my machine), then a binary bundle will work too.

Then in the shell in the virtual environment:

(test)$ ipcluster start

Open up another shell and activate the same virtual environment (needed to access the ipcluster), start ipython, and run in turn:

In [1]: import os
In [2]: import pysces
In [3]: pysces.PyscesModel.MODEL_DIR = os.path.join(pysces.__path__[0], 'pscmodels')
In [4]: testscript = os.path.join(pysces.__path__[0], 'examples', 'testparscanner.py')
In [5]: exec(open(testscript).read())

Let me know if it works 🤞

@jmrohwer

What I can confirm is that if pip install pysces is called without the extra argument, then no extra modules get installed (as expected).
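
A quick way to double-check this in the clean environment (standard Python/pip, nothing PySCeS-specific):

(test)$ python -c "import ipyparallel"   # ModuleNotFoundError without [parscan]
(test)$ pip show ipyparallel             # "Package(s) not found" without the extra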

@bgoli

bgoli commented Aug 25, 2021

Nice work 👍 I can see how this affects a source build, but what about a binary release? One test would be to build a single binary - say Windows Py3.8 - upload it to test.pypi, and see if it makes any difference when doing a no-extra-argument pip install vs. an extra-argument pip install into an empty environment (no compilers). If you upload the binary I will be happy to test it (I'll try a local install in the meantime).

@bgoli

bgoli commented Aug 25, 2021

Let me know if it works 🤞

You may need to push your changes?

@jmrohwer

Let me know if it works 🤞

You may need to push your changes?

I was referring more to the testing part on your side. I'll push once I know it's working.

I've been sorting out build issues here with manylinux. Getting late now; I will build the binaries tomorrow and upload them to test.pypi. Did you try the pip install from source, or don't you have compilers set up in a separate virtualenv?

@jmrohwer

jmrohwer commented Aug 26, 2021

@bgoli Okay, binary wheels for Python 3.8 and 3.9 for win_amd64, and for Python 3.9 manylinux2014, are on test.pypi. Would appreciate it if you could test as per this post: #51 (comment)

To be absolutely sure, you may want to modify the pip install command to make sure it only gets binaries:

pip install --no-cache-dir --only-binary=:all: -i https://test.pypi.org/simple/ --extra-index-url=https://pypi.org/simple/ 'pysces[parscan]'

You should be able to install 'pysces[parscan]' to pull in ipyparallel, 'pysces[sbml]' to pull in python-libsbml, 'pysces[cvode]' to pull in assimulo (but there are no binaries on PyPI!), and 'pysces[all]' to get all of them. Just installing pysces should work as before, without installing any of the extras. Finally, you could also install pysces[parscan,sbml] to get everything except assimulo.

EDIT: It appears that under Windows the single quotes don't work, but you can leave them out entirely and just do pip install pysces[parscan] etc.; double quotes also work. Under Linux it doesn't work without quotes, but both single and double quotes work. So to have unified instructions, we should default to double quotes: pip install "pysces[parscan]".

Would appreciate if you could actually also verify that the parallel scanning is working as per the above post.

@bgoli

bgoli commented Aug 26, 2021

Thanks. It works in quite a nice way: for wheels it creates custom entries in the setuptools metadata, so the binary will be selected first, if available, and the dependencies are then checked locally.

Requires-Dist: numpy
Requires-Dist: scipy
Requires-Dist: matplotlib
Requires-Dist: nose
Provides-Extra: all
Requires-Dist: ipyparallel ; extra == 'all'
Requires-Dist: assimulo ; extra == 'all'
Requires-Dist: python-libsbml ; extra == 'all'
Provides-Extra: cvode
Requires-Dist: assimulo ; extra == 'cvode'
Provides-Extra: parscan
Requires-Dist: ipyparallel ; extra == 'parscan'
Provides-Extra: sbml
Requires-Dist: python-libsbml ; extra == 'sbml'

Both these work as expected when installed into an environment that only has python, pip and ipython.

(tpip38c) pip install -i https://test.pypi.org/simple/ --extra-index-url=https://pypi.org/simple/ pysces
(tpip38d) pip install -i https://test.pypi.org/simple/ --extra-index-url=https://pypi.org/simple/ pysces[parscan]

So that all seems to work; I will try to check the parallel parameter scan at some point. One suggestion: what about letting "cvode" also pull in libSBML? It is available for more platforms than assimulo, and the chances are that if you want to deal with events, you have an SBML file.

@jmrohwer

Okay, thanks, will push changes. One further question: what about Anaconda packages? Similar functionality is not (yet?) available for conda. See here:
conda/conda#3299
conda/conda#7502
conda/conda#2984

However, conda dependency resolution is governed by the build recipe, not setup.py. So we could include the "optional" dependencies as full dependencies for a conda build. This is not a problem because all the packages are available for all platforms on conda-forge. If you are worried about the size of the install, the addition of ipyparallel is really minimal, since most users would have IPython already in any case. Libsbml is a 4-5 MB download and assimulo anything between 2 and 6 MB depending on OS. The upside would be that CVODE would be automatically enabled on a conda install.

Thoughts?

Of course the docs would have to be updated, but I can take care of that once we agree. I actually like this idea, but the alternative is of course also possible, i.e. users having to do conda install ipyparallel, conda install python-libsbml and conda install assimulo by hand.
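
For example, the run requirements in the conda build recipe would then simply list everything (a hypothetical meta.yaml excerpt, not the actual recipe):

requirements:
  run:
    - numpy
    - scipy
    - matplotlib
    - ipyparallel
    - python-libsbml
    - assimulo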

@jmrohwer

So that all seems to work; I will try to check the parallel parameter scan at some point. One suggestion: what about letting "cvode" also pull in libSBML? It is available for more platforms than assimulo, and the chances are that if you want to deal with events, you have an SBML file.

I'd prefer to have it separate; it's cleaner/neater, as long as it is properly documented. Any user doing this will be at a level where they have a reasonable idea of what they are doing... You may want CVODE (events) support without SBML: I had an MSc student doing events models but building them from scratch w/o SBML.

@bgoli

bgoli commented Aug 26, 2021

For Anaconda, I assume everything is on conda-forge, so I would say just add everything; size is not the issue, but rather increasing the complexity of the dependencies. This is mostly relevant for automated builds/tests (but that is done with PyPI anyway).

Wouldn't your student be using Anaconda and getting everything anyway? :-) I'm fine with things as they are for now.

(EDIT: of course you could also do conda install -c conda-forge ipyparallel python-libsbml assimulo, which is a single copy/paste, but I would still just include everything in the Anaconda packages.)

@bgoli

bgoli commented Aug 26, 2021

@jmrohwer Seems to work ... will try with a slightly larger scan :-)

GATHER RESULT
-------------
Parallel calculation completed: 0 min 6 sec

PARSCANNER: 1056 states analysed
Total time taken:  0 min 6 sec
Duration: 6.19 seconds
States per second: 170.5

 Speedup with load balanced TaskClient: 0.40

Parallel execution...using RunScatter
parallel engine: ipcluster
MaxMode 2

PREPARATION
-----------
Generated ScanSpace: 0 min 0 sec
PARSCANNER: Tsteps 1056
Scattered ScanSpace on number of engines: 16
Preparation completed: 0 min 0 sec

PARALLEL COMPUTATION
--------------------
Parallel calculation completed: 0 min 1 sec

GATHER RESULT
-------------
Parallel calculation completed: 0 min 1 sec

PARSCANNER: 1056 states analysed
Total time taken:  0 min 1 sec
Duration: 1.78 seconds
States per second: 592.8

 Speedup with RunScatter: 1.38

===========
Comparing results...
serial vs. parallel s/s results     :  True
serial vs. parallel user output     :  True
TaskClient vs RunScatter s/s results:  True
TaskClient vs RunScatter user output:  True

@jmrohwer

@jmrohwer Seems to work ... will try with a slightly larger scan :-)

👍
I deliberately kept the scan small so that the test completes in a reasonable time. Of course, with such a small scan there is no noticeable speedup, as the overhead is too big compared to the gain in computing power.

@jmrohwer

For Anaconda, I assume everything is on conda-forge, so I would say just add everything; size is not the issue, but rather increasing the complexity of the dependencies. This is mostly relevant for automated builds/tests (but that is done with PyPI anyway).

I have done a Linux build and tested it, all works as expected 👍
Windows building atm 😃
I will update the docs accordingly.

@bgoli

bgoli commented Aug 26, 2021

Cool. It's probably in the docs, but while I'm here: is there an easy way to set the number of workers that are created?

@jmrohwer

Cool. It's probably in the docs, but while I'm here: is there an easy way to set the number of workers that are created?

Not for the multiprocessing option. It just uses the mp defaults and creates a Pool of size equal to the number of CPU cores on your machine.

For ipcluster, you set the number of nodes when creating the cluster:

ipcluster start --n=4
ipcluster --help
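
(For reference, the multiprocessing default mentioned above is just standard library behaviour; this sketch is plain Python, nothing PySCeS-specific:)

import multiprocessing

# Pool() with no argument sizes itself to the number of CPU cores,
# i.e. it is equivalent to Pool(processes=multiprocessing.cpu_count())
pool = multiprocessing.Pool()
print(multiprocessing.cpu_count())  # number of worker processes created
pool.close()
pool.join()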

@jmrohwer

I have done a Linux build and tested it, all works as expected 👍
Windows building atm 😃

Windows build also good. It pulls in everything, with the dependencies as specified in the build recipe and setup.py left unchanged. I will thus now update the docs. I really like that the Anaconda packages set up a full working environment with a single one-line command 💯

@bgoli

bgoli commented Aug 26, 2021

Great 👍 To end with, here are some more representative speedup results:

GATHER RESULT
-------------
Parallel calculation completed: 0 min 22 sec

PARSCANNER: 52800 states analysed
Total time taken:  0 min 22 sec
Duration: 22.53 seconds
States per second: 2343.3

 Speedup with load balanced TaskClient: 5.48

Parallel execution...using RunScatter
parallel engine: ipcluster
MaxMode 2

PREPARATION
-----------
Generated ScanSpace: 0 min 0 sec
PARSCANNER: Tsteps 52800
Scattered ScanSpace on number of engines: 16
Preparation completed: 0 min 0 sec

PARALLEL COMPUTATION
--------------------
Parallel calculation completed: 0 min 17 sec

GATHER RESULT
-------------
Parallel calculation completed: 0 min 18 sec

PARSCANNER: 52800 states analysed
Total time taken:  0 min 18 sec
Duration: 18.54 seconds
States per second: 2848.5

 Speedup with RunScatter: 6.66

===========
Comparing results...
serial vs. parallel s/s results     :  True
serial vs. parallel user output     :  True
TaskClient vs RunScatter s/s results:  True
TaskClient vs RunScatter user output:  True

@jmrohwer

@bgoli How is this coming along in terms of a release? Let me know if I can help with testing in any way. In terms of working on release-related issues, my time next week is quite limited (I basically only have Monday morning open), as I'm attending an online conference.

I can start drafting some 1.0 release notes. What are the most important features you would want to highlight? Obviously CVODE/Assimulo comes to mind, but a non-exhaustive list might be:

  • Support for CVODE as integrator via Assimulo under Python 3, which brings back support for events in models
  • Improved import and export of SBML
  • Automatic installation of optional dependencies with pip install "pysces[optional_dep]"
  • Extensive update of documentation
  • Numerous bug fixes

Anything else?

@bgoli

bgoli commented Sep 18, 2021

The release notes look good. There is one last issue I want to fix with the SBML export, and then I will merge my development branch into main.

I picked up a few issues with the stoichiometric analysis that needed attention, but that will be tested in your CI builds, and the SBML import/export can be tested as we go along. I hope to merge this weekend, as next week is pretty full.

@jmrohwer

1.0.0 is out.
