strange behaviour of sme running in a jupyter notebook #333

Closed
lkeegan opened this issue Oct 4, 2020 · 6 comments · Fixed by #374
Labels: bug (Something isn't working), python (Python library)

Comments

lkeegan commented Oct 4, 2020

On Ubuntu, using sme v1.0.0 installed from pip on Python 3.7, running this script:

import sys
print(sys.version)
import sme
print(sme)
m = sme.open_example_model()
r = m.simulate(100, 1)
print(r[1])

with python gives the correct output:

3.7.9 (default, Sep 28 2020, 09:46:13) 
[GCC 10.0.1 20200416 (experimental) [master revision 3c3f12e2a76:dcee354ce56:44
<module 'sme' from '/home/lkeegan/.pyenv/versions/3.7.9/lib/python3.7/site-packages/sme.cpython-37m-x86_64-linux-gnu.so'>
<sme.SimulationResult>
  - timepoint: 1.0
  - number of species: 5

However, in a jupyter notebook with the same kernel and sme installation, it gives incorrect output:

3.7.9 (default, Sep 28 2020, 09:46:13) 
[GCC 10.0.1 20200416 (experimental) [master revision 3c3f12e2a76:dcee354ce56:44
<module 'sme' from '/home/lkeegan/.pyenv/versions/3.7.9/lib/python3.7/site-packages/sme.cpython-37m-x86_64-linux-gnu.so'>
<sme.SimulationResult>
  - timepoint: 1.0
  - number of species: 0

The script also takes no time to run, so presumably the simulation is silently failing somehow.

If I compile sme locally with pip install . (using the same static libs that the CI release uses), then the jupyter notebook version also works as expected:

3.7.9 (default, Sep 28 2020, 09:46:13) 
[GCC 10.0.1 20200416 (experimental) [master revision 3c3f12e2a76:dcee354ce56:44
<module 'sme' from '/home/lkeegan/.pyenv/versions/3.7.9/lib/python3.7/site-packages/sme.cpython-37m-x86_64-linux-gnu.so'>
<sme.SimulationResult>
  - timepoint: 1.0
  - number of species: 5

I reproduced the same issue with other Python versions.

However, the online Google Colab notebook works (with Python 3.6).

It might be an issue with my local jupyter setup, but at the very least there should be an error message from sme in this situation.

lkeegan added the bug and python labels on Oct 4, 2020
lkeegan commented Oct 18, 2020

The same issue occurs on readthedocs when trying to execute the getting_started notebook:

 File "/home/docs/checkouts/readthedocs.org/user_builds/spatial-model-editor/envs/367/lib/python3.7/site-packages/nbclient/client.py", line 740, in _check_raise_for_error
    raise CellExecutionError.from_cell_and_msg(cell, exec_reply['content'])
nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
------------------
my_compartment = my_model.compartment('Cell')
type(my_compartment)
------------------

---------------------------------------------------------------------------
InvalidArgument                           Traceback (most recent call last)
<ipython-input-1-6d8bfc7f6ff7> in <module>
----> 1 my_compartment = my_model.compartment('Cell')
      2 type(my_compartment)

InvalidArgument: Compartment 'Cell' not found

https://readthedocs.org/projects/spatial-model-editor/builds/12126860/

lkeegan commented Oct 19, 2020

Update:

  • this is specific to wheels built with the manylinux2010 docker image
  • the manylinux1 wheels do not have this issue

lkeegan added a commit (and several subsequent commits with the same message) that referenced this issue Oct 19, 2020
  - add top level section "Python" to docs
  - add notebooks to docs/sme/notebooks
    - split getting started notebook
  - use nbsphinx sphinx extension
    - executes all ipython notebooks
    - converts them to rst
    - automatically add prologue to each notebook with colab link
  - add nbstripout to pre-commit hooks
    - removes all output from ipynb files before commit
  - resolves #344

also

  - improve passing of std::vectors to python
    - use opaque types & stl_bind to pass std::vector<SmeObject> by reference to python
    - resolves #368
  - stop building manylinux2010 CPython wheels due to #333
    - only build pypy wheels using manylinux2010 docker image
    - manylinux1 CPython wheels do not have this issue
  - pybind11 -> 2.6.0rc3
lkeegan commented Oct 21, 2020

I have now switched to only producing manylinux1 wheels.

lkeegan commented Oct 21, 2020

Enabling logging on the manylinux2010 Python 3.8 wheel shows that the imported SBML model is somehow being corrupted (only in a jupyter notebook; in ipython this works fine):

[2020-10-21 13:32:03.250] [info] [model.cpp:47] Importing SBML from string...
[2020-10-21 13:32:03.253] [error] [validation.cpp:37] Errors while reading SBML file
[2020-10-21 13:32:03.253] [error] [validation.cpp:30] [SBML component consistency] line 264:3185 The value of the children of a <spatialPoints> object must be an array of values of type 'double'.
Reference: L3V1 Spatial V1 Section
 A <SpatialPoints> with id 'spatialPoints' has a compression type of 'uncompressed', but contains non-numeric elements.

[2020-10-21 13:32:03.253] [error] [validation.cpp:30] [SBML component consistency] line 269:0 The value of the children of a <sampledField> object must be an array of numeric values.
Reference: L3V1 Spatial V1 Section
 A <SampledField> with id 'geometryImage' has a compression type of 'uncompressed', but contains non-numeric elements.

lkeegan commented Oct 21, 2020

This is caused by sme statically linking libstdc++.

The reason for doing this was to try to make it more portable by not depending on the available libstdc++ on the system - which is (still, I think) fine for a standalone binary like the GUI.

But since sme is a library, it may be imported by another C++ binary (I guess in this case jupyter) which dynamically links to libstdc++. We can then end up with a mix of old and new libstdc++ symbols in our process, and a call can resolve to the wrong symbol.

In the above error the line ss >> a, where ss is a std::stringstream, was failing, presumably due to an incompatibility in stringstream between the statically linked libstdc++ and the dynamically resolved one.

More generally, see this long discussion about an attempt to do the same thing for tensorflow wheels:

tl;dr: we should not statically link libstdc++ for the python wheels.
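
One quick way to see whether this mixing can happen in a given session is to look at which libstdc++ copies are already mapped into the running process. A minimal diagnostic sketch (assuming Linux, since it reads /proc; this is not part of sme itself):

# Diagnostic sketch (not part of sme): list the libstdc++ libraries mapped
# into the current process on Linux. Run in both plain python and a jupyter
# notebook kernel to compare what is already loaded when sme is imported.
def loaded_libstdcxx():
    libs = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            parts = line.split()
            # lines with a pathname have at least 6 fields; the path is last
            if len(parts) >= 6 and "libstdc++" in parts[-1]:
                libs.add(parts[-1])
    return sorted(libs)

print(loaded_libstdcxx())

In a notebook kernel, other compiled extensions loaded by the kernel (pyzmq, for example) may already have pulled in the system libstdc++ before sme is imported, so that copy and the symbols statically linked into the sme extension end up in the same process.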

lkeegan commented Oct 21, 2020

NB this probably also explains why the manylinux1 wheels crashed on colab: there we are forced to statically link to get manylinux1-compatible wheels, as we use a newer compiler.

Todo:

  • stop making manylinux1 wheels
    • they compile & import ok, but are fundamentally flawed and will randomly cause crashes
  • don't statically link standard libs for manylinux2010 wheels
    • this will resolve this issue (one way to verify the fix is sketched below)
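
Once wheels without the statically linked libstdc++ are published, one way to check the change from a user environment is to inspect the dynamic dependencies of the installed extension module. A sketch (assuming Linux with ldd available; not an sme feature):

# Verification sketch (not part of sme): after the fix, libstdc++.so.6 should
# appear as a dynamic dependency of the installed sme extension module; for
# the old statically linked wheels it does not show up in the ldd output.
import subprocess
import sme

deps = subprocess.run(["ldd", sme.__file__], capture_output=True, text=True).stdout
print([line.strip() for line in deps.splitlines() if "libstdc++" in line])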

lkeegan added commits that referenced this issue on Oct 21 and Oct 22, 2020
also

  - remove static linking of libstdc++ in manylinux2010 wheels
    - also remove manylinux1 wheels
    - resolves #333
  - use PYTHONMALLOC=malloc for ASAN CI python tests to reduce false positives
  - pybind11 -> 2.6.0