Making ramp_workflow pip installable #50
Ah, I was also working on this (I thought I had mentioned that), but had not yet done the requirements parsing of the starting kits, so this is better.
- source install_requirements.sh
- pip install codecov
- pip install -q flake8
- pip install -r testing-requirements.txt
since we are using conda envs, let's install with conda here?
What is the command? I don't use conda...
OK, bad idea. If a single library is not installable with conda, the whole thing crashes.
Maybe use an environment.yml file then (example here) that can be used to update an existing conda environment and can specify both conda and pip packages?
Also as a side note, I find https://github.com/astropy/ci-helpers very useful for setting up continuous integration.
Which one is not installable? All in that list are installable with conda for me.
Ah, but that is probably because I have the conda-forge channel added. OK, then leave it be.
Regarding jupyter, what is it used for? Is it only for nbconvert? (Then we could also install only that.)
@rth How do you use an environment.yml to specify pip packages?
IMO ci-helpers is quite a big machinery for pure Python packages. However, we may use it in the future, especially for the very simple way to define build matrices.
Yes, jupyter is only for nbconvert; I only call it in the backend to generate the page on ramp.studio. So it should be tested in the Travis of the kit, but not in rampwf.
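As a sketch of that backend step (the notebook filename here is an assumption for illustration, not taken from the backend code):

```shell
# Hypothetical backend step: render a kit's starting notebook to HTML
# for display on ramp.studio; only nbconvert (not the full jupyter
# metapackage) is strictly needed for this.
jupyter nbconvert --to html starting_kit.ipynb --output starting_kit.html
```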
@aboucaud Well, one could do conda env create [or update] -f environment.yml, where environment.yml looks like, e.g.:
name: test-env
channels:
  - defaults
  - conda-forge
dependencies:
  - python=3.5
  - numpy=1.13.1
  - scipy=0.19.1
  - conda_package_1
  - conda_package_2
  - pip:
    - pip_package_1
    - ./  # install the current package
but in the end it's not that different from just running conda install and pip install separately..
Seems convenient indeed. I'll keep that in mind.
'Operating System :: MacOS'],
install_requires=[
    'numpy>=1.13',
    'scipy>=0.19',
scipy is not needed I think?
indeed, but scikit-learn expects it and it is not installed as a dependency
Maybe better to read from requirements.txt here? If you install scikit-learn with conda, scipy will get installed... Also, joblib is imported but is not in the requirements...
I agree it's a duplication, but I don't know what is best regarding PyPI.
Concerning joblib, I moved the imports to get rid of the dependency.
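The pattern referred to here can be sketched as a local import: joblib is only imported when the helper actually runs, so it is no longer a hard install-time dependency. The function below is hypothetical, not the actual rampwf code:

```python
def dump_predictions(predictions, path):
    """Hypothetical helper illustrating the deferred-import pattern."""
    # Imported here rather than at module level, so merely importing
    # this module does not require joblib to be installed.
    import joblib
    joblib.dump(predictions, path)
```

The trade-off is that a missing joblib only surfaces at call time, as an ImportError inside the function, rather than at import time.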
let's leave them separate, the small duplication is not that bad. There is a slight ideological difference between requirements.txt (deploy specific versions) and install_requires (specify minimum dependencies the deployed environment should adhere to). See https://packaging.python.org/discussions/install-requires-vs-requirements/
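That distinction can be illustrated side by side; the exact pinned versions below are illustrative, extrapolated from the minimum bounds discussed in this PR:

```python
# install_requires: loose lower bounds declared in setup.py, describing
# what any environment must provide for the package to work.
install_requires = [
    'numpy>=1.13',
    'scipy>=0.19',
    'scikit-learn>=0.18',
]

# requirements.txt: exact pins used to reproduce one specific
# deployment environment.
requirements_txt = """\
numpy==1.13.1
scipy==0.19.1
scikit-learn==0.18.2
"""
```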
setup.py (outdated)
install_requires=[
    'numpy>=1.13',
    'scipy>=0.19',
    'scikit-learn>=0.18'],
although pandas is not a strict requirement (it is currently not directly imported), it is still used a lot in the code (code that assumes pandas data structures), so I would therefore add it here
fair enough
what should the required version be?
Hmm, not sure. Let's maybe take 0.19 for now?
(Actually, we should in principle also test against all minimal dependencies, but that can be for another PR.)
I also think it might be better to split the requirements parsing into another PR, so the actual setup.py updates can already be merged (or, for now, still install some of the extra deps in travis.yml until all starting kits have gotten a requirements file).
(force-pushed from 2602163 to bf1505e)
Only the Python 2 build fails because of the following import error.
This is specific to tensorflow and could maybe be solved easily by installing the … Any idea @kegl?
Have never seen it. Pinging @mehdidc :)
Codecov Report
@@ Coverage Diff @@
## master #50 +/- ##
==========================================
+ Coverage 92.13% 92.14% +0.01%
==========================================
Files 51 51
Lines 1195 1197 +2
==========================================
+ Hits 1101 1103 +2
Misses 94 94
Continue to review full report at Codecov.
Strange thing is that …
Mmmh, I find several complaints online regarding … Both libraries' versions seem to be tightly linked, so what I can think of is:
@aboucaud in pollenating insects 3, we use tensorflow 1.3.0 and protobuf 3.4.0
The MNIST ramp-kit build and the ramp-workflow build are identical, but one fails and the other passes...
Now I'm really confused... 😩
Maybe try to list protobuf explicitly in the requirements for MNIST? (I don't know why that would matter, but in all the issues/questions about this online I saw a lot of 'solutions' (like just reinstalling) for which I don't know why they would matter either...)
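If one wanted to try that, the MNIST kit's requirements.txt could pin both packages explicitly, with 3.4.0 being the protobuf version reported working alongside tensorflow 1.3.0 in pollenating insects 3. This is a guess at a fix, not a confirmed one:

```
tensorflow==1.3.0
protobuf==3.4.0
```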
At this point I really don't understand the difference between the two builds (MNIST alone or called by the ramp-workflow tests) on Python 2.7.
Comparing the two last builds, there is a difference in the protobuf version (3.3 vs 3.4).
Also, to really see whether the same works on MNIST, I think we need to adapt the travis.yml file on MNIST to use the requirements.txt file.
I tried this change this morning to see if that specific version was causing trouble. Apparently not.
I agree that's what we should do, but it also means taking care of installing the proper requirements for ramp-workflow as long as this PR is not merged.
Trying that in ramp-kits/MNIST#3
(force-pushed from 7cfdd0d to 5307547)
@aboucaud you can remove that last commit again. I just tried to make it even more comparable to how the env is set up in the MNIST repo, but this attempt also fails with the same problem.
This PR intends to clean up the dependency tree of ramp_workflow.
The idea is to trim off all dependencies related to a particular ramp-kit.
The ramp-kits will therefore need a specific requirements.txt file listing their dependencies, which will then be installed using pip or conda.
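A minimal sketch of what reading such a kit requirements.txt could look like; the helper name is hypothetical, and rampwf's actual requirements parsing may differ:

```python
def parse_requirements(path):
    """Hypothetical helper: return one requirement specifier per
    non-empty, non-comment line of a kit's requirements.txt."""
    reqs = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and '#' comment lines.
            if line and not line.startswith('#'):
                reqs.append(line)
    return reqs
```

The resulting list could then be fed to pip (or mapped to conda packages) by the test runner.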