
Travis miniconda #1523

Merged
merged 8 commits into statsmodels:master from bashtage:travis-miniconda on Apr 1, 2014

Conversation

4 participants
@bashtage (Contributor) commented Mar 27, 2014

Only one change, plus the miniconda Travis file. The sole change sets the backend to Agg, which prevents any display-related errors on Travis.
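
For context, the same effect can also be achieved programmatically; a minimal sketch (not part of this PR) of forcing the Agg backend before pyplot is imported:

```python
import matplotlib

# Select the non-interactive Agg backend before pyplot is imported anywhere;
# no X display or GUI toolkit (Tk/Qt) is needed, so this works on headless CI.
matplotlib.use("Agg")

import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1], [0, 1])
fig.savefig("smoke_test.png")  # rendering to a file still works without a display
```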

@bashtage (Contributor, Author) commented Mar 27, 2014

This failure was just a coveralls timeout.

@jseabold (Member) commented Mar 27, 2014

I think we should go ahead and have Cython 0.20.1 across all builds. 0.19.0 (maybe 0.18.0 too) should work, but it's easier to just require the one that works across all Python versions, I think. I updated our docs to say you should use 0.20.1 but older versions may work. The 0.17.x series definitely won't, AFAIK.
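
For illustration only, a guard of the kind a build script could use to enforce that minimum; this is a sketch, not necessarily the check statsmodels actually ships:

```python
# Hedged sketch: MIN_CYTHON reflects the minimum discussed here; older 0.19.x
# may work, and this is not the exact logic in statsmodels' setup.py.
from distutils.version import LooseVersion

MIN_CYTHON = "0.20.1"

try:
    import Cython
    has_min_cython = LooseVersion(Cython.__version__) >= LooseVersion(MIN_CYTHON)
except ImportError:
    has_min_cython = False

if not has_min_cython:
    raise SystemExit("Cython >= %s is required to build from source" % MIN_CYTHON)
```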

@@ -28,7 +28,7 @@
# You can also deploy your own backend outside of matplotlib by
# referring to the module name (which must be in the PYTHONPATH) as
# 'module://my_backend'
-backend : Qt4Agg
+backend : Agg

@jseabold (Member) commented Mar 28, 2014

I think this change will mess up our doc builds. I'll have to test it and see.

@bashtage (Contributor, Author) commented Mar 28, 2014

It might make sense to have a .doc and a .travis version if that is the case. But since I think it is just outputting graphics, Agg should work.

@jseabold (Member) commented Mar 29, 2014

We'll try it and see. Will be an easy fix and not the end of the world if the doc builds break. I just remember having to worry about this when I started them long ago.

@jseabold (Member) commented Mar 28, 2014

This will close #1500. If you didn't know, you can add 'closes #1500' to the commit message and GitHub will close the issue automatically and link it to the PR.

@bashtage bashtage closed this Mar 28, 2014

@bashtage bashtage reopened this Mar 28, 2014

@bashtage (Contributor, Author) commented Mar 28, 2014

Oops. I did not know that, but I also didn't pay attention to the two different buttons.

@jseabold (Member) commented Mar 28, 2014

I'm not sure we need to test all of these combinations. I also don't think we should have coveralls run in each instance. I think we should have one test that does our minimum requirements

Python  NumPy   SciPy   Cython | Patsy  Pandas  Matplotlib
   2.6  1.6.2   0.11.0  0.20.1 | 0.2.0  0.12.0  1.3.0

Then maybe 2 more with the latest releases from conda packaging

   2.7  Latest Release    0.20.1  | Latest Releases
   3.3
   3.4  Latest Release    0.20.1  |  Latest release

I'm not sure we need 3.3 and 3.4 (or if 3.4 is available on Travis). I think maybe just 3.4, if it's possible. I'd also run coverage only on the 2.7 branch.

@bashtage (Contributor, Author) commented Mar 28, 2014

For Python versions, I think all of 2.6, 2.7 and 3.3/3.4. I would also suggest that the accompanying packages for 2.6 should be generally older than those for 3.4, since if someone is on 3.4, you can be pretty sure they aren't worried about legacy support. Something like

2.6 with all minimums
2.7 two or three ways

  • conservative using similar/same as 2.6
  • moderate, say 1.7.x and 0.12.x
  • aggressive, using up-to-date packages

3.3/3.4 with most-up-to-date

If the no-2to3 patch gets in, then coverage will be identical except for things in compat.

It is important to test up-to-date since this is what someone who uses a binary installer will tend to get by default, and the only important difference will be whether the user is on 2.7 or 3.3/3.4. It also will show how subtle changes, like optimizer behaviour, affect things.

@jseabold (Member) commented Mar 28, 2014

So... we mostly agree? 2.6 with minimum version requirements, the rest with the most recent releases? I don't see any reason to retest minimum requirements for 2.7. Maybe moderate for 2.7, but I think it's fine to assume that those packages have backwards compatibility and use the latest releases. We just want to make sure our code works with the minimum requirements and isn't broken by changes on Travis, not that the intermediate versions are bug-free.

I don't know well enough the changes between 3.3 and 3.4 to say whether it's redundant. My view on 3.X is always that if a user is using 3.3 then they'll probably upgrade to 3.4 when available. That said, 3.4 is brand new.

@bashtage (Contributor, Author) commented Mar 28, 2014

I would guess that most people are on 2.7 now, so it might make sense to test more extensively there: something that would approximate 2.7 with typical platform-installed packages, and 2.7 using a binary installer.

I think once you get to 3.2+ it's all pretty much similar.

@jseabold (Member) commented Mar 28, 2014

Typical platform-installed packages is tough. That's why I'd say use whatever comes with conda or is available on PyPI. We get Ubuntu testing elsewhere from their packaging system.

@jseabold (Member) commented Mar 29, 2014

Let me know when (if) you have a chance to edit this. I need to rebase a few PRs on Travis with the new Cython. I can merge and then edit if that's preferable from your end.

@bashtage (Contributor, Author) commented Mar 29, 2014

Happy to edit, but not sure which edits you are referring to? The target packages? Or scaling back the build config?

bashtage pushed a commit to bashtage/statsmodels that referenced this pull request Mar 29, 2014

@coveralls commented Mar 29, 2014

Coverage Status

Coverage remained the same when pulling f755e00 on bashtage:travis-miniconda into 3456ced on statsmodels:master.

@jseabold (Member) commented Mar 29, 2014

Both. We need Cython >= 0.20.1 (might as well) and I think we only need 3-4 build configs. Your changes look good to me unless we want to add a 3.4 build into the mix assuming travis can be set up for it.

@josef-pkt what do you think? Look ok to merge?

@jseabold (Member) commented Mar 29, 2014

Any idea about the build failure on 2.6?

@josef-pkt (Member) commented Mar 29, 2014

conda didn't work for Python 3.

Overall it's worth a try to merge this; the version collection looks good to me, and it looks nice and simple enough.

If we get conda-specific test problems, we can switch back to something more "official".
(I'm not planning to figure out how conda works anytime soon.)

@bashtage (Contributor, Author) commented Mar 30, 2014

There are a few issues here:

  1. Segfault on Python 2.6, NumPy 1.6.2. This will take a bit more investigation on a local VM to see if I can replicate it. It may require some updates to work correctly.

  2. General failures on modern SciPy – these are driven by optimization changes which produce incorrect results for 3 tests.
    
@josef-pkt (Member) commented Mar 30, 2014

The Python 2.6 log shows that scipy is not properly installed; scipy linalg files are missing. That looks independent of statsmodels.

On 2: the gmm.poisson failures are known. I will go back to merging and looking at test failures tomorrow.

@bashtage (Contributor, Author) commented Mar 30, 2014

I understand this issue – conda is for some reason installing a version that expects MKL even though it shouldn't be.

@josef-pkt (Member) commented Mar 30, 2014

general question on the setup:
Is it possible to mix conda and apt-get: use apt-get for the ones that are available on Travis (Ubuntu LTS) and conda for the newer versions?

@bashtage (Contributor, Author) commented Mar 30, 2014

Unfortunately not – that said, it should be possible to have a Travis script that uses both separately (e.g. one on system Python, one on conda).

@bashtage (Contributor, Author) commented Mar 30, 2014

I didn't pay attention so I tried Pandas < 0.12, which produces test failures:

https://travis-ci.org/bashtage/statsmodels/jobs/21883086

So 0.12 must be a lower bound.

@josef-pkt (Member) commented Mar 30, 2014

grouputils are still WIP, so it might not be sufficient reason to increase the pandas version requirement. There are currently several failures on Ubuntu and Debian across versions, and I still need to see what's noise and what might be serious.

@bashtage (Contributor, Author) commented Mar 30, 2014

This PR results in failures, but these are to do with code not involved in the PR. It is working as expected, save one small issue in conda

conda/conda#629

which requires MKL for Blas. When this bug is patched, it is simple to remove the mkl package from the list.

@josef-pkt (Member) commented Mar 30, 2014

There still seems to be something wrong with Python 3.3: "No output has been received in the last 10 minutes, this potentially indicates a stalled build or something wrong with the build itself." That might indicate an endless loop.

Depending on the scipy version, endless loops were possible in optimization (I don't think there was one recently).
Depending on the version of LAPACK/BLAS in numpy, svd can hang on nan or inf (np.linalg.pinv).
Or it can be specific to the conda build.
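
As an aside, a small illustrative guard (not from statsmodels) that avoids handing non-finite values to pinv in the first place:

```python
import numpy as np

def safe_pinv(a):
    # Refuse to call pinv on nan/inf input, since some LAPACK/BLAS builds
    # can hang inside the underlying svd instead of raising an error.
    a = np.asarray(a, dtype=float)
    if not np.isfinite(a).all():
        raise ValueError("matrix contains nan or inf; not calling np.linalg.pinv")
    return np.linalg.pinv(a)
```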

@bashtage (Contributor, Author) commented Mar 30, 2014

I think that is just Travis-induced randomness. Build 207.3 errored in the usual time:

https://travis-ci.org/bashtage/statsmodels

@josef-pkt (Member) commented Mar 30, 2014

Good, this time the results look good; both the 2.6 and 3.3 errors already have separate issues.

Kevin Sheppard added some commits Mar 23, 2014

Kevin Sheppard
ENH: Miniconda-based testing on Travis-CI
Experimental branch that uses miniconda rather than platform
python to test.  Has some advantages in that it is simple to reconfigure
alternative combinations of NumPy, SciPy, Cython and pandas,
at least as long as they are provided by Anaconda.
Kevin Sheppard
ENH: Alternative travis script that uses Anaconda via Miniconda
Provides an alternative method to test on Travis using Miniconda
that has some advantages over the current system.

- All binary, so no time spent building
- No branching in the execution steps
- Support for up-to-date requirements which are important to test

Also includes a small change to tools/matplotlibrc which changes the backend
to Agg to avoid Tk-related errors on Travis.  Agg is always available and does not
depend on Qt or another toolkit.

Closes #1500
Kevin Sheppard
Fixed config for Python 3.3
Attempt to inspect segfault
Kevin Sheppard
Added 4th build configuration
New configuration will always be on the most current versions using Python 2.7
Fixed conda-related issue which allows for no MKL
@jseabold (Member) commented Apr 1, 2014

Good to merge? Can work on test failures separately afterwards.

@bashtage (Contributor, Author) commented Apr 1, 2014

It is ready.

@jseabold #1535 is trivial and will produce passes on 2.6.

The other failures are due to the known issue with newer SciPy.

jseabold added a commit that referenced this pull request Apr 1, 2014

Merge pull request #1523 from bashtage/travis-miniconda
MAINT: Use miniconda on Travis.

@jseabold jseabold merged commit e675de8 into statsmodels:master Apr 1, 2014

1 check failed: The Travis CI build failed

@josef-pkt (Member) commented Apr 1, 2014

Build results look good, but in the latest coveralls report for this PR there are no files.

@bashtage (Contributor, Author) commented Apr 1, 2014

That is expected - it only affects two non-Python files.

@josef-pkt (Member) commented Apr 1, 2014

The test suite still runs, and it should show "all files" > 0.

We can try rebasing a PR on master and see if it triggers coveralls.

@bashtage (Contributor, Author) commented Apr 1, 2014

I see you are correct - I hadn't been looking at coveralls on this one since I knew it wasn't changing Python code.

@josef-pkt (Member) commented Apr 1, 2014

The include in .travis_coveragerc might be wrong now.
What's the path to the installed package?
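
One way to answer that directly on the Travis worker, sketched below (the exact miniconda location is an assumption):

```python
import os
import statsmodels

# Print where the tested package actually lives; with miniconda this is
# typically somewhere under the miniconda site-packages rather than /usr/...,
# which is why a /usr/...-style include pattern in .travis_coveragerc would
# match nothing.
print(os.path.dirname(statsmodels.__file__))
```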

@josef-pkt (Member) commented Apr 1, 2014

Also: there are 2 Python 2.7 test runs; can we select coverage to run only on one of them?

@bashtage (Contributor, Author) commented Apr 1, 2014

Eventually – but it will need to use a different variable. I wanted to set one to =2 and the other to =2.7, but =2 selects 2.6 for some reason.
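
A sketch of that alternative, assuming a dedicated variable (the name COVERAGE and the run_tests.py entry point are hypothetical, not from this PR): set it on exactly one 2.7 matrix entry and branch on it in the test step, rather than matching the version string.

```python
import os
import subprocess
import sys

# Only the single matrix entry that defines COVERAGE=true reports coverage.
if os.environ.get("COVERAGE", "").lower() == "true":
    cmd = ["coverage", "run", "--rcfile=.travis_coveragerc", "run_tests.py"]
else:
    cmd = [sys.executable, "run_tests.py"]

sys.exit(subprocess.call(cmd))
```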

@bashtage (Contributor, Author) commented Apr 1, 2014

This is the issue – the rc has /usr/bin… but the package now resides under anaconda. I am trying a patch that uses /statsmodels/ instead.

@josef-pkt josef-pkt added the PR label Apr 14, 2014

PierreBdR pushed a commit to PierreBdR/statsmodels that referenced this pull request Sep 2, 2014

PierreBdR pushed a commit to PierreBdR/statsmodels that referenced this pull request Sep 2, 2014
