
Fast binary-cached build of clawpack dependencies #499

Merged: 6 commits merged into master from ahmadia/fast_hashdist on Jan 19, 2015

Conversation

@ahmadia (Member) commented Jan 18, 2015

We should discuss this before merging it, as well as figure out what clawpack dependencies we want to put in.

@coveralls (Coverage Status): Coverage remained the same when pulling d07d8ea on ahmadia/fast_hashdist into 81166c7 on master.


@ahmadia (Member Author) commented Jan 18, 2015

Just a summary of what I've done:

  • Tried to build a Python Travis VM following the Chef cookbook: https://github.com/travis-ci/travis-cookbooks. This failed due to errors that I wasn't able to solve.
  • Installed an Ubuntu 12.04 server VM, installed the basic dependencies (git, gfortran, g++, gcc), then built a binary cache of the PETSc Clawpack dependencies in /home/travis/.hashdist/bld. Tarballed the built dependencies and uploaded them to my Dropbox account at: https://dl.dropboxusercontent.com/u/65439/pyclaw_hashdist_bld.tar.gz
  • Updated the version of hashstack we're using to something very recent. I left the version of hashdist untouched, but I should probably bump that as well.

Everything looks like it's working, but I don't think I have a complete dependency list. I guess the idea would be to use hashdist now as the only test environment (don't run separate pyclaw/petclaw tests). I can do this, but I'd like to put together a list of all dependencies before we merge this.

Also, I suspect that the parallel tests will screw up the coverage output, so we should run the parallel tests first and the serial tests afterward to generate correct coverage scores.
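A rough sketch of how this could look on the Travis side, assuming the cache is pulled straight from the Dropbox URL above and unpacked into ~/.hashdist; the tarball handling, test commands, and source paths below are illustrative, not necessarily what this branch does:

```yaml
# .travis.yml (sketch only; paths and commands are illustrative)
install:
  # Fetch the prebuilt hashdist binary cache instead of rebuilding ~45 minutes of dependencies
  - wget https://dl.dropboxusercontent.com/u/65439/pyclaw_hashdist_bld.tar.gz
  - mkdir -p $HOME/.hashdist
  - tar -xzf pyclaw_hashdist_bld.tar.gz -C $HOME/.hashdist

script:
  # Run the parallel (petclaw) tests first, then the serial tests,
  # so the serial run produces the coverage data that gets reported.
  - mpirun -n 4 nosetests src/petclaw
  - nosetests --with-coverage src/pyclaw
```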

@ahmadia (Member Author) commented Jan 18, 2015

@certik - You might be interested in seeing this. The binary cache of a 45 minute hashdist build takes about 2 seconds to download from Dropbox to Travis.

@ketch (Member) commented Jan 18, 2015

Wow, 4-minute builds on Travis. This is exciting! I think the only dependency you are missing is scipy (which is why tests are still being skipped). Here is my list:

  • gfortran
  • MPI
  • numpy
  • matplotlib
  • scipy
  • h5py
  • PETSc
  • petsc4py
  • coverage
  • python-coveralls
  • clawutil, visclaw, riemann
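Most of those would live in the hashstack profile; the excerpt below is only a sketch (key names and package identifiers are not verified against the hashstack revision this PR uses). In the setup described above, gfortran/g++/gcc come from the system compilers, coverage and python-coveralls go in via pip, and clawutil/visclaw/riemann are the Clawpack repositories themselves rather than hashstack packages:

```yaml
# Hypothetical hashstack profile excerpt (illustrative only)
packages:
  mpi:
  numpy:
  scipy:
  matplotlib:
  h5py:
  petsc:
  petsc4py:
```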

@certik (Contributor) commented Jan 18, 2015

Nice!! We need to get binary builds officially supported.


@ahmadia (Member Author) commented Jan 18, 2015

I still can't get over how fast the binary caches are, four seconds to get the entire scipy + petsc4py + h5py stack loaded on Travis.

@ahmadia (Member Author) commented Jan 18, 2015

I left off python-coveralls/coverage because I wasn't sure whether they were needed on the Travis side or in our builds. I think we can safely just move the pip install coverage/coveralls commands to after we've exported the environment.
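A rough sketch of that ordering in .travis.yml; the environment-activation line is a placeholder, not the actual command used here:

```yaml
# Sketch: install the coverage tools only after the hashdist environment is active,
# so they land in (and run against) the same Python the tests use.
install:
  - source activate-hashdist-env.sh        # placeholder for however the built environment gets exported
  - pip install coverage python-coveralls  # now installs into the active environment
```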

@coveralls (Coverage Status): Coverage remained the same when pulling b80a2cc on ahmadia/fast_hashdist into 81166c7 on master.

@ahmadia changed the title from "[WIP] Fast binary-cached build of hashdist dependencies" to "Fast binary-cached build of clawpack dependencies" on Jan 18, 2015
@ahmadia (Member Author) commented Jan 18, 2015

Okay I think this is ready for review/merge assuming the 5-minute builds pass :)

@ahmadia (Member Author) commented Jan 18, 2015

By the way, it looks like this exposes a failing parallel 3-D test.

@ahmadia self-assigned this on Jan 18, 2015
@coveralls (Coverage Status): Coverage remained the same when pulling a92710f on ahmadia/fast_hashdist into 81166c7 on master.

@ahmadia (Member Author) commented Jan 18, 2015

That didn't work, hold on.

@coveralls (Coverage Status): Coverage remained the same when pulling ebb2c9c on ahmadia/fast_hashdist into 81166c7 on master.

@ahmadia (Member Author) commented Jan 18, 2015

@ketch - As far as I can tell, coverage tools are running correctly on the Travis side, but Coveralls is not reporting any coverage. I don't have more time to work on this project this week, and this is otherwise ready to merge. Can you take it from here?

@mandli (Member) commented Jan 18, 2015

Looks like something changed sizes in the parallel tests for some reason.

@ahmadia (Member Author) commented Jan 18, 2015

@mandli - I'm pretty sure that's a test that wasn't being run before. And that issue is probably easily fixed by correctly running the verification in parallel. We would have caught it sooner if we had a full testing stack running on Travis.

@rjleveque (Member) commented:

Great! It would be nice if this worked on SageMathCloud too.


@ketch (Member) commented Jan 19, 2015

I had already detected this test failure and posted an issue: #490

The problem is simply that our hdf5 writer does not work correctly in parallel.

I think Coveralls is not reporting coverage because the parallel tests are failing.

I think we should merge this PR and then fix that test afterward. The question I have is for @ahmadia: do you think it is a good idea to rely on your dropbox account for this? Should we upload the stack to clawpack.github.io or somewhere like that?

@ketch (Member) commented Jan 19, 2015

Also, @ahmadia, it is probably best to just skip this test in parallel on Travis for now, so we don't merge a broken build.

@ahmadia (Member Author) commented Jan 19, 2015

> I think we should merge this PR and then fix that test afterward. The question I have is for @ahmadia: do you think it is a good idea to rely on your dropbox account for this? Should we upload the stack to clawpack.github.io or somewhere like that?

If the Clawpack organization ends up getting some sponsored storage from Amazon, I would be happy to have the stack live there. It's okay on Dropbox for now.

The Dropbox account is on S3, and there's a good chance the Travis bots are spawning on EC2. Data transfer from S3 to an EC2 instance in the same region is actually free, but I couldn't tell you where my Dropbox storage or the Travis bots live. Otherwise, it's about 2 cents per GB. Given that the total hashstack build is about 150 MB, that works out to roughly 300 Travis builds per $1 of transfer costs. I'm guessing we don't cost Dropbox/Travis more than $5 a year based on this off-the-cuff calculation. I have a professional Dropbox account, so hopefully I'm paying for this sort of thing anyway :)
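(For concreteness, the arithmetic behind that figure: 0.15 GB per build × $0.02 per GB ≈ $0.003 per build, which is where the roughly 300 builds per dollar comes from.)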

Not quite a SkipTest, but this would be complicated to do with the gen_variants framework right now, so I'm going to leave it.
@ahmadia (Member Author) commented Jan 19, 2015

I can't quite run a SkipTest, but petclaw is now disabled for the Sedov test.

@coveralls (Coverage Status): Coverage remained the same when pulling 995c3b2 on ahmadia/fast_hashdist into 81166c7 on master.

ketch added a commit that referenced this pull request on Jan 19, 2015: "Fast binary-cached build of clawpack dependencies"
@ketch merged commit 09cd14f into master on Jan 19, 2015
@ahmadia (Member Author) commented Jan 19, 2015

@ketch - thanks for merging, looks like coveralls/coverage is still not working correctly.

@ahmadia deleted the ahmadia/fast_hashdist branch on January 19, 2015 at 23:08
@ketch (Member) commented Jan 20, 2015

@ahmadia Yes, but that seems to have been broken for a while now. Probably since I turned on doctests. Actually, it was never completely working correctly. I will raise an issue.
