What determines when codecov.io and coveralls are updated? #15191

Closed · pkofod opened this issue Feb 22, 2016 · 23 comments

@pkofod (Contributor) commented Feb 22, 2016

Looking at #11885 and at the README.md badges, I was wondering what triggers the coverage services. Both appear to be 1-2 months old as far as I can see, so I expect the numbers to be somewhat outdated given all the commits since then.

So, out of curiosity: when/how/why is it updated? I tried to search but couldn't find the answer.

@eschnett (Contributor) commented

It's updated automatically when you push to the GitHub repository. It's just currently broken; see e.g. JuliaCI/Coverage.jl#101 or JuliaLang/MbedTLS.jl#19.

@pkofod (Contributor, Author) commented Feb 22, 2016

Just what I needed to find; I just didn't know where to look (JuliaCI). Thanks.

@pkofod closed this as completed Feb 22, 2016
@eschnett (Contributor) commented

Does anybody know of an open (not-closed) issue tracking this?

@pkofod (Contributor, Author) commented Feb 22, 2016

I didn't see they were closed! Reopening.

@pkofod reopened this Feb 22, 2016
@eschnett (Contributor) commented

I just noticed that coverage submission is working again. If something is outdated, pushing any change to the respective repository should submit fresh results. Of course, this assumes that the Travis script contains the necessary magic.
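
That "magic" is usually an `after_success` step that calls Coverage.jl. Below is a minimal sketch of the submission calls, assuming Coverage.jl's documented API; the surrounding Travis YAML and any `cd` into the package directory are omitted here and vary per repository:

```julia
# Minimal sketch of the coverage-submission step, using Coverage.jl's
# documented entry points. Real Travis scripts wrap this in an
# `after_success:` step after a `julia --code-coverage` test run.
using Coverage

# Parse the *.cov files produced by the coverage-enabled test run
results = process_folder()

# Upload to either (or both) services
Coveralls.submit(results)
Codecov.submit(results)
```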

@tkelman (Contributor) commented Feb 23, 2016

Coverage runs for Base don't run on Travis; they run on the buildbot. However, like the nightlies, they are only triggered when all the test-running buildbots pass. The OSX buildbot has not had a passing build since Feb 1st; it has been timing out due to the LLVM 3.7 upgrade, #13412, etc. See https://build.julialang.org/builders/build_osx10.9-x64?numbuilds=250

@ViralBShah, do we have any ETA on getting better Mac hardware for that buildbot? Right now it's running on a VM of @staticfloat's; having more memory should help move things along.

@pkofod (Contributor, Author) commented Feb 23, 2016

Hm, you say it "has not had a passing build since Feb 1st", and I can see that from your link, but codecov has not been updated for "2 months" (https://codecov.io/github/JuliaLang/julia/commits) and coveralls.io has not been updated since Jan 3, 2016 (https://coveralls.io/github/JuliaLang/julia). Maybe I misunderstood you.

@tkelman (Contributor) commented Feb 23, 2016

This is just the latest in a series of buildbot problems, broken web stack on master, etc. The previous issues have now been fixed; the OSX issue is what's holding it up right this second.

@pkofod (Contributor, Author) commented Feb 23, 2016

Thank you for explaining. I was just curious about the current code coverage numbers. It makes contributing to the "let's get code coverage to XX%" issues a bit easier.

@tkelman (Contributor) commented Feb 23, 2016

I think you can run CoverageBase locally, but I haven't done it myself. Someone else may have to chime in with advice while we get the buildbots fixed back up.
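
For anyone who wants to try before the buildbots are fixed, here is a rough local recipe. This is a sketch only: the command line mirrors what the coverage buildbot runs, and CoverageBase.jl's actual entry points may differ.

```julia
# Sketch of a local Base coverage run (an assumption: this mirrors the
# buildbot's invocation, not necessarily CoverageBase.jl's own API).
# First collect data by running the tests with coverage on and
# inlining off:
#
#   julia --code-coverage=all --inline=no test/runtests.jl all
#
# Then summarize the generated .cov files with Coverage.jl:
using Coverage

covered, total = get_summary(process_folder("base"))
println("Base coverage ≈ ", 100 * covered / total, "%")
```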

@kshyatt (Contributor) commented Feb 23, 2016

I have run CoverageBase locally on OSX and got a replutil.jl test failure. If you want to know more, hit me up.

@tkelman (Contributor) commented Feb 23, 2016

Ref #14985 and #14986; that test is relatively new and probably needs adjusting to work under the coverage flags.

@StefanKarpinski (Member) commented

@kshyatt: is that the threadcall error you mentioned? If so, please yell at me until I fix it 😁.

@tkelman (Contributor) commented Feb 29, 2016

replutil, not threadcall. When inlining is off, the results are different: https://build.julialang.org/builders/coverage_ubuntu14.04-x64/builds/56/steps/Run%20non-inlined%20tests/logs/stdio
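
Tests whose expected output depends on inlining can check for the flag themselves. A hypothetical sketch: `Base.JLOptions().can_inline` is the real flag, but the guarded assertion below is only a placeholder, not the actual replutil test.

```julia
using Base.Test  # the Julia 0.4/0.5-era test module

# can_inline is 0 when julia was started with --inline=no, so output
# checks that rely on inlined frames can be skipped in coverage runs.
# The assertion is a placeholder, not the real replutil test.
if Base.JLOptions().can_inline == 1
    @test contains(sprint(showerror, ErrorException("boom")), "boom")
end
```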

@vtjnash (Member) commented Mar 8, 2016

Something between 9cda4ab and c6c6842 fixed the OSX buildbot. The replutil no-inline error should be fixed by 8a54ed5 now.

The current issue (https://build.julialang.org/builders/coverage_ubuntu14.04-x64/builds/66/steps/Test%20CoverageBase.jl/logs/stdio) now seems to be in Requests.jl (JuliaWeb/Requests.jl#105).

@tkelman (Contributor) commented Mar 8, 2016

I think it was a change on the buildbot itself rather than a change in Julia, but we've been limping along the past few days, successfully making binaries again. It's still quite slow, but we have the order in for a new Mac mini to run it on.

@pkofod (Contributor, Author) commented Jun 27, 2016

Sorry if the bump is annoying, but the coverage numbers are still off. Is this targeted to be fixed before the v0.5 release? I know the coverage tools don't explicitly add functionality, but it does seem quite important to know the extent to which the code is covered by tests. I know some people might be running these numbers locally or on their own repos, but it's not obvious how to find this information.

I tried to find the active issues regarding the failure to produce correct numbers, but all the issues and PRs I could find are either closed or merged, so it's not clear why it is currently failing.

Even if it might not be top priority, as key contributors might have an eye on the numbers in their own coveralls profiles, it could have an impact in the longer term with regard to attracting new contributors. At least, the posts on contributing to Julia by Leah Hanson and Katherine Hyatt were what gave me the "courage" to even submit a PR to Julia in the first place. If I were to follow the guide in #11885, I would probably be quite confused :)

@tkelman (Contributor) commented Jun 27, 2016

The bump is certainly called for. There are a handful of issues for specific test failures when tests are run with --inline=no, as they are for coverage runs, but I'm having trouble finding them at the moment. The current failure (see https://build.julialang.org/builders/coverage_ubuntu14.04-x64/builds/401/steps/Run%20non-inlined%20tests/logs/stdio for the latest and https://build.julialang.org/builders/coverage_ubuntu14.04-x64?numbuilds=250 for older logs) looks like a timeout during bitarray. There was a change on master that made that test much slower when run with inlining disabled.

@timholy (Member) commented Jun 27, 2016

I assume we have control of the buildbots? Why not just increase the timeout duration?

@staticfloat (Member) commented

We could, but this is getting a little excessive. The timeout is set to trigger if there is no output from the process for an hour. That means that the bitarray coverage test by itself is taking over an hour.

I've bumped up the timeout to two hours; we'll see if it makes it through now.

@timholy (Member) commented Jun 27, 2016

Ah. That does seem long. test_showline suggests that this loop is much (or perhaps all) of the culprit. If I'm reading it correctly, the inner loop runs v1^2/6 times on inputs of length v1, so it's of order v1^3 operations. A natural way to speed it up would be to decrease v1 from its current value of 260. Feasible, @carlobaldassi? I recognize that 64 is an important number when it comes to BitArrays, so maybe this won't work. What about iterating m1 and m2 more sparsely?
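
A quick back-of-the-envelope check of that estimate (the cost model comes from the comment above, not from measurement):

```julia
# ~v1^2/6 inner iterations, each over inputs of length v1,
# gives ~v1^3/6 operations overall.
cost(v1) = v1^3 / 6

cost(260) / cost(64)  # ≈ 67×, so shrinking v1 to 64 cuts ~98.5% of the work
```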

@vtjnash (Member) commented Jun 27, 2016

There's currently a MethodError occurring when it tries to upload the results: https://build.julialang.org/builders/coverage_ubuntu14.04-x64/builds/403/steps/Gather%20test%20results%20and%20Submit/logs/stdio

@vtjnash (Member) commented Jun 27, 2016

Since #16692, we consider all call-site devirtualization optimizations to be a subset of inlining. I suspect that makes the bitarray test (or any other test that exercises compiled optimizations) too expensive to finish in any reasonable amount of time. But while this provided a curious (and completely inadvertent) workaround for #265, it is probably much too aggressive.

@vtjnash closed this as completed Jul 25, 2016