
Run code coverage on v2 unit tests. #9919

Merged: 1 commit merged on Jul 9, 2020
Conversation


@asherf asherf commented May 30, 2020

Problem

V2 pytest code coverage doesn't have enough tests so it is fragile.

Solution

Long term we should have tests, but for now, just enabling it and dogfooding it will expose issues

Result

Prevent regressions like: #9914

@asherf asherf force-pushed the jerry branch 3 times, most recently from eca362a to 4a9ae67 Compare May 31, 2020 01:38

asherf commented May 31, 2020

It repros the bug I reported here: #9914
(screenshot: 2020-05-30 19:17:32)
Aborted CI, will rebase once a fix is merged.


asherf commented Jun 15, 2020

02:42:31 [INFO] Starting: Run Pytest for tests/python/pants_test/backend/graph_info/tasks:sort_targets
02:42:31 [INFO] Starting: Run Pytest for tests/python/pants_test/cache:cache_setup
02:42:31 [INFO] Starting: Run Pytest for tests/python/pants_test/base:exception_sink
02:42:31 [INFO] Starting: Run Pytest for tests/python/pants_test/base:validation
02:42:31 [INFO] Starting: Run Pytest for src/python/pants/backend/pants_info:tests
02:42:31 [INFO] Starting: Run Pytest for src/python/pants/backend/python:tests
02:42:43 [INFO] Completed: Run Pytest for tests/python/pants_test/util:fileutil
02:42:43 [INFO] Starting: Run Pytest for src/python/pants/python:tests
02:42:43 [ERROR] 1 Exception encountered:
Engine traceback:
  in `test` goal
  in Run Pytest for tests/python/pants_test/util:fileutil
Traceback (no traceback):
  <pants native internals>
Exception: Failed to execute process: 9-FAILED_PRECONDITION: "Output type mismatch"
Traceback (most recent call last):
  File "/home/travis/build/pantsbuild/pants/pants.pex/pants/bin/local_pants_runner.py", line 356, in run
    engine_result = self._maybe_run_v2(v2)
  File "/home/travis/build/pantsbuild/pants/pants.pex/pants/bin/local_pants_runner.py", line 236, in _maybe_run_v2
    return self._maybe_run_v2_body(goals, poll=False)
  File "/home/travis/build/pantsbuild/pants/pants.pex/pants/bin/local_pants_runner.py", line 260, in _maybe_run_v2_body
    poll_delay=(0.1 if poll else None),
  File "/home/travis/build/pantsbuild/pants/pants.pex/pants/init/engine_initializer.py", line 256, in run_goal_rules
    goal_product, params, poll=poll, poll_delay=poll_delay
  File "/home/travis/build/pantsbuild/pants/pants.pex/pants/engine/internals/scheduler.py", line 525, in run_goal_rule
    self._raise_on_error([t for _, t in throws])
  File "/home/travis/build/pantsbuild/pants/pants.pex/pants/engine/internals/scheduler.py", line 489, in _raise_on_error
    wrapped_exceptions=tuple(t.exc for t in throws),
pants.engine.internals.scheduler.ExecutionError: 1 Exception encountered:
Engine traceback:
  in `test` goal
  in Run Pytest for tests/python/pants_test/util:fileutil
Traceback (no traceback):
  <pants native internals>
Exception: Failed to execute process: 9-FAILED_PRECONDITION: "Output type mismatch"
02:42:44 [INFO] Completed: Run Pytest for tests/python/pants_test/util:xml_parser
02:42:44 [INFO] Completed: Run Pytest for tests/python/pants_test/util:rwbuf
[=== 00:00 CI BEGINS ===]

@Eric-Arellano @stuhood any idea why this fails?


stuhood commented Jun 15, 2020

"Output type mismatch" sounds like perhaps declaring an output as a file that is actually a directory: perhaps this:

?
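The failure mode described above can be illustrated with a small sketch. This is a hypothetical helper, not the engine's actual code: a process declares a relative path among its output files, but after the process runs that path turns out to be a directory, so capturing it fails in the same spirit as the `FAILED_PRECONDITION: "Output type mismatch"` error in the log.

```python
import os
import tempfile

def capture_declared_outputs(root, output_files):
    # Hypothetical sketch (not Pants engine code): read back the files a
    # process declared as outputs, failing when a declared *file* is
    # actually a directory, analogous to the FAILED_PRECONDITION above.
    captured = {}
    for rel in output_files:
        path = os.path.join(root, rel)
        if os.path.isdir(path):
            raise RuntimeError(f"Output type mismatch: {rel} is a directory, not a file")
        with open(path, "rb") as f:
            captured[rel] = f.read()
    return captured

# Demo: ".coverage" is declared as an output file, but a directory exists there.
with tempfile.TemporaryDirectory() as root:
    os.mkdir(os.path.join(root, ".coverage"))
    try:
        capture_declared_outputs(root, [".coverage"])
    except RuntimeError as exc:
        print(exc)  # Output type mismatch: .coverage is a directory, not a file
```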


asherf commented Jun 20, 2020

> "Output type mismatch" sounds like perhaps declaring an output as a file that is actually a directory: perhaps this:
>
> ?

Not sure what you mean... this works fine in other repos. `.coverage` is a file (with the raw coverage results).

@Eric-Arellano
Contributor

I think the unit tests are timing out with coverage enabled because coverage makes the tests much slower. Run `./v2 test --use-coverage tests/python/pants_test/util:objects`, then `./v2 test tests/python/pants_test/util:objects`, then `./v2 test --use-coverage tests/python/pants_test/util:objects --coverage-py-filter=pants.util.objects`.

It’s very plausible that our plugin is responsible for most of the slowdown. Benjy’s changes mean that we will soon be able to remove the Pants plugin.


asherf commented Jun 23, 2020

OK. It is still super weird that it takes that long... but I can try to re-push this PR once those changes are merged.


stuhood commented Jun 23, 2020

> I think the unit tests are timing out with coverage enabled because coverage makes the tests much slower.

If we have a decent sense for how much overhead coverage adds, we could automatically adjust/multiply timeouts under coverage...
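That suggestion can be sketched in a few lines. The 4x multiplier below is an assumed placeholder for illustration, not a measured overhead:

```python
def effective_timeout(base_timeout, use_coverage, coverage_multiplier=4.0):
    # Sketch of the idea above: scale a test target's timeout when coverage
    # instrumentation is enabled. The 4.0 multiplier is an assumption for
    # illustration, not a measured number.
    if base_timeout is None:
        return None  # target has no timeout configured; leave it unlimited
    return base_timeout * (coverage_multiplier if use_coverage else 1.0)

print(effective_timeout(60, use_coverage=False))  # 60.0
print(effective_timeout(60, use_coverage=True))   # 240.0
```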


asherf commented Jun 23, 2020

It seems that src/python/pants/engine:tests is the one timing out; I'll run it locally with and without coverage to see what the overhead is.
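A minimal way to quantify that overhead is a generic timing helper (not part of Pants); the two callables would wrap the actual test invocations:

```python
import time

def slowdown_factor(run_baseline, run_with_coverage):
    # Generic timing helper (not part of Pants): run the same test suite
    # twice and report how many times slower the coverage-enabled run is.
    start = time.perf_counter()
    run_baseline()
    baseline = time.perf_counter() - start
    start = time.perf_counter()
    run_with_coverage()
    with_coverage = time.perf_counter() - start
    return with_coverage / baseline if baseline > 0 else float("inf")
```

With the callables wrapping `subprocess.run` invocations of `./v2 test` with and without `--use-coverage`, the returned factor would feed directly into a timeout multiplier like the one discussed above.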


asherf commented Jun 23, 2020

(screenshot: 2020-06-23 15:47:19)

This specific test hangs when running locally.

@asherf asherf force-pushed the jerry branch 2 times, most recently from ffa4083 to 9b53089 Compare June 29, 2020 02:24
@asherf asherf force-pushed the jerry branch 2 times, most recently from bc5ba91 to 86f63e8 Compare July 8, 2020 18:06

asherf commented Jul 8, 2020

@stuhood @Eric-Arellano this passes now.
I think this is worth merging.


stuhood commented Jul 9, 2020

I think that I agree. It would be nice to send it somewhere, if possible, but OK to follow up on that separately. @Eric-Arellano: thoughts?


asherf commented Jul 9, 2020

I can follow up on sending it to coveralls.

@Eric-Arellano
Contributor

I think it'd be great to have Coverage data overall. I'm wondering if we should first a) unify the integration and unit test shards as there is no reason to separate them with remote execution, and then b) turn on coverage for the whole project?

It may be misleading to only have coverage data from unit tests.


stuhood commented Jul 9, 2020

> I think it'd be great to have Coverage data overall. I'm wondering if we should first a) unify the integration and unit test shards as there is no reason to separate them with remote execution, and then b) turn on coverage for the whole project?
>
> It may be misleading to only have coverage data from unit tests.

+1 to merging the unit/integration shards. But I'm not sure it blocks... depends on whether coveralls/cover.io supports receiving multiple files per SHA?


asherf commented Jul 9, 2020

> I think that I agree. It would be nice to send it somewhere, if possible? But ok to follow up there perhaps? @Eric-Arellano : thoughts?

#10299

@stuhood stuhood merged commit 5144ece into pantsbuild:master Jul 9, 2020
@asherf asherf deleted the jerry branch July 9, 2020 03:40
stuhood pushed a commit that referenced this pull request Jul 17, 2020