Conversation

@altendky (Member) commented Oct 4, 2018

#4073

WIP for:

  • Deciding between loop/sleep and signal.alarm() for py2 timeout
  • Ok to throw away all the kwargs (other than timeout) to runpytest_subprocess()?
  • Is it really ok to now react to the timeout kwarg which was previously ignored?
  • Stock checklist below

Thanks for submitting a PR, your contribution is really appreciated!

Here's a quick checklist that should be present in PRs (you can delete this text from the final description, this is
just a guideline):

  • Create a new changelog file in the changelog folder, with a name like <ISSUE NUMBER>.<TYPE>.rst. See changelog/README.rst for details.
  • Target the master branch for bug fixes, documentation updates and trivial changes.
  • Target the features branch for new features and removals/deprecations.
  • Include documentation when adding new features.
  • Include new tests or update existing tests when applicable.

Unless your change is trivial or a small documentation fix (e.g., a typo or reword of a small section) please:

  • Add yourself to AUTHORS in alphabetical order;

@codecov (bot) commented Oct 4, 2018

Codecov Report

Merging #4078 into features will increase coverage by 0.04%.
The diff coverage is 95.65%.


@@             Coverage Diff              @@
##           features    #4078      +/-   ##
============================================
+ Coverage      94.5%   94.54%   +0.04%     
============================================
  Files           109      109              
  Lines         23920    23963      +43     
  Branches       2370     2375       +5     
============================================
+ Hits          22606    22657      +51     
+ Misses         1002      997       -5     
+ Partials        312      309       -3
Flag Coverage Δ
#doctesting 29.25% <8.69%> (+0.01%) ⬆️
#linux 94.4% <95.65%> (+0.04%) ⬆️
#nobyte 0% <0%> (ø) ⬆️
#numpy 28.16% <8.69%> (+0.01%) ⬆️
#pexpect 0% <0%> (ø) ⬆️
#py27 92.66% <84.78%> (+0.02%) ⬆️
#py34 92.1% <73.91%> (-0.01%) ⬇️
#py35 92.11% <73.91%> (-0.01%) ⬇️
#py36 92.68% <73.91%> (ø) ⬆️
#py37 92.31% <73.91%> (-0.01%) ⬇️
#trial 31.18% <17.39%> (+0.02%) ⬆️
#windows 93.86% <95.65%> (ø) ⬆️
#xdist 18.68% <8.69%> (-0.01%) ⬇️
Impacted Files Coverage Δ
testing/test_pytester.py 87.15% <100%> (+0.75%) ⬆️
src/_pytest/pytester.py 86.27% <93.33%> (+0.25%) ⬆️
src/_pytest/terminal.py 91.62% <0%> (+1.74%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 29d3fae...48dcc67.

@nicoddemus (Member) left a comment

Great initial work @altendky!

Please also add a CHANGELOG entry.

As I commented in the PR, I would rather avoid including the monotonic dependency if possible, but if we decide to keep it, we should add a trivial CHANGELOG entry mentioning the new dependency.

    except subprocess.TimeoutExpired:
        raise self.TimeoutExpired()
else:
    end = monotonic.monotonic() + timeout

Member:

I don't think we need to introduce a new dependency to get an accurate timeout here, TBH. I think using time.time() on Linux and time.clock() on Windows will give us good enough accuracy? To be clear, I like having the two implementations separate like this; it makes it easier to drop the Python 2-only code later. I'm only questioning the external dependency used to achieve that.
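
For reference, the clock selection being suggested could look roughly like this (an illustrative sketch only; the PR ultimately just used time.time(), and read_clock is an assumed name):

import sys
import time

# Pick a "good enough" clock per platform instead of depending on the
# external monotonic package. time.clock() only exists on Python 2 and
# early Python 3, hence the hasattr() guard; on Windows it reads a
# higher-resolution wall-clock timer.
if sys.platform.startswith("win") and hasattr(time, "clock"):
    read_clock = time.clock
else:
    read_clock = time.time

end = read_clock() + 10  # e.g. a 10 second deadline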

Member Author:

Assuming nobody goes changing the system clock during a test... sure. I expected resistance to the dependency but defaulted to using the 'proper' thing (here, anyway). Sure, I'll remove it.


remaining = end - monotonic.monotonic()
if remaining <= 0:
    raise self.TimeoutExpired()

Member:

I suggest that we add a message here with the command and how many seconds the timeout was.

Member Author:

Makes sense, will do.
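
For reference, a minimal sketch of what such a message could look like (check_timeout and cmdargs are illustrative names, not the PR's actual code):

import time


class TimeoutExpired(Exception):
    pass


def check_timeout(cmdargs, end, timeout):
    # Include the command and the configured timeout in the message so a
    # timed-out run is easy to diagnose from the failure output.
    if time.time() > end:
        raise TimeoutExpired(
            "{} timed out after {} seconds".format(cmdargs, timeout)
        )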

args = self._getpytestargs() + args
return self.run(*args)

if "timeout" in kwargs:

Member:

This is a bit simpler:

return self.run(*args, timeout=kwargs.get("timeout"))

Member Author:

Given the verbosity of what I did, it's probably worthwhile, but it's not quite the same. My way is independent of how the defaulting is handled in the called function; yours depends on the default being triggered by passing None. But sure, will do.
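
For reference, the two forwarding styles side by side (illustrative names only; neither is the PR's exact code):

def run(*cmdargs, **kwargs):
    # Stand-in for Testdir.run(); pops an optional timeout keyword.
    timeout = kwargs.pop("timeout", None)
    return cmdargs, timeout


def forward_if_given(*args, **kwargs):
    # Only forwards timeout when the caller supplied it, so run() keeps
    # full control over its own default.
    if "timeout" in kwargs:
        return run(*args, timeout=kwargs["timeout"])
    return run(*args)


def forward_with_get(*args, **kwargs):
    # Shorter, but assumes run() treats timeout=None the same as
    # "not passed at all".
    return run(*args, timeout=kwargs.get("timeout"))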

assert testdir.runpytest_subprocess(testfile).ret == EXIT_OK


def test_testdir_run_timeout_expires(testdir):

Member:

👍

@asottile (Member) left a comment

oops forgot to press submit, some of this is duped from what @nicoddemus said so feel free to ignore those :)

setup.py (outdated)
"attrs>=17.4.0",
"more-itertools>=4.0.0",
"atomicwrites>=1.0",
"monotonic",

Member:

This is only used on Python 2.x, so it should be a conditional dependency.
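
For reference, a conditional dependency could be expressed with a PEP 508 environment marker in setup.py (a sketch only; in the end the dependency was dropped rather than made conditional):

# Sketch of a conditional dependency so monotonic would only be
# installed on Python 2.
INSTALL_REQUIRES = [
    "attrs>=17.4.0",
    "more-itertools>=4.0.0",
    "atomicwrites>=1.0",
    'monotonic;python_version<"3"',
]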

Member Author:

I left it this way because then you can access monotonic consistently via monotonic.monotonic(), though sure, at this point we only use it for one thing and only on py2 (and it seems it is going away anyway).

return popen

def run(self, *cmdargs):
def run(self, *cmdargs, **kwargs):

Member:

Should this be timeout=None instead of allowing arbitrary kwargs? I notice they aren't checked, so it could potentially allow .run(foo='bar') even though foo isn't used and would otherwise be a TypeError.

Member Author:

I'd love to do that, and did do that, but py2. This re-surprises me each time I find it.

https://repl.it/@altendky/pytest-devpytest4078-001

def f(*args, something=None):
    pass
Python 2.7.10 (default, Jul 14 2015, 19:46:27)
[GCC 4.8.2] on linux
   
Traceback (most recent call last):
  File "python", line 1
    def f(*args, something=None):
                         ^
SyntaxError: invalid syntax

But yes, if we are concerned about extras getting lost I can add some validation. I was maybe being a bit loose with it given elsewhere in the file there's a random **kwargs that is (was?) entirely unused.

Member:

Oh, here's how to do keyword-only arguments in Python 2 + 3:

def f(**kwargs):
    arg = kwargs.pop('arg', None)
    if kwargs:
        raise TypeError('Unexpected arguments: {}'.format(', '.join(sorted(kwargs))))
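
Applied to the method under discussion, that pattern might look roughly like this (an illustrative sketch, not the merged code):

class Testdir(object):
    def run(self, *cmdargs, **kwargs):
        # Emulated keyword-only argument, so Python 2 callers get a clear
        # TypeError for unsupported keywords instead of silent acceptance.
        timeout = kwargs.pop("timeout", None)
        if kwargs:
            raise TypeError(
                "Unexpected keyword arguments: {}".format(
                    ", ".join(sorted(kwargs))
                )
            )
        return cmdargs, timeout  # placeholder for the real implementation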

@coveralls

Coverage Status

Coverage decreased (-0.009%) to 93.802% when pulling fe7050b on altendky:4073-altendky-subprocessing_timeout into 29d3fae on pytest-dev:features.

@coveralls commented Oct 4, 2018

Coverage Status

Coverage decreased (-0.009%) to 93.803% when pulling 48dcc67 on altendky:4073-altendky-subprocessing_timeout into 29d3fae on pytest-dev:features.

@altendky (Member Author) commented Oct 4, 2018

@nicoddemus and @asottile, thanks for the interest and feedback. The main thing I dislike about this is rolling my own timeout. subprocess32 was recommended, but I can't say I like the comment "It has not been tested by the author on Windows." Though they do apparently test it on AppVeyor, so perhaps the comment should just be ignored. How do you feel about that as a dependency as compared with my super simplistic timeout?

@nicoddemus (Member)

Thanks @altendky for the quick response!

> How do you feel about that as a dependency as compared with my super simplistic timeout?

I'm OK with the implementation you posted. 👍

@altendky (Member Author) commented Oct 5, 2018

@nicoddemus, also, there are several other functions that could expose and pass through a timeout parameter. Should they all? That would be my tendency. Tests for each or just the one that actually implements the timeout?

@nicoddemus (Member) commented Oct 5, 2018

I would only add the timeout parameter to runpytest_subprocess() in addition to run(), as I believe those are the ones where the timeout parameter will be most useful. We can add it to other methods later as the need arises.

@altendky (Member Author) commented Oct 5, 2018

> Thanks @altendky for the quick response!
>
> > How do you feel about that as a dependency as compared with my super simplistic timeout?
>
> I'm OK with the implementation you posted.

Too bad it's horrible. It waits 90% of the timeout even if the process finishes immediately. The first option is just to have a (parameter-settable?) resolution for waiting, maybe defaulting to 0.1 seconds. The second option is maybe something with signal.alarm. I'm coding them both to see what they look like.
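
For reference, the looping option might look roughly like this (a sketch only; poll_interval is an assumed name for the resolution mentioned above):

import time


class TimeoutExpired(Exception):
    pass


def wait_with_deadline(popen, timeout, poll_interval=0.1):
    # Poll the child instead of blocking for the full timeout: return as
    # soon as the process exits, raise once the deadline passes.
    end = time.time() + timeout
    while popen.poll() is None:
        if time.time() > end:
            raise TimeoutExpired()
        time.sleep(poll_interval)
    return popen.returncode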

@altendky (Member Author) commented Oct 5, 2018

900cef6

So signal.alarm() just feels slick, in a really bad way. Too much global dependence and effect for my taste. I vote for the looping. How about you @nicoddemus and @asottile?

@asottile (Member) commented Oct 5, 2018

yeah, probably going to need a loop anyway; there are no signals on windows :)

@asottile (Member) commented Oct 5, 2018

would it be the worst to only support this behaviour in python3+?

@altendky (Member Author) commented Oct 5, 2018

You think the code sitting there is going to cause enough trouble that we don't want to deal with maintaining it? But in my case it's for pytest-twisted, which supports 2.7, so I do want it, yes. :]

@asottile (Member) commented Oct 5, 2018

ah ok I meant more because this is going to be hard to get right / working on python2 (especially python2 windows) 😆

@altendky (Member Author) commented Oct 5, 2018

Do we really want to be throwing away all the kwargs (other than timeout, now) passed to runpytest_subprocess()?

https://travis-ci.org/pytest-dev/pytest/jobs/437442250#L950

@altendky (Member Author) commented Oct 5, 2018

I'm almost satisfied. Just the two remaining questions left in the original message for tracking.

  • Ok to throw away all the kwargs (other than timeout) to runpytest_subprocess()?
  • Is it really ok to now react to the timeout kwarg which was previously ignored?

@nicoddemus (Member)

@altendky I think so, the patch looks good as is to me. 👍

@asottile (Member) left a comment

otherwise looks great -- thanks for the patch!

break

remaining = end - time.time()
if remaining <= 0:

Member:

I would write if time.time() > end: myself, but this is equivalent logic

Member Author:

That's a leftover from when I actually used remaining elsewhere. Thanks for catching it.

@blueyed (Contributor) commented Oct 5, 2018

@altendky

> Is it just me or is Travis being weird? Seems to be cancelling one build and then starting another.

It cancels previous builds for new commits, assuming that you have pushed fixups. It is a Travis setting, enabled for pytest.

@altendky (Member Author) commented Oct 5, 2018

@blueyed thanks but I think it isn't that. Both builds are for the same commit and start at the same time as far as I can tell. The GitHub build links start out pointing at the canceled build.

https://travis-ci.org/pytest-dev/pytest/builds/437650065
https://travis-ci.org/pytest-dev/pytest/builds/437650097

@blueyed (Contributor) commented Oct 5, 2018

Yeah, looks weird (same commit). Not sure. Likely related to this feature though I guess.

@altendky (Member Author) commented Oct 5, 2018

@blueyed there are a lot of duplicates including for other people as well, though some of them actually fully ran both builds instead of canceling.

https://travis-ci.org/pytest-dev/pytest/pull_requests

anyways...

@altendky changed the title from "[WIP] Initial pass at timeout for subprocessing pytest" to "Initial pass at timeout for subprocessing pytest" on Oct 5, 2018
@altendky changed the title from "Initial pass at timeout for subprocessing pytest" to "Add timeout for Testdir.runpytest_subprocess() and Testdir.run()" on Oct 5, 2018
@altendky (Member Author) commented Oct 6, 2018

I really don't like that we are already getting silly timeout errors.

https://travis-ci.org/pytest-dev/pytest/jobs/437814901#L959

An empty test is taking more than 10 seconds? Should I just massively extend these times (to a minute or five or so) so they don't fail?

@blueyed (Contributor) commented Oct 6, 2018

@altendky
Is this because of Travis being slow sometimes?
I have not investigated this PR, but shouldn't this failure also hide the internal traceback? (__tracebackhide__ = True)
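
For reference, __tracebackhide__ is a local flag that pytest's traceback filtering honours; setting it in the helper that raises keeps the internal frame out of the test's failure output (a minimal illustration, not the PR's code):

import time


class TimeoutExpired(Exception):
    pass


def _raise_if_expired(end, timeout):
    # pytest hides frames whose local __tracebackhide__ is truthy, so the
    # user sees the timeout at the call site rather than inside pytester.
    __tracebackhide__ = True
    if time.time() > end:
        raise TimeoutExpired("timed out after {} seconds".format(timeout))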

@altendky (Member Author) commented Oct 6, 2018

@blueyed, good call, I forgot about that feature. Thanks. I'll review that when I get a chance.

@altendky changed the title from "Add timeout for Testdir.runpytest_subprocess() and Testdir.run()" to "[WIP] Add timeout for Testdir.runpytest_subprocess() and Testdir.run()" on Oct 6, 2018
@altendky (Member Author) commented Oct 7, 2018

@blueyed, I assume it was random slowness since only one job failed. Certainly we want the tests to pass quickly, but we can probably afford to occasionally wait a long time to avoid having flaky tests (so long as test integrity is maintained).

@altendky (Member Author) commented Oct 7, 2018

I forgot I had permissions to rebuild, and now it passes. So yeah, intermittent... :[ I hate time.

@altendky changed the title from "[WIP] Add timeout for Testdir.runpytest_subprocess() and Testdir.run()" to "Add timeout for Testdir.runpytest_subprocess() and Testdir.run()" on Oct 7, 2018
@altendky (Member Author) commented Oct 7, 2018

I'm not used to being able to merge things myself... Am I supposed to do that now? Or wait for someone else to merge it? It's no matter to me, I can wait.

@nicoddemus (Member)

Oh you are free to merge after at least one reviewer has approved and there are no more comments to address. 😁

I would merge it now, but please do the honors. 🎉

@altendky merged commit a6fb4c8 into pytest-dev:features on Oct 7, 2018
@nicoddemus (Member)

Awesome, thanks @altendky!


start = time.time()
result = testdir.runpytest_subprocess(testfile, timeout=10)
result = testdir.runpytest_subprocess(testfile, timeout=120)

Contributor:

@altendky
Wouldn't that still fail with assert duration < 5 below?

Contributor:

Unfortunately the build failure is gone, likely because it was restarted?! (It would be good to copy and paste those in the future.) (It was for py27-pluggymaster.)

Member Author:

:[ Yes, I should have extended that as well. Or, really, calculated one number from the other to avoid this error in the future.

How should I proceed? Push a new commit to the branch? Start a new branch?

And :[ again. I did double check that my link to the broken build was for a specific build. I didn't realize that Travis was going to reuse that number and effectively break the link. Lesson learned. sigh Don't restart builds, trigger a new build (empty commit, close/reopen PR, something).
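
For reference, one way the follow-up could couple the two numbers (an illustrative sketch, not the actual follow-up change; the test name is made up):

import time


def test_runpytest_subprocess_is_faster_than_timeout(testdir):
    # testdir comes from pytest's pytester plugin. Derive the assertion
    # bound from the timeout so the two values cannot drift apart again;
    # the factor of 2 is arbitrary slack for slow CI workers.
    timeout = 120
    testfile = testdir.makepyfile("def test(): pass")
    start = time.time()
    result = testdir.runpytest_subprocess(testfile, timeout=timeout)
    duration = time.time() - start
    assert result.ret == 0
    assert duration < timeout / 2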

@nicoddemus (Member)

> How should I proceed? Push a new commit to the branch? Start a new branch?

As this has already been merged, I believe we need to start a new branch.

@blueyed (Contributor) commented Oct 8, 2018

I think it is fine for now; I just noticed this.
After all, it should never take that long in the first place, and if it does fail in the future we might be able to debug it from there, since the assertion (5s) would fail then.

@altendky (Member Author) commented Oct 8, 2018

I'd rather fix it now than leave a wrong and confusing thing sitting around. Unless either of you want to discourage correcting it.

@nicoddemus (Member)

> I'd rather fix it now than leave a wrong and confusing thing sitting around. Unless either of you want to discourage correcting it.

If you have the time, I personally prefer this to be fixed now. 😁
