
Make it possible to filter the wdspec tests to run for a given file #7923

Open · wants to merge 1 commit into master
Conversation

@carlosgcampos (Contributor) commented Oct 20, 2017

Currently it's only possible to run all the tests in a given test file.
This patch makes it possible to pass an expression that filters which tests
to run for a given file, using the pattern test-file.py::expression.
It uses the -k <expression> pytest option, passing the provided
expression through to pytest.
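
For illustration, a minimal sketch of the behavior described above. The helper names here are hypothetical, not the actual wptrunner code; the only real API used is pytest.main and its -k option.

```python
import pytest

def split_test_path(path):
    """Split "test-file.py::expression" into (file, filter expression)."""
    if "::" in path:
        test_file, _, expression = path.partition("::")
        return test_file, expression
    return path, None

def run_wdspec_file(path):
    test_file, expression = split_test_path(path)
    args = [test_file]
    if expression is not None:
        # -k takes a pytest match expression, not a literal test name
        args.extend(["-k", expression])
    return pytest.main(args)

# e.g. run_wdspec_file("test.py::test_foo or test_bar")
#   -> pytest.main(["test.py", "-k", "test_foo or test_bar"])
```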



@w3c-bots

Build PASSED

Started: 2017-10-20 15:31:28
Finished: 2017-10-20 15:54:24


@jgraham (Contributor) commented Oct 23, 2017

Reviewed 4 of 4 files at r1.
Review status: all files reviewed at latest revision, 1 unresolved discussion, all commit checks successful.


tools/wptrunner/wptrunner/testloader.py, line 346 at r1 (raw file):

            self.manifest.set_defaults()

        self._test_filters = defaultdict(lambda: None)

I think this should be defaultdict(list); we should .append the filters, and when passing to pytest we should " or ".join(filters).
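
A minimal sketch of the accumulation suggested here, with illustrative names rather than the actual testloader.py code:

```python
from collections import defaultdict

# Collect filter expressions per test file; multiple filters for the
# same file are OR-ed together before being handed to pytest's -k option.
test_filters = defaultdict(list)

def add_filter(test_file, expression):
    test_filters[test_file].append(expression)

def combined_expression(test_file):
    filters = test_filters[test_file]
    # ["test_foo", "test_bar"] -> "test_foo or test_bar"
    return " or ".join(filters) if filters else None
```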



@carlosgcampos (Contributor, Author)

@jgraham do you think it's worth it? With the current patch the user can provide any expression as a filter, so you could simply do test.py::test_foo or test_bar. Making it possible to do the same by passing test.py::test_foo test.py::test_bar only makes the code more complex. Note that you can pass any Python expression to pytest with the -k switch, not only "or" expressions.

@jgraham (Contributor) commented Oct 24, 2017

You can, but the argument parsing needed to make that work would be awkward, since space-separated arguments are interpreted as different paths. I think it's worth making this work with the model that what you pass in is a list of paths/test names rather than an arbitrary Python expression.

@carlosgcampos (Contributor, Author)

It's as easy as using "test.py::test_foo or test_bar", at least on Unix. Note that we can't really use test names because, AFAIK, the only way to filter the tests to run with pytest (apart from using marks) is the -k switch, and it doesn't take test names but an expression. I think it's important that the user understands that what we pass is not a test name but an expression handed directly to pytest, to avoid confusion. If a file contains test_foo and test_foo_with_bar, then passing test_foo will also run test_foo_with_bar. We need a different expression, "test_foo and not test_foo_with_bar", for only test_foo to run. We don't know that in advance, so we can't generate expressions that let the user pass bare test names.
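
To make the over-matching concrete, here is a minimal illustration using the names from the comment above (the pytest invocations are shown as comments):

```python
# test.py: two tests whose names share a prefix
def test_foo():
    assert True

def test_foo_with_bar():
    assert True

# -k performs expression/substring matching, not exact name matching:
#   pytest test.py -k "test_foo"
#       runs both test_foo and test_foo_with_bar
#   pytest test.py -k "test_foo and not test_foo_with_bar"
#       runs only test_foo
```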

@jgraham (Contributor) commented Oct 24, 2017

Sure, but I feel like this is coupling things rather tightly to the specific implementation in pytest. One can at least imagine other test types allowing similar filtering, and in that case they wouldn't necessarily be Python expressions. I'm sure we can get pytest to support this somehow, since it allows installing custom hooks etc.
