test failures ... #339
Shouldn't we start to fix the test suite?
I'm also in favor of switching to pytest, and willing to spend some time doing so. Any objections?
Thanks for the info. Comments:
If you don't need a test you can remove it. But sometimes a test is broken and we would like to keep it, yet we can't fix it because it uses an external service which is down, or it is simply not a module we feel responsible for. By skipping it we don't block the test run ... we still have a reminder and the chance to fix it later. pytest also runs the doctests ... I don't know how to add the skip decorator there.
We have needed to run or mock the services used in tests for a long time. How about we mark them as "remote", set up travis to run "-m remote" and "-m not remote", and allow failures on the "-m remote" tests?
... I like the "remote" tag ... does someone want to make an example?
Here is a simple example (sketched below), together with what the results of a run should look like.
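A minimal sketch of what such a setup could look like, assuming a "remote" marker registered in conftest.py; the test module, service URL, and assertions are invented for illustration and are not actual OWSLib tests:

```python
# conftest.py -- register the "remote" marker so pytest knows about it
def pytest_configure(config):
    config.addinivalue_line(
        "markers", "remote: test needs access to an external online service")
```

```python
# tests/test_remote_example.py -- hypothetical test module
import pytest
from owslib.wms import WebMapService


@pytest.mark.remote
def test_wms_getcapabilities_remote():
    # placeholder URL; any reachable WMS endpoint would do
    wms = WebMapService("http://example.org/wms", version="1.1.1")
    assert wms.identification.type == "OGC:WMS"


def test_something_local():
    # unmarked tests keep running even without network access
    assert 1 + 1 == 2
```

Running pytest -m "not remote" then deselects the online test (for this sketch the report would end with something like "1 passed, 1 deselected"), while a second, allowed-to-fail travis job could run pytest -m remote.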
Note that I'm mostly a fan of changing the tests in the tests/doctests directory into normal tests. I find them easier to maintain (e.g. code completion/highlighting works), and they give more possibilities. Doctests are fine for checking that the examples in the documentation still work, but IMHO they should not be used for unit tests. Also note that in the example commit I dropped quite a few options from tox to keep the example clearer; don't consider this a pull request. We should definitely keep using the old tests until they are replaced.
I also prefer normal tests over doctests. We could do this for new tests and have a slow transition over time. I mainly use the WPS module ... so I could take care of the WPS test suite. To get a green light on travis we need to figure out how to add a pytest marker to those tests which are not working ... or we need to comment them out, remove them, or replace them.
Maybe an even better way would be using pytest.mark.skipif after a basic availability test (see the sketch below), which would lead to the dependent tests being reported as skipped instead of failed when the service is down.
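A sketch of that idea; the service_ok() helper, URL, and test body are assumptions made up for illustration:

```python
import pytest
from urllib.request import urlopen

SERVICE_URL = "http://example.org/wps"  # placeholder endpoint


def service_ok(url, timeout=5):
    """Basic availability check: does the service answer at all?"""
    try:
        urlopen(url, timeout=timeout)
        return True
    except OSError:
        return False


@pytest.mark.skipif(not service_ok(SERVICE_URL),
                    reason="WPS service is not available")
def test_wps_getcapabilities():
    from owslib.wps import WebProcessingService
    wps = WebProcessingService(SERVICE_URL)
    assert wps.identification.title is not None
```

When the server is unreachable the test shows up as skipped with the given reason instead of as a failure.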
I have rewritten a doctest using pytest skipif; a sketch of the approach, including how to try it, is below.
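Purely as an illustration of the conversion (file name, URL, and process identifier are placeholders, not the actual rewritten test), the doctest flow turned into a normal test could look roughly like this:

```python
# tests/test_wps_describeprocess.py -- hypothetical converted doctest
import pytest
from urllib.request import urlopen

from owslib.wps import WebProcessingService

WPS_URL = "http://example.org/wps"  # placeholder for the real service


def service_ok(url, timeout=5):
    try:
        urlopen(url, timeout=timeout)
        return True
    except OSError:
        return False


@pytest.mark.skipif(not service_ok(WPS_URL),
                    reason="WPS service is not available")
def test_describeprocess():
    wps = WebProcessingService(WPS_URL)
    process = wps.describeprocess("dummyprocess")  # made-up process id
    assert process.identifier == "dummyprocess"
```

Try with something like pytest -v tests/test_wps_describeprocess.py (the path is a placeholder).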
The previous doctest is kept for comparison. Still, I don't know how to skip the doctests themselves.
Shall we start to rewrite the tests? Should they go into a new test subfolder? For those (failing) tests we don't want to rewrite but still want to keep for information ... should we comment them out and print a reminder?
As @tomkralidis seems to be the current maintainer, it would be nice to hear his opinion. I'm pro migrating tests to the new system. For the currently failing tests I think we should find out whether that's temporary or replace them with different servers. Flagging online tests is useful for packaging in Debian. There it is required that a package builds without online access (which means I excluded those tests in the build process). For owslib one could argue that not going online is a bit weird (as the goal of the library is using online services), so skipping just the tests for sites that are unavailable seems good as well.
Hi all: thanks for the valuable analysis/assessment. +1 on moving the tests to normal tests. Of course we need to ensure that the remote tests are not always all skipped (i.e. shifting the problem elsewhere). As we know, the root cause of the failures is the nature of OWSLib's goal :) Are there any thoughts on using mocks? This would require some work to set up, but then we have no dependency on external resources, ever. I'm willing to sprint on this if others are. Having said this, I'm +1 on moving to normal tests regardless. Can we move ahead with that as a first step?
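A rough sketch of the mocking idea using only the standard library; get_capabilities_xml is a made-up stand-in for whatever function actually performs the HTTP request, and the canned XML is equally invented:

```python
import sys
import xml.etree.ElementTree as ET
from unittest import mock

CANNED_CAPS = b"<Capabilities><Title>Mocked service</Title></Capabilities>"


def get_capabilities_xml(url):
    # in production this would go over the network
    raise RuntimeError("network access should never happen inside this test")


def test_parse_mocked_capabilities():
    # replace the network call with a canned response for the duration of the test
    with mock.patch.object(sys.modules[__name__], "get_capabilities_xml",
                           return_value=CANNED_CAPS):
        xml_bytes = get_capabilities_xml("http://example.org/ows")
    title = ET.fromstring(xml_bytes).findtext("Title")
    assert title == "Mocked service"
```

The same pattern works with pytest's monkeypatch fixture; the point is that the test exercises the parsing code without ever touching an external resource.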
I have made a PR #407 with some converted doctests. I personally feel responsible for the wps module ... that is the one I'm currently using. When the doctest conversion is accepted I will convert the WPS tests as my next step. I can also look into mocking a WPS service.
Thanks @cehbrecht. I'll commit to the remaining CSW and ISO tests.
To get a green light on travis again I have moved the broken tests into a separate place. We have also started to convert tests to "normal" tests and to make use of pytest features such as markers and skipif. There is also the pytest flaky extension to skip tests which sometimes do not work; it might help us during the transition (a sketch follows below). There is more work waiting to get the test suite back into good shape.
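For reference, the flaky plugin is typically used like this (assuming the flaky package is installed; the test itself is invented):

```python
# requires the "flaky" pytest plugin: pip install flaky
from flaky import flaky

from owslib.wms import WebMapService


@flaky(max_runs=3, min_passes=1)
def test_sometimes_unreliable_service():
    # placeholder URL standing in for a service that occasionally times out
    wms = WebMapService("http://example.org/wms", version="1.1.1")
    assert wms.identification.type == "OGC:WMS"
```

A test decorated like this is retried up to three times before it is reported as a failure.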
I'm closing this one for now. Tests are working most of the time. There is more work to do, but that can happen continuously.
Most of the time there are tests which fail due to unavailable online resources. This makes it hard to see whether a patch causes a failure ... you need to check all the tests.
When using pytest one can mark, for example, long-running tests as "slow" or those using external resources as "online" (a sketch follows below):
http://docs.pytest.org/en/latest/mark.html
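For illustration (marker names follow the suggestion above; the test bodies and URL are invented):

```python
import pytest


@pytest.mark.slow
def test_long_running_computation():
    # stand-in for something genuinely slow
    assert sum(range(10 ** 6)) > 0


@pytest.mark.online
def test_uses_external_service():
    from owslib.csw import CatalogueServiceWeb
    csw = CatalogueServiceWeb("http://example.org/csw")  # placeholder URL
    assert csw.identification.title is not None
```

They can then be deselected with pytest -m "not online" or pytest -m "not slow".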
Tests which are currently broken and cannot be fixed could be skipped (sketch below):
http://docs.pytest.org/en/latest/skipping.html
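For example (reason strings and test names are placeholders):

```python
import pytest


@pytest.mark.skip(reason="server permanently down; kept as a reminder")
def test_broken_remote_service():
    assert False  # never executed while the skip marker is present


@pytest.mark.xfail(reason="known parser bug, tracked in the issue list")
def test_known_failure():
    assert 1 == 2
```

skip leaves the test visible in the report as skipped, while xfail still runs it but does not turn the build red.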
Should we start using these markers?