Add mechanism to select tests based on Pulp version #29
Comments
We could use both: `skipTest()` for runtime selection based on Pulp version (and/or some other condition?) and nose's attributes for everything else, i.e. running core tests, slow tests, etc. (including Pulp version). This would give us more flexibility in selecting which tests to run, at the price of running them with nose. If for some reason we would like to run, say, 2.6 tests on Pulp 2.7, we could modify …
I think this can be done with a version selector. Let's say you have a Pulp 2.7 system and you'd like to run the 2.6 tests against it. You could accomplish that by setting the (currently imaginary) "version" attribute in the configuration file to "2.6":

```json
{
    "default": {
        "auth": ["admin", "admin"],
        "base_url": "https://pulp.example.com",
        "verify": false,
        "version": "2.6"
    }
}
```

When you run tests, Pulp Smash will act as if pulp.example.com is running 2.6. It doesn't care that the system is running 2.7.
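A test suite could read that (still hypothetical) "version" attribute along the lines of the sketch below. The field names follow the JSON example above, but the fallback default and variable names are my assumptions, not Pulp Smash's actual API:

```python
import json

# Illustrative only: parse a Pulp-Smash-style config and read the proposed
# "version" attribute, falling back to a default when it is absent.
CONFIG_TEXT = """
{
    "default": {
        "auth": ["admin", "admin"],
        "base_url": "https://pulp.example.com",
        "verify": false,
        "version": "2.6"
    }
}
"""

config = json.loads(CONFIG_TEXT)["default"]
pulp_version = config.get("version", "2.7")  # default value is an assumption
print(pulp_version)  # → 2.6
```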
I've started laying the groundwork for this. I've made several refactors to how the …
I'm wondering if there is a way to grab the Pulp server version dynamically. If that's not possible, then I think asking for the version in the configuration is the way to go. Once we have the version, we can use a standard Python versioning library to allow checking versions using, for example, …
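The point about a versioning library matters because plain string comparison misorders versions. A real implementation would likely use a library such as `packaging.version.Version` (that choice is my assumption); the stdlib-only sketch below shows the idea with a hand-rolled tuple parser:

```python
def parse_version(version):
    """Minimal sketch: split a dotted version string into an integer tuple.

    Tuples of ints compare numerically, so (2, 10) > (2, 7), which a plain
    string comparison would get wrong. A library like packaging.version
    would handle pre-releases and other edge cases properly.
    """
    return tuple(int(part) for part in version.split("."))

pulp_version = parse_version("2.7")  # would come from config or the server
print(pulp_version >= parse_version("2.6"))   # True
print(pulp_version < parse_version("2.10"))   # True
print("2.7" < "2.10")  # False: lexical comparison misorders versions
```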
That'd be a very nice convenience, for sure. Even if we can fetch the version number at run-time, I think it'd be useful to be able to specify the version number directly in the configuration file. As the system under test, Pulp is not to be trusted - even the version number it returns may be incorrect.
The current set of commits on master...Ichimonji10:version adds the ability to declare the Pulp version in the config file and to handle that version attribute via the …
@Ichimonji10 completely agree. EDIT: Karma given
Some tests are only applicable to a particular Pulp version. Let's add some mechanism that allows users to run only tests which apply to a particular Pulp version or a range of Pulp versions. What should this look like?
One option is to use the `skipIf` decorator/function provided by unittest, which skips a test when a given condition holds. Of particular interest is that `self.skipTest()` can be executed at runtime. There will assuredly be cases where we want to determine whether to skip a test at run-time instead of being able to determine that in advance - maybe in a `subTest()` context manager.

Another option is to set attributes directly on test methods and use nose or nose2's test selection mechanism. You can then select tests from the command line, like so: `nose2 -v -A fast`.

The upshot of using unittest's `skipIf` decorator/method and similar decorators/methods is that they're compatible with just about any test runner out there. You can run the test suite with unittest, unittest2, py.test, nose, nose2, etc., and the test suite will behave identically. In addition, using methods/decorators has a very small namespace footprint, and you get to choose which tests should be executed at run-time.

The upshot of using something static like nose{,2}'s test method attributes is that you get to easily select which tests should be run from the command line. That's handy - I'd like to be able to say "run all tests that apply to Pulp 2.7 against my Pulp system" (and other similar statements?), and that's veeeery easy to do with nose{,2}. (This use case is still doable with unittest's `skipIf` decorators/functions. It just may take a little more work - maybe the introduction of helper functions or another environment variable.)
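The `skipIf` approach above might look like the following sketch. The `get_pulp_version()` helper is hypothetical - in a real suite the version would come from the configuration file or the server, and version values are modeled as tuples here purely to keep comparisons numeric:

```python
import unittest

def get_pulp_version():
    """Hypothetical helper: would read the Pulp version from configuration."""
    return (2, 7)

class RepoFeatureTestCase(unittest.TestCase):
    @unittest.skipIf(get_pulp_version() < (2, 7), "requires Pulp 2.7 or later")
    def test_new_feature(self):
        """Skipped or not at import time, before the test ever runs."""
        self.assertTrue(True)

    def test_older_feature(self):
        # skipTest() also works at run-time, mid-test - or inside a
        # subTest() context manager, as mentioned above.
        if get_pulp_version() < (2, 6):
            self.skipTest("requires Pulp 2.6 or later")
        self.assertTrue(True)
```

Against a (simulated) Pulp 2.7, both tests run and neither is skipped; drop the helper's return value to `(2, 5)` and both would be skipped instead.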
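The attribute-tagging approach might be sketched like this. The attribute name `fast` mirrors the `nose2 -v -A fast` invocation above; whether the suite would actually tag on speed, Pulp version, or both is an open question:

```python
import unittest

class SpeedTaggedTestCase(unittest.TestCase):
    def test_quick(self):
        self.assertEqual(1 + 1, 2)
    # nose/nose2 select tests on plain function attributes, e.g. via
    # `nose2 -A fast` or `nosetests -a fast`.
    test_quick.fast = True

    def test_slow(self):
        self.assertEqual(sum(range(1000)), 499500)
    test_slow.fast = False
```

The tags are ordinary function attributes, so they cost nothing under any runner; only attribute-aware runners like nose and nose2 act on them.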