Overhaul test suite with fixtures #2675
Draft
lunkwill42 wants to merge 32 commits into Uninett:master from lunkwill42:test/fixture-experiments
Conversation
MacOS' System Integrity Protection wreaks havoc with NAV and its test suite, due to its dependency on pynetsnmp / NET-SNMP: pynetsnmp loads libcrypto using ctypes, which aborts the Python process on MacOS with an error message like this:

> WARNING: .tox/integration-py39-django32/bin/python is loading libcrypto in an unsafe way

When OpenSSL is installed on MacOS through Homebrew, the environment variable DYLD_LIBRARY_PATH must be set for processes to load the correct version of libcrypto. However, newer versions of MacOS sanitize this environment variable on many levels. The whole debacle of trying to fix this in a sane way is described at https://hynek.me/articles/macos-dyld-env/

This workaround is no prettier, and will only work if libssl is installed using Homebrew, or if the value of DYLD_LIBRARY_PATH can be gleaned from LD_LIBRARY_PATH.
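The workaround might look something like the following sketch; the helper name and fallback order are illustrative, not NAV's actual code:

```python
import os
import subprocess


def guess_dyld_library_path(environ=None):
    """Best-effort guess at a usable DYLD_LIBRARY_PATH on MacOS.

    Prefers gleaning the value from LD_LIBRARY_PATH, falling back to
    asking Homebrew where it keeps OpenSSL. Returns None if neither
    source is available.
    """
    environ = os.environ if environ is None else environ
    ld_path = environ.get("LD_LIBRARY_PATH")
    if ld_path:
        return ld_path
    try:
        prefix = subprocess.check_output(
            ["brew", "--prefix", "openssl"], text=True
        ).strip()
    except (OSError, subprocess.CalledProcessError):
        return None  # Homebrew (or its openssl formula) is not installed
    return os.path.join(prefix, "lib")
```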
We've been on Python 3 for a good while now.
Instead of always starting and stopping gunicorn at the start and end of an integration test session, this makes it a session fixture that tests can depend on directly.
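A minimal sketch of such a session fixture; the command line and WSGI module path are assumptions, not NAV's actual configuration:

```python
import subprocess

import pytest


def gunicorn_command(bind="127.0.0.1:8000", wsgi_app="navtest_wsgi:application"):
    """Build the gunicorn command line (the WSGI module path is hypothetical)."""
    return ["gunicorn", "--bind", bind, wsgi_app]


@pytest.fixture(scope="session")
def gunicorn():
    """Run gunicorn for the duration of the test session.

    Tests that need the web server depend on this fixture directly,
    instead of relying on an external tool to have started it.
    """
    process = subprocess.Popen(gunicorn_command())
    yield process
    process.terminate()
    process.wait()
```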
The epollreactor2 implementation is no good on other platforms; without this change, the test suite would not run on MacOS, for example.
This creates a PostgreSQL fixture that can be depended on by tests that require a running PostgreSQL server with the NAV database to be present. The fixture will work with externally set up PostgreSQL servers, like in the case of GitHub Actions, while it can also provision a PostgreSQL server with the NAV schema using Docker if run locally.
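A sketch of how such a fixture might decide between an external server and local Docker provisioning; the environment variable convention, image and options are assumptions:

```python
import os
import subprocess

import pytest


def use_external_postgres(environ=None):
    """True when an externally provisioned PostgreSQL server is configured,
    as on GitHub Actions; the PGHOST convention here is an assumption.
    """
    environ = os.environ if environ is None else environ
    return bool(environ.get("PGHOST"))


@pytest.fixture(scope="session")
def postgresql():
    """Yield a host name for a running PostgreSQL server with the NAV schema."""
    if use_external_postgres():
        yield os.environ["PGHOST"]
        return
    # Running locally: provision a throwaway server with Docker
    # (the image and port mapping are illustrative)
    container = subprocess.check_output(
        ["docker", "run", "--rm", "--detach", "--publish", "5432:5432",
         "--env", "POSTGRES_PASSWORD=nav", "postgres:13"],
        text=True,
    ).strip()
    try:
        yield "localhost"
    finally:
        subprocess.run(["docker", "stop", container], check=False)
```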
This ensures any tests that rely on the various pre-existing db-related fixtures will have a running PostgreSQL server available.
This removes the gunicorn setup from the functional test suite, and moves the newly introduced gunicorn fixture from the integration test suite to the top-level conftest.py, so it can be re-used across test suites.
This ensures the postgres fixture is started for all tests that inherit from DjangoTransactionTestCase.
Change TestCase implementations that should really be transactional test cases (which in turn ensures they have PostgreSQL available through the postgres fixture, and that transactions are properly rolled back after the tests have run)
Declares a dependency on the postgresql fixture to ensure the NAV database is actually available for these tests
This ensures that any test that depends on a logged in web client also automatically depends on having the NAV database running.
This ensures the snmpsim fixture can run on both Linux and MacOS, by replacing Linux-specific parts (like looking at /proc/net/udp). This slightly changes the logic of verifying whether the snmpsimd process is up and running before yielding the fixture value: instead of waiting for something to appear to be listening on port 1024, this verifies the presence of the SNMP agent by performing an actual SNMP query.
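The readiness check might be built on a generic polling helper like this sketch; the actual SNMP GET against the simulator (e.g. via pysnmp) would be supplied as the probe callable:

```python
import time


def wait_until(probe, timeout=10.0, interval=0.1):
    """Poll ``probe`` until it returns a truthy value or ``timeout`` elapses.

    ``probe`` is any zero-argument callable; for the snmpsim fixture it
    would perform an actual SNMP query against the simulator. Exceptions
    raised by the probe count as "not ready yet".
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if probe():
                return True
        except Exception:
            pass
        time.sleep(interval)
    return False
```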
Some pping tests require starting up the pping daemon and killing it after some time. These tests seem to rely on the external `timeout` program, which isn't always available on a system (e.g. it's not available on MacOS).
These imports made the entire test module depend on the PostgreSQL database actually running. Moving the imports to be local to the tests (that depend, directly or indirectly, on the postgresql fixture) ensures that the imports only take place after PostgreSQL is actually running.
The admin_navlet fixture is dynamic/parametrized, which is why its implementation is a bit weird. Unfortunately, the tests that depend on it currently fail under the new regime of the postgresql fixture. Adding this note so we know where to go back and fix it.
I accidentally discovered that several of the statemon tests were in fact broken. They expect pping code to use the process PID as a packet ID, and end up using this in their assertions. However, this is only sort-of true: pping actually uses the process PID modulo 2^16, since the value needs to fit in a 16-bit packet header field. If the tests ran on a system where the test runner's PID was 65536 or higher, these tests would fail for no apparent reason. This would rarely have been encountered, since the test suite most often runs inside newly created VMs or containers that are torn down immediately after the test run, so the test suite would almost always encounter very low PID values. Running the tests on my workstation with a high uptime was a different story, though. This more or less rewrites the tests to pytest format, so they can easily re-use a fixture to get the proper "PID" value.
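The shared fixture could be as simple as this sketch (the names are illustrative):

```python
import os

import pytest

PACKET_ID_MODULUS = 2 ** 16  # the packet ID must fit in a 16-bit header field


def expected_packet_id(pid=None):
    """The packet ID pping would actually use: the PID modulo 2^16."""
    pid = os.getpid() if pid is None else pid
    return pid % PACKET_ID_MODULUS


@pytest.fixture
def packet_id():
    """The "PID" value the statemon tests should assert against."""
    return expected_packet_id()
```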
While pytest can accomplish a lot of exciting things, it cannot use fixtures as input to test parametrization. While we can make a test depend on a fixture for getting a PostgreSQL instance, the test discovery phase that generates the tests cannot. I.e. we cannot get our test parameters from a database unless the database was already up and running when the test discovery phase started. Sad, but true. This changes the navlet test to iterate over the available admin navlets itself.
While pytest can accomplish a lot of exciting things, it cannot use fixtures as input to test parametrization. While we can make a test depend on a fixture for getting a running webserver instance, the test discovery phase that generates the tests cannot. I.e. we cannot get our test parameters from the webcrawler unless the web server was already up and running when the test discovery phase started. Sad, but true. This changes the webcrawler into a fixture, and changes the web link reachability and HTML validation tests to iterate the pages provided by this session-scoped crawler. This also considerably shortens the discovery phase, since the crawling actually takes place during the running of the first test that uses the fixture.
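In sketch form, with a placeholder standing in for the real crawler:

```python
from collections import namedtuple

import pytest

Page = namedtuple("Page", ["url", "status"])


def crawl(base_url):
    """Placeholder for the real web crawler, which follows internal
    links starting from the front page of ``base_url``."""
    yield Page(base_url + "/", 200)


@pytest.fixture(scope="session")
def crawled_pages():
    # The crawl runs inside the first test that uses this fixture, not
    # during test discovery -- by then, the web server fixture is up.
    return list(crawl("http://localhost:8000"))


def test_pages_are_reachable(crawled_pages):
    failures = [page for page in crawled_pages if page.status != 200]
    assert not failures
```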
We cannot get the NAV web service up and running without a working PostgreSQL instance.
We don't need these configuration variables from the environment any more, since the web server is started by a fixture and not an external tool like tox. Having them as fixtures would enable us to set them from the pytest command line down the road, if need be.
Add an initial reachability test in the gunicorn fixture itself. This is just to ensure the server is actually accepting requests, before we try to bombard it with them.
The python tidylib module loads the system library dynamically. This ensures we skip the HTML validation tests if tidylib isn't properly installed.
If nbtscan is not available (it is, after all, an optional dependency of NAV), netbiostracker.py cannot run, so we might as well skip it.
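Both kinds of skip can follow the same pattern; a sketch for the nbtscan case (the marker and test names are illustrative):

```python
import shutil

import pytest


def binary_available(name):
    """True if the named executable can be found on PATH."""
    return shutil.which(name) is not None


# Skip rather than fail when the optional dependency is missing
requires_nbtscan = pytest.mark.skipif(
    not binary_available("nbtscan"),
    reason="nbtscan is not installed (optional NAV dependency)",
)


@requires_nbtscan
def test_netbiostracker_runs():
    ...
```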
Also lock pyasn<0.5.0, for the same reasons stated in 72f241a
These tests would only work incidentally; i.e. if they ran after another test that depended on the postgresql fixture
The postgresql fixture is fine to depend on if you just need a database to be present. But if you need to write data during the test, you should probably use the db fixture.
`sudo -nv` can be used to verify whether the sudo command is prepared to execute commands non-interactively, i.e. that it will not ask for a password. If the password has already been cached by previous sudo commands, or sudo is configured to give passwordless access, we can move on. Otherwise, sudo would either prompt for a password or deny access, neither of which works when running tests in an automated fashion.
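The check might be wrapped in a helper like this sketch:

```python
import shutil
import subprocess


def can_sudo_noninteractively():
    """True if sudo will execute commands without prompting for a password.

    ``sudo -nv`` exits with status 0 when credentials are already cached
    or passwordless sudo is configured; the -n flag guarantees it never
    stops to ask for a password.
    """
    if shutil.which("sudo") is None:
        return False
    result = subprocess.run(
        ["sudo", "-nv"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0
```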
These tests are supposed to be skipped if we CANNOT be root, not if we *can*.
At least two or three of NAV's dependencies may need to be built from source when pulled in by pip: psycopg2, python-ldap and Pillow. On platforms like NixOS, the include files for the necessary C libraries may not be in standardized FHS locations, so C_INCLUDE_PATH is used to convey their locations to the C compiler.
For now, at least. If the Open-LDAP libraries aren't available locally, the test suite will fail because python-ldap will not build.
LDAP *is* marked as an optional dependency, so the test suite should still complete without it.
This PR has an overarching goal of making the test suite easier to run locally, rather than requiring devs to run all tests inside a very specifically defined Docker container (which is NOT how the test suite is run on GitHub Actions).
Goals:

- Instead of `conftest.py` starting and stopping external services, such as gunicorn, PostgreSQL and Xvfb, while expecting that these are available in the local environment, these services should be defined as test fixtures. Any test that depends on e.g. PostgreSQL should declare a dependency on this fixture, in order for it to be able to use PostgreSQL.

Some of the changes herein would probably stand to be factored out into separate PRs, as there are many fixes to the existing test suite that are not dependent on the fixture changes made.