Conversation

@icing icing commented Dec 16, 2025

When a test server is found or configured, do not silently ignore errors on startup, and do not disable the server when its version check fails.

This forces pytest to fail when a server is not operating as it should.

@github-actions github-actions bot added the tests label Dec 16, 2025
@icing icing requested a review from vszakats December 16, 2025 09:39
icing commented Dec 16, 2025

@vszakats and @dfandrich: regarding the passed vs. skipped counts of pytest runs: pytest prints a line like

====================== 592 passed, 135 skipped in 53.76s =======================

at the end. There is also a pytest-reportlog plugin that can write all test outcomes to a JSON file. We could check that file instead. Maybe that would be better for ingestion into testclutch than parsing the raw output?

Not sure what is more convenient or better to use in testclutch.
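If testclutch keeps parsing the raw output, the final summary line could be picked apart with a small regex. A hedged sketch (the function and field names are mine, not anything from the PR or from testclutch):

```python
import re

# Matches pytest's final summary line, e.g.
# "====================== 592 passed, 135 skipped in 53.76s ======================="
SUMMARY_RE = re.compile(r"=+ (?P<counts>.+?) in (?P<secs>[\d.]+)s =+")

def parse_summary(line):
    """Return a dict like {'passed': 592, 'skipped': 135}, or None on no match."""
    m = SUMMARY_RE.search(line)
    if not m:
        return None
    counts = {}
    for part in m.group("counts").split(", "):
        num, outcome = part.split(" ", 1)
        counts[outcome] = int(num)
    return counts

print(parse_summary(
    "====================== 592 passed, 135 skipped in 53.76s ======================="
))
# → {'passed': 592, 'skipped': 135}
```

This is fragile by nature (the summary format is meant for humans and can change between pytest versions), which is part of the argument for a machine-readable report instead.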

@bagder bagder closed this in 3a485c2 Dec 16, 2025

dfandrich commented Dec 16, 2025 via email


icing commented Dec 17, 2025

A machine-readable format would always be better, but usability for humans needs to come first. Maybe we can do both.

The default would be to store the JSON in a file. Can you then access that? Otherwise we could just cat it to stdout at the end.


dfandrich commented Dec 17, 2025 via email


icing commented Dec 17, 2025

The pytest-reportlog plugin writes a file with lines such as:

{"nodeid": "tests/http/test_17_ssl_use.py::TestSSLUse::test_17_04_double_dot[http/1.1]", "location": ["tests/http/test_17_ssl_use.py", 131, "TestSSLUse.test_17_04_double_dot[http/1.1]"], "keywords": {"test_17_04_double_dot[http/1.1]": 1, "parametrize": 1, "pytestmark": 1, "http/1.1": 1, "TestSSLUse": 1, "test_17_ssl_use.py": 1, "http": 1, "tests": 1, "curl": 1, "": 1}, "outcome": "passed", "longrepr": null, "when": "setup", "user_properties": [], "sections": [["Captured log setup", "DEBUG    filelock:_api.py:294 Attempting to acquire lock 4497406912 on /Users/sei/projects/curl/tests/http/gen/gw1/ca/ca.lock\nDEBUG    filelock:_api.py:297 Lock 4497406912 acquired on /Users/sei/projects/curl/tests/http/gen/gw1/ca/ca.lock\nDEBUG    filelock:_api.py:327 Attempting to release lock 4497406912 on /Users/sei/projects/curl/tests/http/gen/gw1/ca/ca.lock\nDEBUG    filelock:_api.py:330 Lock 4497406912 released on /Users/sei/projects/curl/tests/http/gen/gw1/ca/ca.lock\nDEBUG    filelock:_api.py:294 Attempting to acquire lock 4490692464 on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:297 Lock 4490692464 acquired on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:327 Attempting to release lock 4490692464 on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:330 Lock 4490692464 released on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:294 Attempting to acquire lock 4501963264 on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:297 Lock 4501963264 acquired on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:327 Attempting to release lock 4501963264 on /Users/sei/projects/curl/tests/http/gen/ports.lock\nDEBUG    filelock:_api.py:330 Lock 4501963264 released on /Users/sei/projects/curl/tests/http/gen/ports.lock"]], "duration": 2.8718773560003683, "start": 1765879421.6158159, "stop": 1765879424.4875772, 
"$report_type": "TestReport", "item_index": 6, "worker_id": "gw1", "testrun_uid": "65231f3a01704750b2f8bafec7c26120", "node": "<WorkerController gw1>"}
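A file of such lines could be tallied with a few lines of Python. A sketch, not part of the PR: it counts only the "call" phase of "TestReport" records so that setup/teardown entries (like the one shown above) are not double-counted. Note this is a simplification; pytest reports skips from the setup phase, so a real consumer would need to handle that case too.

```python
import json
from collections import Counter

def tally_reportlog(lines):
    """Count test outcomes from pytest-reportlog JSON lines.

    Only "TestReport" records for the "call" phase are counted; the
    setup/teardown records for the same test are skipped.
    """
    outcomes = Counter()
    for line in lines:
        rec = json.loads(line)
        if rec.get("$report_type") == "TestReport" and rec.get("when") == "call":
            outcomes[rec["outcome"]] += 1
    return outcomes

# Minimal fabricated records for illustration only:
sample = [
    '{"$report_type": "TestReport", "when": "setup", "outcome": "passed", "nodeid": "t.py::a"}',
    '{"$report_type": "TestReport", "when": "call", "outcome": "passed", "nodeid": "t.py::a"}',
    '{"$report_type": "TestReport", "when": "call", "outcome": "failed", "nodeid": "t.py::b"}',
]
print(tally_reportlog(sample))
# → Counter({'passed': 1, 'failed': 1})
```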
