
[tests] Looking to fix hanging ci build due to pytest #1218

Merged
nemesifier merged 4 commits into master from test-hanging on Feb 6, 2026

Conversation

@nemesifier
Member

Alternative approach to #1217.

@coderabbitai please do not review.

@nemesifier nemesifier self-assigned this Feb 5, 2026
@coderabbitai

coderabbitai bot commented Feb 5, 2026

Walkthrough

This PR refactors the WebSocket consumer's _is_user_authenticated to return the user's authenticated state without closing the connection internally, preserving connect-level semantics that still close on failure. It adds pytest timeout and asyncio strict/debug settings to pytest.ini, installs pytest-timeout in test requirements, fixes minor indentation in a test file, and modifies the runtests script to run a dedicated Selenium test phase with adjusted flags.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

🚥 Pre-merge checks | ✅ 1 | ❌ 1
❌ Failed checks (1 warning)
Check name Status Explanation Resolution
Description check ⚠️ Warning The PR description is incomplete and lacks substantial detail. It only references an alternative approach without describing the actual changes, and does not follow the provided template structure. Add a proper description following the template structure, including a description of changes, test verification, and relevant checklist items.
✅ Passed checks (1 passed)
Check name Status Explanation
Title check ✅ Passed The title references fixing a hanging CI build due to pytest, which directly relates to the PR's changes that add test timeouts and update test configuration.



📜 Recent review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f38dbfa and 2177fea.

📒 Files selected for processing (3)
  • pytest.ini
  • requirements-test.txt
  • runtests
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: Python==3.10 | django~=5.1.0
  • GitHub Check: Python==3.11 | django~=5.2.0
  • GitHub Check: Python==3.11 | django~=4.2.0
  • GitHub Check: Python==3.12 | django~=5.1.0
  • GitHub Check: Python==3.11 | django~=5.1.0
  • GitHub Check: Python==3.10 | django~=4.2.0
🔇 Additional comments (2)
runtests (1)

8-9: LGTM!

Good label correction — the echo now accurately describes the test phase being executed.

requirements-test.txt (1)

2-2: pytest-timeout addition looks good; also add pytest-asyncio here.

The pytest-timeout dependency correctly supports the new --timeout flag in pytest.ini. As noted in the pytest.ini review, pytest-asyncio should also be added here to support the asyncio_mode = strict and asyncio_debug = true settings.





@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Fix all issues with AI agents
In `@pytest.ini`:
- Line 2: The global test timeout in pytest.ini (addopts) is set to an
aggressive --timeout=5 which will cause false failures on slower CI runners;
update the addopts entry in pytest.ini to a safer default (e.g., replace
--timeout=5 with --timeout=30) or remove the global --timeout flag and instead
mark slow tests individually using `@pytest.mark.timeout(...)` on specific test
functions to avoid killing legitimate slower tests while retaining timeout
protection for hung tests.
- Around line 7-9: Add the missing pytest-asyncio test dependency so tests using
`@pytest.mark.asyncio` and pytest.ini settings (asyncio_mode = strict,
asyncio_debug = true, log_cli = true) run in CI; update the test requirements
file (requirements-test.txt) to include "pytest-asyncio" (pin a compatible
version if your project pins others) and ensure any CI test environment installs
that requirements file before running pytest.

In `@runtests`:
- Around line 17-18: Restore the commented coverage-export commands so coverage
data from the parallel test runs is merged and exported: re-enable "coverage
combine" and "coverage xml" (and any other commented blocks related to coverage
export) in the runtests script so the --parallel runs are combined into a
Cobertura XML for the CI "Upload Coverage" step; ensure the restored commands
run after both parallel test invocations and produce the expected XML artifact
for the Upload Coverage job.
🧹 Nitpick comments (3)
.github/workflows/ci.yml (2)

25-35: Ensure commented-out matrix entries and excludes are restored before merging.

Since this is a draft PR for debugging CI hangs, just a reminder that the removed Python 3.11/3.13 versions and the exclude block should be uncommented before this lands on master, otherwise CI coverage would silently shrink.


80-81: QA checks are disabled — remember to re-enable.

Same as above: the commented-out QA step should be restored before merge to avoid skipping lint/formatting checks on master.

runtests (1)

8-9: Selenium tests disabled — restore before merge.

Same as the other temporarily-disabled blocks in this PR.

📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 109f28e and 72f9287.

📒 Files selected for processing (7)
  • .github/workflows/ci.yml
  • openwisp_controller/config/base/channels_consumer.py
  • openwisp_controller/connection/tests/pytest.py
  • pytest.ini
  • requirements-test.txt
  • runtests
  • runtests.py
💤 Files with no reviewable changes (1)
  • runtests.py
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2026-01-15T15:05:49.557Z
Learnt from: DragnEmperor
Repo: openwisp/openwisp-controller PR: 1175
File: openwisp_controller/config/management/commands/clear_last_ip.py:38-42
Timestamp: 2026-01-15T15:05:49.557Z
Learning: In Django projects, when using select_related() to traverse relations (for example, select_related("organization__config_settings")), the traversed relation must not be deferred. If you also use .only() in the same query, include the relation name or FK field (e.g., "organization" or "organization_id") in the .only() list to avoid the error "Field X cannot be both deferred and traversed using select_related at the same time." Apply this guideline to Django code in openwisp_controller/config/management/commands/clear_last_ip.py and similar modules by ensuring any select_related with an accompanying only() includes the related field names to prevent deferred/traversed conflicts.

Applied to files:

  • openwisp_controller/config/base/channels_consumer.py
  • openwisp_controller/connection/tests/pytest.py
📚 Learning: 2026-01-15T15:07:17.354Z
Learnt from: DragnEmperor
Repo: openwisp/openwisp-controller PR: 1175
File: openwisp_controller/geo/estimated_location/tests/tests.py:172-175
Timestamp: 2026-01-15T15:07:17.354Z
Learning: In this repository, flake8 enforces E501 (line too long) via setup.cfg (max-line-length = 88) while ruff ignores E501 via ruff.toml. Therefore, use '# noqa: E501' on lines that intentionally exceed 88 characters to satisfy flake8 without affecting ruff checks. This applies to Python files across the project (any .py) and is relevant for tests as well. Use sparingly and only where breaking lines is not feasible without hurting readability or functionality.

Applied to files:

  • openwisp_controller/config/base/channels_consumer.py
  • openwisp_controller/connection/tests/pytest.py
🧬 Code graph analysis (1)
openwisp_controller/connection/tests/pytest.py (3)
openwisp_controller/config/admin.py (2)
  • save (403-416)
  • save (1261-1270)
openwisp_controller/config/base/device.py (1)
  • save (270-288)
openwisp_controller/connection/base/models.py (3)
  • save (127-134)
  • save (375-380)
  • save (509-520)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: Python==3.10 | django~=4.2.0
  • GitHub Check: Python==3.12 | django~=4.2.0
  • GitHub Check: Python==3.12 | django~=5.1.0
  • GitHub Check: Python==3.10 | django~=5.1.0
  • GitHub Check: Python==3.12 | django~=5.2.0
  • GitHub Check: Python==3.10 | django~=5.2.0
🔇 Additional comments (3)
requirements-test.txt (1)

2-2: LGTM — pytest-timeout addition is consistent with the new --timeout=5 in pytest.ini.

openwisp_controller/connection/tests/pytest.py (1)

47-53: Good fix — DB operations now execute within the SSH mock context.

Moving command.save() and command.refresh_from_db() inside the mock.patch("paramiko.SSHClient.exec_command") block ensures the background command execution triggered by save() via _schedule_command() runs against the mock rather than a real SSH client; unmocked SSH execution was a likely root cause of the CI hang.
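The pattern can be sketched with a minimal, self-contained stand-in (the `SSHClient` and `Command` classes below are hypothetical illustrations, not the real paramiko or openwisp_controller classes):

```python
# Sketch of the fix pattern: the save() call that triggers command
# execution must sit *inside* the patch context, so the background
# work hits the mock and can never block on a real SSH connection.
from unittest import mock


class SSHClient:
    """Stand-in for paramiko.SSHClient (hypothetical)."""

    def exec_command(self, cmd):
        raise RuntimeError("would block on a real SSH connection")


class Command:
    """Stand-in for the model whose save() schedules SSH execution."""

    def save(self):
        # in the real code, save() schedules background command execution
        SSHClient().exec_command("uptime")


with mock.patch.object(SSHClient, "exec_command") as mocked:
    Command().save()  # runs against the mock: no real SSH, no hang

assert mocked.call_args == mock.call("uptime")
```

If `Command().save()` were moved outside the `with` block, the stand-in would raise, mirroring how the real test could hang against an unmocked client.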

openwisp_controller/config/base/channels_consumer.py (1)

12-13: Good simplification — removes the redundant self.close() call.

The caller connect() (line 38-39) already closes the connection on auth/authz failure, so having _is_user_authenticated also close was a double-close that could contribute to hangs or unexpected behavior in the websocket lifecycle.

Minor nit: self.scope["user"].is_authenticated already returns a bool in modern Django, so is True is redundant — a plain return self.scope["user"].is_authenticated would suffice. Not blocking.
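The simplification can be illustrated with a minimal stand-in (the `DeviceConsumer` and `FakeUser` classes here are hypothetical, not the actual consumer from openwisp_controller):

```python
# Minimal sketch: the helper only *reports* the authentication state;
# closing the connection on failure stays in the caller, connect().
class DeviceConsumer:
    def __init__(self, scope):
        self.scope = scope

    def _is_user_authenticated(self):
        # `is_authenticated` is already a bool on modern Django user
        # objects, so no `is True` comparison is needed
        return self.scope["user"].is_authenticated


class FakeUser:
    def __init__(self, authenticated):
        self.is_authenticated = authenticated


print(DeviceConsumer({"user": FakeUser(True)})._is_user_authenticated())   # True
print(DeviceConsumer({"user": FakeUser(False)})._is_user_authenticated())  # False
```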


@@ -1,6 +1,9 @@
 [pytest]
-addopts = -p no:warnings --create-db --reuse-db --nomigrations
+addopts = -p no:warnings --create-db --reuse-db --nomigrations --timeout=5

@coderabbitai coderabbitai bot Feb 6, 2026


⚠️ Potential issue | 🟡 Minor

5-second timeout is very aggressive for production CI.

A global --timeout=5 will kill any test that takes longer than 5 seconds, which is fine for hunting down hangs in a draft, but will likely cause false failures on slower CI runners or for tests involving real Redis, DB transactions, or websocket setup/teardown. Consider bumping this (e.g., --timeout=30) or using per-test @pytest.mark.timeout(...) for the known-slow tests before merging.

🤖 Prompt for AI Agents
In `@pytest.ini` at line 2, The global test timeout in pytest.ini (addopts) is set
to an aggressive --timeout=5 which will cause false failures on slower CI
runners; update the addopts entry in pytest.ini to a safer default (e.g.,
replace --timeout=5 with --timeout=30) or remove the global --timeout flag and
instead mark slow tests individually using `@pytest.mark.timeout(...)` on specific
test functions to avoid killing legitimate slower tests while retaining timeout
protection for hung tests.
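The two suggested alternatives look roughly like this (the values are illustrative, not the project's final choice):

```ini
# pytest.ini — safer global default
[pytest]
addopts = -p no:warnings --create-db --reuse-db --nomigrations --timeout=30
```

or, per test, decorate only the known-slow test functions with `@pytest.mark.timeout(30)` and drop the global flag.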

Member Author


@coderabbitai this is enough for us here.



@nemesifier, understood! I'll leave the timeout as is. 👍

pytest.ini Outdated
Comment on lines 7 to 9
log_cli = true
asyncio_mode = strict
asyncio_debug = true
\ No newline at end of file


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, explore the repository structure to find dependency files
git ls-files | grep -E '(requirements|setup|pyproject)' | head -20

Repository: openwisp/openwisp-controller

Length of output: 144


🏁 Script executed:

# Search for pytest-asyncio mentions across the repository
rg 'pytest-asyncio' --type py --type txt --type toml --type cfg -n

Repository: openwisp/openwisp-controller

Length of output: 99


🏁 Script executed:

# Check the contents of requirements files
find . -name 'requirements*.txt' -o -name 'setup.py' -o -name 'setup.cfg' -o -name 'pyproject.toml' | head -10

Repository: openwisp/openwisp-controller

Length of output: 154


🏁 Script executed:

# Check requirements-test.txt
cat -n requirements-test.txt

Repository: openwisp/openwisp-controller

Length of output: 331


🏁 Script executed:

# Check pyproject.toml
cat -n pyproject.toml

Repository: openwisp/openwisp-controller

Length of output: 734


🏁 Script executed:

# Check setup.py for test dependencies
cat -n setup.py

Repository: openwisp/openwisp-controller

Length of output: 2127


🏁 Script executed:

# Let's check if openwisp-utils repository is documented or available
# Search for any reference to pytest-asyncio in the codebase
rg 'asyncio' --type py --type txt -i -n

Repository: openwisp/openwisp-controller

Length of output: 277


🌐 Web query:

openwisp-utils channels-test pytest-asyncio dependency

💡 Result:

openwisp-utils does not list a dedicated channels-test extra (the documented extras are things like qa, rest, selenium, while test deps are installed separately via requirements-test.txt). [1]

If you’re running Django Channels tests with pytest, you must add pytest-asyncio (and typically pytest-django) because Channels’ async test helpers require an async-capable pytest plugin. [2]

Practical fix: add pytest-asyncio to your test requirements (e.g., requirements-test.txt) in the environment where you run Channels-based tests. [1][2]

Sources: [1] OpenWISP Utils developer installation/test deps, [2] Django Channels testing docs (pytest requires pytest-asyncio).


Add pytest-asyncio to test dependencies.

The pytest.ini settings asyncio_mode = strict and asyncio_debug = true require pytest-asyncio, which is not currently listed in requirements-test.txt. The assumption that it comes transitively from openwisp-utils[channels-test] is incorrect—that extra does not include pytest-asyncio. Since the codebase already uses @pytest.mark.asyncio decorators in tests, pytest-asyncio must be explicitly added to prevent CI failures.

🤖 Prompt for AI Agents
In `@pytest.ini` around lines 7 - 9, Add the missing pytest-asyncio test
dependency so tests using `@pytest.mark.asyncio` and pytest.ini settings
(asyncio_mode = strict, asyncio_debug = true, log_cli = true) run in CI; update
the test requirements file (requirements-test.txt) to include "pytest-asyncio"
(pin a compatible version if your project pins others) and ensure any CI test
environment installs that requirements file before running pytest.
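Concretely, the change would be a one-line addition to the test requirements (this is a sketch; whether to pin a version, and which one, is a project decision not taken from the repository):

```ini
# requirements-test.txt
pytest-timeout
pytest-asyncio  # required by asyncio_mode = strict / asyncio_debug = true
```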

runtests Outdated
Comment on lines 17 to 18
# coverage combine
# coverage xml


⚠️ Potential issue | 🟡 Minor

Coverage reporting will be broken with coverage combine and coverage xml commented out.

The CI workflow's "Upload Coverage" step expects a Cobertura XML file. With these lines disabled, --parallel coverage data from the two test runs (lines 5–6 and 12–15) won't be merged or exported, so the coveralls upload will fail or report incomplete data. Fine for a draft, but these must be restored alongside the other commented-out blocks.

🤖 Prompt for AI Agents
In `@runtests` around lines 17 - 18, Restore the commented coverage-export
commands so coverage data from the parallel test runs is merged and exported:
re-enable "coverage combine" and "coverage xml" (and any other commented blocks
related to coverage export) in the runtests script so the --parallel runs are
combined into a Cobertura XML for the CI "Upload Coverage" step; ensure the
restored commands run after both parallel test invocations and produce the
expected XML artifact for the Upload Coverage job.
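The restored tail of the runtests script would look roughly like this (a sketch of the assumed layout, not the verbatim file):

```shell
# after both --parallel test invocations: merge the .coverage.* data
# files, then export Cobertura XML for the CI "Upload Coverage" step
coverage combine
coverage xml
```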

@nemesifier
Member Author

@coderabbitai do not review

@coderabbitai

coderabbitai bot commented Feb 6, 2026

Tip

For best results, initiate chat on the files or code changes.

@nemesifier: Understood! I won't review this PR. Feel free to mention me again if you need anything in the future.

@nemesifier nemesifier marked this pull request as ready for review February 6, 2026 00:24
@nemesifier
Member Author

@coderabbitai review

@coderabbitai

coderabbitai bot commented Feb 6, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@coveralls

Coverage Status

coverage: 98.656% (first build) when pulling 2177fea on test-hanging into 109f28e on master.

@nemesifier nemesifier merged commit 2177fea into master Feb 6, 2026
21 checks passed
@nemesifier nemesifier deleted the test-hanging branch February 6, 2026 00:49