Merged

Upd #62

8 changes: 6 additions & 2 deletions .github/workflows/verify.yml
Original file line number Diff line number Diff line change
@@ -145,11 +145,15 @@ jobs:
. venv/bin/activate
uv pip install -r requirements.txt -r requirements-test.txt
- name: Download all coverage artifacts
uses: actions/download-artifact@v4
uses: actions/download-artifact@v5
with:
pattern: coverage-*
merge-multiple: true
path: ${{ github.workspace }}/artifacts
- name: Combine coverage results
run: |
. venv/bin/activate
coverage combine coverage*/.coverage*
coverage combine artifacts/.coverage*
💡 Verification agent

🧩 Analysis chain

Double-check coverage discovery after layout change

Combining via coverage combine artifacts/.coverage* should work with merged artifacts. Verify at least one .coverage file lands there.

Run this to confirm:


🏁 Script executed:

#!/bin/bash
set -euo pipefail
echo "Downloaded artifacts:"
ls -la artifacts || true
echo "Coverage files discovered:"
ls -la artifacts/.coverage* || { echo "No coverage files found under artifacts/"; exit 1; }

Length of output: 418


Coverage combine will fail — artifacts/.coverage* not found

I ran the verification script: the artifacts/ directory does not exist and no .coverage files were found, so the step will fail at runtime.

  • Location: .github/workflows/verify.yml — line 156: coverage combine artifacts/.coverage*
  • Evidence: script output showed "ls: cannot access 'artifacts': No such file or directory" and "No coverage files found under artifacts/"
  • Action items:
    • Ensure the job that produces coverage uploads the files (actions/upload-artifact) and this workflow downloads them (actions/download-artifact) into artifacts/.
    • Or update the glob/path to the actual location where coverage files are produced.
    • As a safety measure, guard the combine step or fail with a clear message if no coverage files are present.
🤖 Prompt for AI Agents
.github/workflows/verify.yml around line 156: the step runs "coverage combine
artifacts/.coverage*" but artifacts/ doesn't exist and no .coverage files are
present; update the workflow to either (A) download uploaded coverage artifacts
into artifacts/ using actions/download-artifact (ensure the producing job
uploads them with actions/upload-artifact), (B) change the glob to the actual
path where coverage files are produced, or (C) add a guard that checks for files
before running coverage combine and prints a clear error and exits non-zero if
none found; implement one of these fixes so the step never fails due to a
missing directory or missing coverage files.
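The guard option above can be sketched in Python (a hypothetical pre-check, not part of this PR; the workflow itself runs the equivalent in shell, and the `artifacts/` layout is assumed from the download step):

```python
# Hypothetical pre-check for the combine step: fail fast with a clear
# message when no coverage data files were downloaded, instead of letting
# `coverage combine` error out on a missing artifacts/ directory.
import glob
import os


def find_coverage_files(root: str = "artifacts") -> list[str]:
    """Return coverage data files under root, e.g. artifacts/.coverage.py311."""
    return sorted(glob.glob(os.path.join(root, ".coverage*")))


files = find_coverage_files()
if not files:
    # In the workflow step this would print to stderr and `exit 1`;
    # the point is a clear message rather than a cryptic glob failure.
    print("No coverage files found under artifacts/")
else:
    print(f"Found {len(files)} coverage file(s)")
```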

coverage report --fail-under=85
coverage xml
- name: Upload coverage to Codecov
19 changes: 16 additions & 3 deletions .pre-commit-config.yaml
@@ -49,7 +49,7 @@ repos:
- --quiet
- --format=custom
- --configfile=tests/bandit.yaml
files: ^(airos|tests)/.+\.py$
files: ^(airos|tests|script)/.+\.py$
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.37.1
hooks:
@@ -75,17 +75,30 @@ repos:
- --py39-plus
- --force
- --keep-updates
files: ^(airos|tests)/.+\.py$
files: ^(airos|tests|script)/.+\.py$
- repo: https://github.com/igorshubovych/markdownlint-cli
rev: v0.45.0
hooks:
- id: markdownlint
- repo: local
hooks:
- id: pytest
name: "pytest"
entry: script/run-in-env.sh pytest
language: script
types: [python]
pass_filenames: false
files: ^(airos|tests|script)/.+\.py$
- id: pylint
name: "pylinting"
entry: script/run-in-env.sh pylint -j 0
language: script
types: [python]
files: ^(airos|tests|script)/.+\.py$
- id: mypy
name: mypy
entry: script/run-in-env.sh mypy
language: script
require_serial: true
types_or: [python, pyi]
files: ^(airos|tests|scripts)/.+\.(py|pyi)$
files: ^(airos|tests|script)/.+\.(py|pyi)$
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,13 @@

All notable changes to this project will be documented in this file.

## [0.2.10] - 2025-08-13

### Changed

- Maintenance chores
- Added pylint and pytest (and applicable changes)

## [0.2.9] - 2025-08-12

### Changed
2 changes: 1 addition & 1 deletion airos/airos8.py
@@ -193,7 +193,7 @@ async def login(self) -> bool:
_LOGGER.info("Login task was cancelled")
raise

def derived_data(self, response: dict[str, Any] = {}) -> dict[str, Any]:
def derived_data(self, response: dict[str, Any]) -> dict[str, Any]:
"""Add derived data to the device response."""
derived: dict[str, Any] = {
"station": False,
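The dropped `= {}` default above avoids a classic Python pitfall: default values are evaluated once, at function definition time, so a mutable default is shared across every call that omits the argument. A minimal illustration (hypothetical function, not from airos):

```python
def append_item(item: int, bucket: list[int] = []) -> list[int]:
    # Bug: the same list object is reused for every call without `bucket`.
    bucket.append(item)
    return bucket


print(append_item(1))  # [1]
print(append_item(2))  # [1, 2] -- state leaked from the previous call
```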
17 changes: 16 additions & 1 deletion airos/data.py
@@ -61,7 +61,7 @@ def _redact(d: dict[str, Any]) -> dict[str, Any]:
for k, v in d.items():
if k in sensitive_keys:
if isinstance(v, str) and (is_mac_address(v) or is_mac_address_mask(v)):
# Redact only the first 6 hex characters of a MAC address
# Redact only the last part of a MAC address to a dummy value
redacted_d[k] = "00:11:22:33:" + v.replace("-", ":").upper()[-5:]
elif isinstance(v, str) and is_ip_address(v):
Comment on lines +64 to 66
🛠️ Refactor suggestion

Preserve IP family when redacting (IPv6 should redact to ::1)

Single-string IPs and lists of IP strings currently redact to 127.0.0.3 even for IPv6. Align with the new list-of-dicts behavior by preserving IPv4 vs IPv6 families.

Apply this diff:

-                elif isinstance(v, str) and is_ip_address(v):
-                    # Redact to a dummy local IP address
-                    redacted_d[k] = "127.0.0.3"
+                elif isinstance(v, str) and is_ip_address(v):
+                    # Redact to a dummy local IP address, preserving IP family
+                    redacted_d[k] = (
+                        "127.0.0.3"
+                        if ipaddress.ip_address(v).version == 4
+                        else "::1"
+                    )
@@
-                elif isinstance(v, list) and all(
-                    isinstance(i, str) and is_ip_address(i) for i in v
-                ):
-                    # Redact list of IPs to a dummy list
-                    redacted_d[k] = ["127.0.0.3"]  # type: ignore[assignment]
+                elif isinstance(v, list) and all(
+                    isinstance(i, str) and is_ip_address(i) for i in v
+                ):
+                    # Redact list of IPs to a dummy list, preserving each item's IP family
+                    redacted_d[k] = [
+                        "127.0.0.3" if ipaddress.ip_address(i).version == 4 else "::1"
+                        for i in v
+                    ]  # type: ignore[assignment]

Also applies to: 69-74
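The family check in the suggestion boils down to `ipaddress.ip_address(...).version`; a standalone sketch (the `redact_ip` helper name is illustrative, not part of airos.data):

```python
import ipaddress


def redact_ip(value: str) -> str:
    """Redact an IP to a dummy loopback value, preserving IPv4/IPv6 family."""
    return "127.0.0.3" if ipaddress.ip_address(value).version == 4 else "::1"


print(redact_ip("192.168.1.10"))  # 127.0.0.3
print(redact_ip("fe80::1"))       # ::1
```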

# Redact to a dummy local IP address
@@ -71,6 +71,21 @@ def _redact(d: dict[str, Any]) -> dict[str, Any]:
):
# Redact list of IPs to a dummy list
redacted_d[k] = ["127.0.0.3"] # type: ignore[assignment]
elif isinstance(v, list) and all(
isinstance(i, dict) and "addr" in i and is_ip_address(i["addr"])
for i in v
):
# Redact list of dictionaries with IP addresses to a dummy list
redacted_list = []
for item in v:
redacted_item = item.copy()
redacted_item["addr"] = (
"127.0.0.3"
if ipaddress.ip_address(redacted_item["addr"]).version == 4
else "::1"
)
redacted_list.append(redacted_item)
redacted_d[k] = redacted_list # type: ignore[assignment]
else:
redacted_d[k] = "REDACTED"
elif isinstance(v, dict):
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "airos"
version = "0.2.9"
version = "0.2.10"
license = "MIT"
description = "Ubiquity airOS module(s) for Python 3."
readme = "README.md"
1 change: 1 addition & 0 deletions requirements-test.txt
@@ -9,3 +9,4 @@ aiofiles==24.1.0
radon==6.0.1
types-aiofiles==24.1.0.20250809
mypy==1.17.1
pylint==3.3.7
4 changes: 2 additions & 2 deletions script/generate_ha_fixture.py
@@ -42,13 +42,13 @@ def generate_airos_fixtures() -> None:
_LOGGER.info("Processing '%s'...", filename)

try:
with open(base_fixture_path) as source:
with open(base_fixture_path, encoding="utf-8") as source:
source_data = json.loads(source.read())

derived_data = AirOS.derived_data(None, source_data) # type: ignore[arg-type]
new_data = AirOSData.from_dict(derived_data)

Comment on lines 48 to 50
🛠️ Refactor suggestion

Stop calling an instance method with a dummy self; make derived_data a staticmethod

This script currently calls AirOS.derived_data(None, source_data), which is a code smell and will trip type checkers/linters. The method doesn’t use self; promote it to a @staticmethod and call it with a single argument.

Apply this diff here:

-                derived_data = AirOS.derived_data(None, source_data)  # type: ignore[arg-type]
+                derived_data = AirOS.derived_data(source_data)

And update the method in airos/airos8.py (outside this file) as:

# In airos/airos8.py
@staticmethod
def derived_data(response: dict[str, Any]) -> dict[str, Any]:
    ...

I can push the coordinated changes across both files if you want.

🤖 Prompt for AI Agents
In script/generate_ha_fixture.py around lines 48-50, stop calling the instance
method with a dummy self; replace AirOS.derived_data(None, source_data) with
AirOS.derived_data(source_data) and remove the type: ignore. Also update
airos/airos8.py so derived_data is declared as a @staticmethod with signature
def derived_data(response: dict[str, Any]) -> dict[str, Any]: (remove self), and
update any other callers to pass only the response argument.
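The suggested promotion can be seen in miniature below (`AirOSSketch` and its return value are illustrative stand-ins, not the real AirOS API):

```python
from typing import Any


class AirOSSketch:
    """Minimal stand-in for AirOS, showing the suggested @staticmethod shape."""

    @staticmethod
    def derived_data(response: dict[str, Any]) -> dict[str, Any]:
        # No self parameter: the method only transforms its input.
        return {**response, "station": False}


# Callable on the class itself -- no instance and no dummy `None` self:
print(AirOSSketch.derived_data({"host": "example"}))
```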

with open(new_fixture_path, "w") as new:
with open(new_fixture_path, "w", encoding="utf-8") as new:
json.dump(new_data.to_dict(), new, indent=2, sort_keys=True)

_LOGGER.info("Successfully created '%s'", new_filename)
20 changes: 14 additions & 6 deletions script/mashumaro-step-debug.py
@@ -6,13 +6,13 @@
import sys
from typing import Any

current_script_dir = os.path.dirname(os.path.abspath(__file__))
project_root_dir = os.path.abspath(os.path.join(current_script_dir, os.pardir))
_current_script_dir = os.path.dirname(os.path.abspath(__file__))
_project_root_dir = os.path.abspath(os.path.join(_current_script_dir, os.pardir))

if project_root_dir not in sys.path:
sys.path.append(project_root_dir)
if _project_root_dir not in sys.path:
sys.path.append(_project_root_dir)

from airos.data import AirOS8Data, Remote, Station, Wireless # noqa: E402
from airos.data import AirOS8Data, Interface, Remote, Station, Wireless # noqa: E402

logging.basicConfig(level=logging.DEBUG, stream=sys.stdout)
_LOGGER = logging.getLogger(__name__)
@@ -31,7 +31,7 @@ def main() -> None:
sys.path.append(project_root_dir)

# Load the JSON data
with open(sys.argv[1]) as f:
with open(sys.argv[1], encoding="utf-8") as f:
data = json.loads(f.read())

try:
@@ -63,6 +63,14 @@ def main() -> None:
wireless_obj = Wireless.from_dict(wireless_data) # noqa: F841
_LOGGER.info(" -> Success! The Wireless object is valid.")

_LOGGER.info(" -> Checking list of Interface objects...")
interfaces = data["interfaces"]
for i, interface_data in enumerate(interfaces):
_LOGGER.info(" -> Checking Interface object at index %s...", i)
_LOGGER.info(" Interface should be %s.", interface_data["ifname"])
interface_obj = Interface.from_dict(interface_data) # noqa: F841
_LOGGER.info(" Success! Interface is valid.")

_LOGGER.info("Attempting to deserialize full AirOS8Data object...")
airos_data_obj = AirOS8Data.from_dict(data) # noqa: F841
_LOGGER.info("Success! Full AirOS8Data object is valid.")
2 changes: 2 additions & 0 deletions tests/conftest.py
@@ -10,6 +10,8 @@

import aiohttp

# pylint: disable=redefined-outer-name, unnecessary-default-type-args


@pytest.fixture
def base_url() -> str:
2 changes: 2 additions & 0 deletions tests/test_discovery.py
@@ -15,6 +15,8 @@
from airos.exceptions import AirOSDiscoveryError, AirOSEndpointError, AirOSListenerError
import pytest

# pylint: disable=redefined-outer-name


# Helper to load binary fixture
async def _read_binary_fixture(fixture_name: str) -> bytes: