Develop #19
Conversation
- Check the CI workflow files and adjust them.
- Update the code to pass some CI checks.
Remove repo-root .dockerignore, keep start/stop-weblate.sh local-only via .gitignore, and drop stale REUSE annotation for the removed file.
Regenerated with DJANGO_SETTINGS_MODULE=weblate.settings_test, CI_* database env, and migrations applied so the spec matches api.yml / spectacular on CI.
The API Lint job inherits DJANGO_SETTINGS_MODULE=weblate.settings_test, but docs/specs/openapi.yaml is produced with default weblate.settings (localhost). Override the Generate OpenAPI step so regenerated spec matches the committed file.
- settings_example: DATABASES honor CI_* like CI runners; SITE_DOMAIN and DATA_DIR from DJANGO_SITE_DOMAIN / WEBLATE_DATA_DIR for a reproducible spec.
- API workflow and docs/Makefile use weblate.settings_example with the same env as local (no gitignored settings.py); fixes exit 2 when weblate.settings pointed at Postgres :5432 while CI maps :60000.
- Refresh openapi.yaml (VcsEnum without the optional github backend when absent).
- subprocess.run: set check=False where returncode is handled
- create_component_and_add_translation: RuntimeError, rename unused project
- Django scripts: pylint disable for imports after django.setup()
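The first bullet describes handling exit codes manually with `check=False`. A minimal sketch of that pattern follows; `run_step` is a hypothetical helper, not code from this PR:

```python
import subprocess
import sys


def run_step(cmd: list[str]) -> bool:
    """Run a command without raising, handling the exit code explicitly."""
    # check=False: we inspect returncode ourselves instead of letting
    # subprocess raise CalledProcessError.
    result = subprocess.run(cmd, capture_output=True, text=True, check=False)
    if result.returncode != 0:
        print(f"command failed ({result.returncode}): {result.stderr}", file=sys.stderr)
        return False
    return True


ok = run_step([sys.executable, "-c", "print('hello')"])
failed = run_step([sys.executable, "-c", "import sys; sys.exit(3)"])
```

This keeps the success/failure decision visible at the call site, which is what makes the pylint/ruff annotations in the bullet above unnecessary for the happy path.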
…rvices
- Extend scripts per-file-ignores to scripts/**/*.py (nested auto/backup scripts) so ruff matches upstream intent of scripts/* for standalone tooling
- Fix D205/TRY300 in boost_endpoint/services (docstrings, try/else for returns)
- Move success returns into try body (pylint)
- Ignore TRY300 for this file (same tradeoff as openrouter_translator)
- Mark Codecov patch status as informational so large PR diffs are not blocked.
- Add Spectacular post-hook to derive operation summaries from descriptions.
- Regenerate docs/specs/openapi.yaml so verify-openapi and Redocly pass.
- Post-process webhooks: set operationId and remove DRF error examples from webhook request bodies (they referenced the messaging schema incorrectly).
- Align the ProjectMachinerySettings OpenApiExample with the ProjectMachinerySettings schema.
- Regenerate docs/specs/openapi.yaml.
actions/checkout defaults to the merge ref for pull_request. Regenerating openapi on that tree differs from docs/specs/openapi.yaml committed on the branch; git diff --exit-code then fails. Use the PR head SHA so generation and commit are the same snapshot.
Add gerrit, mercurial, and subversion to VcsEnum and vcs field descriptions. These backends were missing because git-svn, mercurial, and git-review were not installed locally when the spec was last generated.
…eam v5.16.1 Remove custom Redocly post-hooks (ensure_operation_summaries, fix_webhook_operations_for_redocly) and the serializer example change that diverged from upstream. Regenerate openapi.yaml accordingly. Redocly lint still passes (warnings only, no errors).
- Address SC2012/SC2162/SC2181/SC1091 in scripts/backup/restore_to_local.sh
- Regenerate docs/specs/openapi.yaml: add github VCS and align servers URL with settings_test output
…ation CI spectacular run does not emit github in VcsEnum; keeping it caused git diff to fail after make update-openapi.
Feature/docker deploy
📝 Walkthrough

This PR introduces Docker-based CI/CD deployment infrastructure, adds support for the QuickBook format, refactors error handling and type annotations across weblate modules, removes legacy startup/shutdown scripts, consolidates backup script logic, and updates configuration files for REUSE license compliance.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: 2 passed, 1 failed (inconclusive)
Actionable comments posted: 9
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
weblate/formats/asciidoc.py (1)
1-18: ⚠️ Potential issue | 🟠 Major

Missing `from __future__ import annotations` import. The coding guidelines require this import for all Python files. Add it after the license header.
Proposed fix
```diff
 # Copyright © Boost Organization <boost@boost.org>
 #
 # SPDX-License-Identifier: GPL-3.0-or-later

+from __future__ import annotations
+
 """AsciiDoc file format support for Weblate (po4a-based)."""
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@weblate/formats/asciidoc.py` around lines 1 - 18, this file is missing the required future import; add "from __future__ import annotations" immediately after the license/header comment at the top of the module (before any other imports) so that forward references use postponed evaluation; place it above the existing imports (e.g., before os, pathlib, shutil) in the weblate.formats.asciidoc module.

weblate/boost_endpoint/services.py (2)
142-152: ⚠️ Potential issue | 🟡 Minor

Preserve `report_error()` on clone failures. Switching `git clone` to `check=False` means the common failure path now only logs and returns `False`. That drops the centralized error reporting this service otherwise uses for repository failures, which will make branch/auth/outage problems much harder to trace in production.

Suggested change
```diff
 if result.returncode != 0:
     LOGGER.error("Failed to clone: %s", result.stderr)
+    report_error(cause="Boost component clone")
     return False
```

Based on learnings, "Use `weblate.utils.errors.report_error()` for error handling and reporting."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@weblate/boost_endpoint/services.py` around lines 142 - 152, The clone failure path currently only logs and returns False after running subprocess.run with check=False, which drops centralized reporting; update the error handling so that when result.returncode != 0 you call weblate.utils.errors.report_error(...) with the command, exit code and stderr (alongside the existing LOGGER.error call) and then return False; locate the block around subprocess.run (variables cmd and result) and ensure report_error is imported/used to preserve centralized repository failure reporting.
694-700: ⚠️ Potential issue | 🟠 Major

Don't swallow a failed `git status` here. With `check=False`, a broken repository state now looks the same as "no staged changes": the method skips commit/push and still counts the component as deleted. This should still fail the cleanup path or at least append an explicit error when `git status` returns non-zero.

Suggested change
```diff
         status = subprocess.run(
             ["git", "-C", base_path, "status", "--porcelain"],
             capture_output=True,
             text=True,
             timeout=10,
             check=False,
         )
+        if status.returncode != 0:
+            raise subprocess.CalledProcessError(
+                status.returncode,
+                status.args,
+                output=status.stdout,
+                stderr=status.stderr,
+            )
         if status.stdout.strip():
```

As per coding guidelines, "Always use try/except for VCS operations and handle errors gracefully".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@weblate/boost_endpoint/services.py` around lines 694 - 700, The subprocess.run call performing git status (the invocation using ["git", "-C", base_path, "status", "--porcelain"] with capture_output/text/timeout/check=False) is currently swallowing failures; wrap that call in a try/except and treat non-zero return codes as errors (or inspect status.returncode) so the cleanup path does not silently proceed — on error either raise or append a clear error to the component's error list/log and skip marking the component as deleted; update the code around that subprocess.run usage to use check=True or explicitly handle status.returncode != 0 and include stderr in the error message so VCS failures are surfaced.
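The fix described above can be sketched without Weblate's codebase; `checked_run` is a hypothetical stand-in that raises `CalledProcessError` instead of swallowing a non-zero exit:

```python
import subprocess
import sys


def checked_run(cmd: list[str]) -> str:
    """Run a command and surface failures instead of swallowing them."""
    result = subprocess.run(cmd, capture_output=True, text=True, timeout=10, check=False)
    if result.returncode != 0:
        # Re-raise so a broken repository state is not mistaken for
        # "no staged changes"; stderr is preserved for the error report.
        raise subprocess.CalledProcessError(
            result.returncode, result.args, output=result.stdout, stderr=result.stderr
        )
    return result.stdout


out = checked_run([sys.executable, "-c", "print('clean')"])
try:
    checked_run([sys.executable, "-c", "import sys; sys.exit(2)"])
    raised = False
except subprocess.CalledProcessError as exc:
    raised = exc.returncode == 2
```

The caller can then wrap the call in try/except and decide whether to abort the cleanup or record the error, instead of silently proceeding.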
🧹 Nitpick comments (3)
weblate/formats/asciidoc.py (1)
65-68: Consider moving the import to top level or using `TYPE_CHECKING`. The import of `STATE_FUZZY` inside the method body works but is non-idiomatic. If this is to avoid circular imports, consider using a `TYPE_CHECKING` guard or restructuring.

Move import to module level
```diff
 from weblate.formats.convert import ConvertFormat
 from weblate.utils.errors import report_error
+from weblate.utils.state import STATE_FUZZY
```

Then remove the inner import at lines 65-66.
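For reference, a minimal sketch of the `TYPE_CHECKING` guard mentioned above (a generic example, not Weblate code; note that a symbol used at runtime, like `STATE_FUZZY` in the fuzzy-state check, still needs a real import):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only for type checkers; never executed at runtime,
    # so it cannot create an import cycle.
    from collections.abc import Sequence


def first_item(items: Sequence[int]) -> int:
    # With postponed evaluation the annotation above stays a string at
    # runtime, so Sequence only needs to exist during type checking.
    return items[0]


value = first_item([5, 6])
```

This is why the guard only helps for type-only imports; a runtime dependency should move to the module level instead.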
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@weblate/formats/asciidoc.py` around lines 65 - 68, the code currently imports STATE_FUZZY inside the function (near the existing_unit/state check), which is non-idiomatic; move the import of STATE_FUZZY to the module level in weblate.formats.asciidoc (or, if this was to avoid a circular import, add the import under a typing.TYPE_CHECKING guard and reference the symbol at runtime via a local import only when necessary), then remove the inner import and keep the existing_unit/state check and the po.markfuzzy(True) call as-is; ensure the import references STATE_FUZZY exactly and that no circular import is introduced.

scripts/backup/recalculate_stats.py (1)
6-9: Consider adding `from __future__ import annotations` and type hints. Per coding guidelines, Python files should use `from __future__ import annotations` for forward references, and type hints are required.

Suggested change
```diff
 #!/usr/bin/env python3
 # Copyright © Boost Organization <boost@boost.org>
 #
 # SPDX-License-Identifier: GPL-3.0-or-later

+from __future__ import annotations
+
 """Recalculate statistics for all components in a project."""

 import os
 import sys
```

And for the function signature:
```diff
-def recalculate_stats(project_slug):
+def recalculate_stats(project_slug: str) -> None:
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/backup/recalculate_stats.py` around lines 6 - 9, add "from __future__ import annotations" at the top of scripts/backup/recalculate_stats.py and annotate all function signatures and public variables with type hints (e.g., main(), recalculate_stats(), any helper functions and their parameters/returns) using PEP 484 style; ensure imports for typing (Optional, List, Dict, Any, Path) are added where used and update local variable annotations inside functions (e.g., os/path-related variables) so the module complies with the project's type-hinting guidelines.

.github/workflows/cd.yml (1)
1-13: Serialize deploy runs for this environment. Two manual dispatches can currently race on the same host and interleave `git pull` / `docker compose` operations. Adding a workflow-level concurrency group avoids overlapping production deploys.

Suggested change
```diff
 name: CD

 on:
   workflow_dispatch:

+concurrency:
+  group: production-cd
+  cancel-in-progress: false
+
 # Restrict GITHUB_TOKEN to the minimum (Scorecard / OpenSSF); deploy uses SSH secrets, not the token.
 permissions:
   contents: read
```
Verify each finding against the current code and only fix it if needed. In @.github/workflows/cd.yml around lines 1 - 13, Add a workflow-level concurrency block to the "CD" workflow to serialize deploy runs so manual dispatches cannot overlap; add a top-level concurrency stanza (not inside job) with a stable group name that identifies this environment (e.g., "cd-deploy-production" or use an expression like github.workflow + "-deploy") and set cancel-in-progress to false so new manual dispatches queue rather than cancel running jobs; ensure the concurrency block is present for the workflow named "CD" that contains the "deploy" job.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/cd.yml:
- Around line 29-30: Remove the destructive docker compose down step and perform
an in-place atomic replacement by running only the reconciliation command (keep
or update the existing docker compose up -d --build invocation); specifically,
delete the "docker compose down" invocation and ensure the workflow uses "docker
compose up -d --build" (optionally adding flags like --remove-orphans or
--no-recreate if desired) so the live stack is not torn down before the new
containers are ready.
In `@scripts/auto/setup_project.py`:
- Around line 50-53: Validate and sanitize the repository URL and branch before
building and running the git clone command: ensure clone_url (repo_url) uses an
allowed scheme (e.g., https or ssh/git@) by parsing it (reject unsafe schemes
like file:, data:, or empty strings) and normalize/whitelist acceptable
patterns, and validate branch against a strict regex (allow letters, digits,
dot, underscore, hyphen, slash; disallow leading hyphen or slash and control
characters) and reject or raise ValueError/exit if invalid; update the code
around cmd/subprocess.run (references: cmd, clone_url, branch, target_dir,
subprocess.run) to perform these checks and only call subprocess.run when both
validations pass.
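A hedged sketch of the validation the prompt describes; `validate_clone_url` and `validate_branch` are hypothetical helpers, and the exact scheme allow-list and branch pattern are assumptions:

```python
import re
from urllib.parse import urlparse

# Assumed allow-list; the real script may accept more schemes.
ALLOWED_SCHEMES = {"https", "ssh"}
# Letters, digits, dot, underscore, hyphen, slash; no leading '-' or '/'.
BRANCH_RE = re.compile(r"^[A-Za-z0-9][A-Za-z0-9._/-]*$")


def validate_clone_url(url: str) -> bool:
    """Reject empty URLs and unsafe schemes such as file: or data:."""
    if not url:
        return False
    if url.startswith("git@"):  # scp-like syntax, e.g. git@github.com:org/repo.git
        return True
    return urlparse(url).scheme in ALLOWED_SCHEMES


def validate_branch(name: str) -> bool:
    """Reject names with control characters or a leading '-'/'/'."""
    return bool(BRANCH_RE.match(name)) and not any(ord(c) < 32 for c in name)
```

Only when both checks pass should the script build the `git clone` command and call `subprocess.run`.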
In `@scripts/backup/restore_to_local.sh`:
- Around line 45-47: The current DB_SQL and FILES_ARCHIVE assignments use "find
... | head -n1" which is nondeterministic; update both assignments (DB_SQL and
FILES_ARCHIVE) to sort the matching filenames by name (use a stable
numeric/version aware sort such as sort -V or lexicographic reverse sort) and
then pick the first entry so the newest timestamped filename is chosen
deterministically; keep the existing find and sed steps but insert a pipe to
sort (e.g. sort -V -r or sort -r) before head -n1.
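The same deterministic selection can be illustrated in Python; the filenames are made up, and `newest` is a hypothetical helper mirroring `sort -r | head -n1`:

```python
# Timestamped backup names sort lexicographically, so the newest file
# can be picked by a reverse sort instead of relying on filesystem
# enumeration order (which is what makes bare `find | head -n1` flaky).
candidates = [
    "weblate-db-20240103-120000.sql",
    "weblate-db-20240105-090000.sql",
    "weblate-db-20240104-230000.sql",
]


def newest(files: list[str]) -> str:
    """Return the lexicographically greatest name, i.e. the latest timestamp."""
    return sorted(files, reverse=True)[0]


latest = newest(candidates)
```

As long as the timestamp format is fixed-width (zero-padded), plain lexicographic sort is enough; `sort -V` only matters for version-style names.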
In `@scripts/spdx-license-list`:
- Line 1: The submodule reference in scripts/spdx-license-list points to an
unreachable SHA (f7b69b12cf4c063d9c42c0c72945978c87f3192c) on the remote; fix by
updating the submodule to a reachable commit or pushing that commit to the
remote. Locate the submodule entry in scripts/spdx-license-list (and .gitmodules
if present), change the commit to a valid SHA from
https://github.com/spdx/license-list-data.git (or re-add the correct remote and
push the missing commit), then run the usual submodule update (git submodule
sync && git submodule update --init --recursive) and commit the updated
submodule reference.
In `@weblate-docker`:
- Line 1: The submodule pointer currently references commit
9ba5d7396580ccda8bd4e34f109e59831d60f5be which is not on the tracked main
branch; update the submodule to a commit that exists on the main branch (e.g.,
the current main tip 64db60c585d76e0bb9e86ca24f3e67bd7825181e) by entering the
submodule directory, fetching remote refs, checking out or fast-forwarding to
main, updating the worktree to the desired main commit, then git-add and commit
the updated submodule pointer in the parent repository (ensure .gitmodules still
tracks branch = main and push the parent repo commit).
In `@weblate/formats/quickbook.py`:
- Around line 110-125: The code currently logs a mismatch between tmpl_units and
trans_units using report_error but continues, leaving blank targets; change it
to abort the import by raising an exception after logging: in the block that
checks if len(tmpl_units) != len(trans_units) (and references tmpl_units,
trans_units, report_error, storefile_path, Path(storefile_path).name), call
report_error as now and then raise a descriptive exception (e.g., ValueError or
a custom ImportError) so the caller fails instead of returning a store with
empty targets; likewise ensure the outer except block that catches Exception as
exc still re-raises (after logging) so parse errors don’t get swallowed.
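A Django-free sketch of the fail-fast behavior the prompt asks for; `pair_units` is hypothetical and stands in for the store-parsing code:

```python
def pair_units(tmpl_units: list[str], trans_units: list[str]) -> list[tuple[str, str]]:
    """Abort on a template/translation length mismatch instead of
    silently producing blank targets."""
    if len(tmpl_units) != len(trans_units):
        # In the real service this would also call report_error() before raising.
        raise ValueError(
            f"unit count mismatch: {len(tmpl_units)} template "
            f"vs {len(trans_units)} translated"
        )
    return list(zip(tmpl_units, trans_units))


pairs = pair_units(["Hello"], ["Bonjour"])
try:
    pair_units(["a", "b"], ["a"])
    mismatch_raised = False
except ValueError:
    mismatch_raised = True
```

Raising here makes the caller fail loudly, rather than returning a store whose extra units have empty targets.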
In `@weblate/settings_docker.py`:
- Around line 1497-1503: The new settings introduce nonconforming env names and
change the default sleep from 150s to 300s; rename the keys to follow the file's
WEBLATE_* convention (e.g., use WEBLATE_AUTO_BATCH_TRANSLATE_VIA_OPENROUTER and
WEBLATE_BOOST_ENDPOINT_ADD_TRANSLATION_SECONDS) and keep the default
BOOST_ENDPOINT_ADD_TRANSLATION_SECONDS value at 150 to preserve existing
behavior; update the get_env_bool/get_env_int calls that set
AUTO_BATCH_TRANSLATE_VIA_OPENROUTER and BOOST_ENDPOINT_ADD_TRANSLATION_SECONDS
accordingly so we don't alter the autobatch timing used by
add_new_language/time.sleep(ADD_TRANSLATION_SECONDS) in
weblate/boost_endpoint/services.py.
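A minimal sketch of the renamed settings; `get_env_bool` / `get_env_int` here are simplified stand-ins and may not match the real helpers in `settings_docker.py`:

```python
import os


def get_env_bool(name: str, default: bool = False) -> bool:
    """Simplified stand-in: truthy strings enable the flag."""
    return os.environ.get(name, str(default)).lower() in {"1", "true", "yes"}


def get_env_int(name: str, default: int) -> int:
    """Simplified stand-in: fall back to the default on bad input."""
    try:
        return int(os.environ.get(name, default))
    except ValueError:
        return default


# WEBLATE_* naming keeps the keys consistent with the rest of the file,
# and the default stays at 150 seconds to preserve existing timing.
os.environ["WEBLATE_BOOST_ENDPOINT_ADD_TRANSLATION_SECONDS"] = "150"
ADD_TRANSLATION_SECONDS = get_env_int(
    "WEBLATE_BOOST_ENDPOINT_ADD_TRANSLATION_SECONDS", 150
)
AUTO_BATCH = get_env_bool("WEBLATE_AUTO_BATCH_TRANSLATE_VIA_OPENROUTER", False)
```

Keeping the default at 150 avoids silently doubling the `time.sleep(ADD_TRANSLATION_SECONDS)` delay used by the autobatch path.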
In `@weblate/trans/tasks.py`:
- Around line 822-825: When constructing the synthetic request (the
AuthenticatedHttpRequest instance) before invoking component methods, guard the
User.objects.get(pk=user_id) call against the user being deleted by catching the
User.DoesNotExist (or Exception) and leaving request as None; update the code
around AuthenticatedHttpRequest / request assignment so that if the lookup fails
you fall back to request = None (the component methods already accept
request=None). Apply the same change to the other similar block that creates the
synthetic request.
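A Django-free sketch of the fallback; `DoesNotExist`, `USERS`, and `build_request` are stand-ins for `User.DoesNotExist`, the ORM lookup, and the synthetic-request construction:

```python
class DoesNotExist(Exception):
    """Stand-in for Django's User.DoesNotExist."""


USERS = {1: "admin"}  # stand-in for the User table


def get_user(pk: int) -> str:
    if pk not in USERS:
        raise DoesNotExist(pk)
    return USERS[pk]


def build_request(user_id: int):
    """Return a synthetic request, or None if the user was deleted."""
    try:
        # Stands in for AuthenticatedHttpRequest with request.user set.
        return {"user": get_user(user_id)}
    except DoesNotExist:
        # Component methods already accept request=None, so a deleted
        # user degrades gracefully instead of crashing the task.
        return None


req = build_request(1)
missing = build_request(42)
```

The same guard would be applied to both places in `tasks.py` that construct the synthetic request.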
In `@weblate/utils/openrouter_translator.py`:
- Around line 142-150: The code currently only checks raw_content is None after
raw_content = completion.choices[0].message.content, but treats empty or
whitespace-only strings as valid and later fails in JSON parsing; update the
check so that after assigning raw_content you treat both None and
blank/whitespace-only values as invalid (e.g., if raw_content is None or not
raw_content.strip():), call self.log_error("Batch translation API returned empty
message content") and raise the same ValueError (or similar) before creating
response_text; this change should be applied in the method containing
raw_content/response_text handling in openrouter_translator.py so blank outputs
are rejected early.
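The early-rejection check can be sketched in isolation; `extract_content` is a hypothetical stand-in for the method's content handling:

```python
from __future__ import annotations


def extract_content(raw_content: str | None) -> str:
    """Reject None and blank/whitespace-only model output early."""
    if raw_content is None or not raw_content.strip():
        # Mirrors the suggested check; the real method also calls
        # self.log_error() before raising.
        raise ValueError("Batch translation API returned empty message content")
    return raw_content.strip()


text = extract_content('{"translations": []}')

blank_rejected = True
for bad in (None, "", "   \n"):
    try:
        extract_content(bad)
        blank_rejected = False
    except ValueError:
        pass
```

Failing here gives a clear error instead of a confusing JSON decode failure further down.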
---
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: ec030e55-6fc1-4d7a-b25b-b16dd8fc9e2d
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (30)
- .github/workflows/cd.yml
- .github/workflows/macos.yml
- .gitignore
- .gitmodules
- REUSE.toml
- codecov.yml
- docs/specs/openapi.yaml
- pyproject.toml
- scripts/auto/create_component_and_add_translation.py
- scripts/auto/setup_project.py
- scripts/backup/backup_from_server.sh
- scripts/backup/dump_database.sh
- scripts/backup/recalculate_stats.py
- scripts/backup/restore_to_local.sh
- scripts/backup/sync_database_to_files.py
- scripts/backup/update_push_urls.py
- scripts/spdx-license-list
- start-weblate.sh
- stop-weblate.sh
- weblate-docker
- weblate/boost_endpoint/services.py
- weblate/boost_endpoint/views.py
- weblate/formats/asciidoc.py
- weblate/formats/quickbook.py
- weblate/settings_docker.py
- weblate/trans/autobatchtranslate.py
- weblate/trans/models/component.py
- weblate/trans/tasks.py
- weblate/utils/openrouter_translator.py
- weblate/utils/quickbook.py
💤 Files with no reviewable changes (2)
- stop-weblate.sh
- start-weblate.sh