
feat: add daily cleanup workflow for stale CI schemas#2128

Merged
haritamar merged 6 commits into master from devin/1772278180-cleanup-stale-ci-schemas
Feb 28, 2026
Conversation


@devin-ai-integration (bot) commented Feb 28, 2026

feat: add daily cleanup workflow for stale CI schemas

Summary

Adds a scheduled GitHub Actions workflow (cleanup-stale-schemas.yml) that runs daily at 03:00 UTC to drop stale CI schemas (those with the py_ prefix) from cloud warehouses. The workflow can also be triggered manually with a configurable max-age-hours input (default: 24 hours).

The workflow does not duplicate any cleanup logic. Instead, it checks out dbt-data-reliability at runtime and invokes the drop_stale_ci_schemas dbt macro added in the companion PR: elementary-data/dbt-data-reliability#943 (already merged).

How it works:

  1. Checks out dbt-data-reliability (default branch, no pin)
  2. Installs dbt + the relevant adapter for each cloud target
  3. Generates profiles from dbt-data-reliability's template using CI_WAREHOUSE_SECRETS
  4. Runs dbt run-operation drop_stale_ci_schemas --args '{prefixes: ["py_"], max_age_hours: 24}' per warehouse

Targets: snowflake, bigquery, redshift, databricks_catalog, athena (docker-only targets like postgres/clickhouse are ephemeral and don't need cleanup).
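
Under the assumptions above, the trigger and matrix portion of such a workflow could look roughly like this. This is a sketch, not the merged file — step names, the input description, and the elided setup steps are illustrative:

```yaml
name: Cleanup stale CI schemas

on:
  schedule:
    - cron: "0 3 * * *"   # daily at 03:00 UTC
  workflow_dispatch:
    inputs:
      max-age-hours:
        description: "Drop CI schemas older than this many hours"
        default: "24"

jobs:
  cleanup:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        warehouse-type: [snowflake, bigquery, redshift, databricks_catalog, athena]
    steps:
      - name: Checkout dbt package
        uses: actions/checkout@v4
        with:
          repository: elementary-data/dbt-data-reliability
          path: dbt-data-reliability
      # ...install dbt + the matching adapter, generate profiles.yml, then:
      - name: Drop stale CI schemas
        run: >
          dbt run-operation drop_stale_ci_schemas
          --args '{prefixes: ["py_"], max_age_hours: ${{ inputs.max-age-hours || '24' }}}'
          -t "${{ matrix.warehouse-type }}"
```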

Updates since last revision

  • Schedule changed from weekly to daily: The cleanup job runs fast, so daily execution is preferred to keep stale CI schemas from accumulating. Cron changed from 0 3 * * 0 (Sundays) to 0 3 * * * (daily).
  • Companion PR merged: The dbt-data-reliability companion PR (#943) has been merged to master — this workflow is now safe to merge.

Review & Testing Checklist for Human

  • Verify CI_WAREHOUSE_SECRETS is available as a repo secret in elementary (it should be — existing test workflows use it).
  • YAML quoting on --args: Confirm the --args '{prefixes: ["py_"], max_age_hours: ...}' interpolation works correctly in GitHub Actions, especially the ${{ inputs.max-age-hours || '24' }} expression nested inside single quotes.
  • Test plan: Trigger this workflow manually via workflow_dispatch against one warehouse (e.g. snowflake) with a small max-age-hours value. Check the job logs to confirm schemas are listed and stale ones are dropped.
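
The quoting concern above can also be exercised locally with a small shell sketch. MAX_AGE_HOURS stands in for the workflow input (hypothetical here), and dbt itself is not invoked — the sketch only shows the validate-then-build pattern for the --args payload:

```shell
#!/usr/bin/env bash
# Sketch: validate the max-age-hours input, then build the --args JSON.
MAX_AGE_HOURS="24"   # in the workflow this would come from inputs.max-age-hours

# Reject anything that is not a non-negative integer before interpolating it.
if ! [[ "$MAX_AGE_HOURS" =~ ^[0-9]+$ ]]; then
  echo "max-age-hours must be a non-negative integer" >&2
  exit 1
fi

# Build the argument string with printf instead of nesting quotes inline.
ARGS=$(printf '{"prefixes": ["py_"], "max_age_hours": %s}' "$MAX_AGE_HOURS")
echo "$ARGS"
# The workflow would then run:
#   dbt run-operation drop_stale_ci_schemas --args "$ARGS" -t <target>
```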

Notes

  • The cleanup macro itself (with per-adapter dispatch for ClickHouse etc.) is reviewed in the companion dbt-data-reliability PR — already merged.
  • The dbt-data-reliability checkout is unpinned (tracks default branch HEAD). This is intentional for a daily maintenance job but means macro changes in that repo will be picked up automatically.

Requested by: @haritamar
Link to Devin run

Summary by CodeRabbit

  • Chores
    • Added an automated workflow that runs daily (03:00 UTC) and can be triggered manually to clean up stale CI/test schemas across supported warehouses (Snowflake, BigQuery, Redshift, Databricks Catalog, Athena).
    • Supports a configurable max-age-hours input (default 24), validates the input, and runs cleanup per selected warehouse; requires CI warehouse secrets to operate.

Re-uses the elementary.drop_stale_ci_schemas macro from
dbt-data-reliability (checked out at workflow time) to drop
py_-prefixed CI schemas older than 24 hours from cloud warehouses.

Runs weekly on Sunday 03:00 UTC.

Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

@github-actions

👋 @devin-ai-integration[bot]
Thank you for raising your pull request.
Please make sure to add tests and document all user-facing changes.
You can do this by editing the docs files in this pull request.


coderabbitai bot commented Feb 28, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

Adds a GitHub Actions workflow that runs daily (03:00 UTC) and on demand to remove stale CI schemas across five warehouse types. It runs a dbt run-operation with a configurable max-age-hours (default 24) and fixed prefixes ["py_"], and validates inputs and required secrets before execution.

Changes

Cohort / File(s): GitHub Actions Workflow — .github/workflows/cleanup-stale-schemas.yml

New workflow "Cleanup stale CI schemas". Triggers: daily cron (03:00 UTC) and workflow_dispatch (input max-age-hours, default 24). Defines TESTS_DIR and a matrix over snowflake, bigquery, redshift, databricks_catalog, and athena, with fail-fast disabled. Steps: check out dbt-data-reliability, set up Python 3.10 with a pip cache, install the warehouse-specific dbt package, generate profiles.yml, install test deps, symlink the local elementary package, and run dbt run-operation drop_stale_ci_schemas with prefixes: ["py_"] and max_age_hours per warehouse target. Validates that MAX_AGE_HOURS is a non-negative integer and requires CI_WAREHOUSE_SECRETS.

Sequence Diagram(s)

sequenceDiagram
    participant GH as "GitHub Actions"
    participant Runner as "CI Runner"
    participant Repo as "dbt-data-reliability repo"
    participant Env as "Runner env (Python/dbt)"
    participant DBT as "dbt run-operation"
    participant Warehouse as "Warehouse (snowflake / bigquery / redshift / databricks / athena)"

    GH->>Runner: trigger (cron daily 03:00 UTC or manual)
    Runner->>Runner: validate `MAX_AGE_HOURS` input (non-negative integer)
    Runner->>Runner: verify `CI_WAREHOUSE_SECRETS` present
    Runner->>Repo: checkout repo
    Runner->>Env: setup Python 3.10 & pip cache
    Runner->>Env: install warehouse-specific dbt package
    Runner->>Env: generate `profiles.yml` and install test deps
    Runner->>Env: symlink local `elementary` package into tests
    Runner->>DBT: run-operation `drop_stale_ci_schemas` (prefixes:["py_"], max_age_hours)
    DBT->>Warehouse: connect to target and drop stale schemas
    Warehouse-->>DBT: operation result
    DBT-->>Runner: return result
    Runner-->>GH: job status (matrix per warehouse)

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

🐇 I hop through CI at dawn's light,

I sniff the schemas left to roam,
With py_ as my lantern and hours in sight,
I tidy the garden and send them home,
A cheerful rabbit, keeping tests light.

🚥 Pre-merge checks | ✅ 3 passed
  • Description Check — ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed: the title "feat: add daily cleanup workflow for stale CI schemas" accurately and directly describes the main change, which introduces a new GitHub Actions workflow for cleaning up stale CI schemas.
  • Docstring Coverage — ✅ Passed: no functions found in the changed files to evaluate, so the docstring coverage check was skipped.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (1)
.github/workflows/cleanup-stale-schemas.yml (1)

52-60: Fail fast if CI_WAREHOUSE_SECRETS is not configured.

Line 53 silently falls back to empty string. Add an explicit guard so failures are immediate and actionable instead of surfacing later as opaque dbt/profile errors.

✅ Suggested fix
       - name: Write dbt profiles
         env:
           CI_WAREHOUSE_SECRETS: ${{ secrets.CI_WAREHOUSE_SECRETS || '' }}
         run: |
+          if [ -z "${CI_WAREHOUSE_SECRETS}" ]; then
+            echo "::error::Missing required secret: CI_WAREHOUSE_SECRETS"
+            exit 1
+          fi
           # The cleanup job doesn't create schemas, but generate_profiles.py
           # requires --schema-name.  Use a dummy value.
           python "${{ github.workspace }}/dbt-data-reliability/integration_tests/profiles/generate_profiles.py" \
             --template "${{ github.workspace }}/dbt-data-reliability/integration_tests/profiles/profiles.yml.j2" \
             --output ~/.dbt/profiles.yml \
             --schema-name "cleanup_placeholder"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/cleanup-stale-schemas.yml around lines 52 - 60, Add a
fail-fast guard that errors out when CI_WAREHOUSE_SECRETS is unset/empty before
calling generate_profiles.py: check the CI_WAREHOUSE_SECRETS environment
variable (set in the env: CI_WAREHOUSE_SECRETS entry) at the top of the run:
block (e.g., if [ -z "${CI_WAREHOUSE_SECRETS}" ]; then echo
"CI_WAREHOUSE_SECRETS is required"; exit 1; fi) so the workflow exits
immediately with a clear message instead of defaulting to an empty string and
causing opaque errors when running python "${{ github.workspace
}}/dbt-data-reliability/integration_tests/profiles/generate_profiles.py".
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/cleanup-stale-schemas.yml:
- Around line 71-74: Validate and sanitize the inputs.max-age-hours value before
interpolation: ensure inputs.max-age-hours contains only digits (and optionally
enforce a min/max, e.g., 1–168), coerce or fallback to '24' if invalid, and pass
the sanitized value as an environment variable (or separate shell argument) to
the dbt run-operation call (dbt run-operation elementary.drop_stale_ci_schemas)
instead of embedding the raw expression inside the quoted --args string; also
keep the matrix.warehouse-type token intact when passing -t "${{
matrix.warehouse-type }}".
- Around line 33-38: The checkout step that uses actions/checkout@v4 to grab
repository "elementary-data/dbt-data-reliability" should be pinned to an
immutable ref instead of the moving default branch; update the checkout
invocation (the step named "Checkout dbt package") to include a ref set to a
commit SHA or permanent tag (e.g., ref: '<commit-sha-or-tag>') so the workflow
always runs a known immutable revision before executing destructive macros that
drop schemas.
- Around line 45-49: The Install dbt step currently installs unpinned packages
("dbt-core" and "dbt-${{ ... }}"), which can cause breakage; update that run
line to pin both dbt-core and the chosen adapter to the tested dbt 1.8 series
(e.g. use "dbt-core==1.8.0" and "dbt-${{ (matrix.warehouse-type ==
'databricks_catalog' && 'databricks') || (matrix.warehouse-type == 'athena' &&
'athena-community') || matrix.warehouse-type }}==1.8.0") so the matrix variable
expansion still selects the correct adapter but with a fixed version.


ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 02174cb and 5447b19.

📒 Files selected for processing (1)
  • .github/workflows/cleanup-stale-schemas.yml


@devin-ai-integration devin-ai-integration bot left a comment


✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.

View in Devin Review to see 5 additional findings.

Open in Devin Review

The drop_stale_ci_schemas macro moved from the main elementary package
to the integration_tests project in dbt-data-reliability.

Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>

@coderabbitai coderabbitai bot left a comment


♻️ Duplicate comments (3)
.github/workflows/cleanup-stale-schemas.yml (3)

33-38: ⚠️ Potential issue | 🔴 Critical

Pin external checkout to an immutable ref before running schema-drop logic.

Line 36 currently tracks a moving branch; this can change runtime behavior unexpectedly for a destructive cleanup job.

🔒 Suggested fix
       - name: Checkout dbt package
         uses: actions/checkout@v4
         with:
           repository: elementary-data/dbt-data-reliability
+          ref: <immutable-tag-or-commit-sha>
           path: dbt-data-reliability
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/cleanup-stale-schemas.yml around lines 33 - 38, The
checkout step named "Checkout dbt package" currently fetches a moving branch;
update that step to pin the external repo to an immutable ref by adding a ref
input (e.g., a specific commit SHA or tag) for the actions/checkout usage so the
schema-drop logic always runs against a fixed revision; locate the "Checkout dbt
package" step in the workflow and add the ref field alongside repository and
path to ensure deterministic, non-moving behavior.

69-74: ⚠️ Potential issue | 🟠 Major

Validate max-age-hours before building --args.

Line 73 interpolates unsanitized user input directly into the command string.

🛡️ Suggested fix
       - name: Drop stale CI schemas
         working-directory: ${{ env.TESTS_DIR }}/dbt_project
-        run: >
-          dbt run-operation drop_stale_ci_schemas
-          --args '{prefixes: ["py_"], max_age_hours: ${{ inputs.max-age-hours || '24' }}}'
-          -t "${{ matrix.warehouse-type }}"
+        env:
+          MAX_AGE_HOURS: ${{ inputs.max-age-hours || '24' }}
+        run: |
+          if ! [[ "$MAX_AGE_HOURS" =~ ^[0-9]+$ ]]; then
+            echo "::error::max-age-hours must be a non-negative integer"
+            exit 1
+          fi
+          ARGS=$(printf '{"prefixes":["py_"],"max_age_hours":%s}' "$MAX_AGE_HOURS")
+          dbt run-operation drop_stale_ci_schemas \
+            --args "$ARGS" \
+            -t "${{ matrix.warehouse-type }}"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/cleanup-stale-schemas.yml around lines 69 - 74, The
workflow step "Drop stale CI schemas" builds --args using the raw input `${{
inputs.max-age-hours }}`, which allows unsanitized values into the shell
command; validate and sanitize `inputs.max-age-hours` before interpolation by
checking it is a positive integer (or falling back to a safe default like 24)
and only then build the `--args` string; locate the step named "Drop stale CI
schemas" and the interpolation of `max-age-hours` and replace it with a
validated/sanitized variable (e.g., compute a sanitized `MAX_AGE_HOURS` in a
preceding run/if/step or use an expressions-based guard) so the command only
ever receives a numeric value.

45-49: ⚠️ Potential issue | 🟠 Major

Pin dbt-core and adapter versions for workflow stability.

Lines 47-49 install floating versions, which can silently break this scheduled job when upstream releases land.

📦 Suggested fix
       - name: Install dbt
         run: >
           pip install
-          "dbt-core"
-          "dbt-${{ (matrix.warehouse-type == 'databricks_catalog' && 'databricks') || (matrix.warehouse-type == 'athena' && 'athena-community') || matrix.warehouse-type }}"
+          "dbt-core>=1.8,<1.9"
+          "dbt-${{ (matrix.warehouse-type == 'databricks_catalog' && 'databricks') || (matrix.warehouse-type == 'athena' && 'athena-community') || matrix.warehouse-type }}>=1.8,<1.9"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/cleanup-stale-schemas.yml around lines 45 - 49, The
workflow's "Install dbt" step installs floating packages ("dbt-core" and the
interpolated "dbt-${{ ... }}" adapter) which can break jobs when upstream
releases change; update the pip install invocation in that step to pin explicit
versions for dbt-core and the adapter (e.g., use pinned version strings or
variables like DBT_CORE_VERSION and DBT_ADAPTER_VERSION) instead of bare package
names so the matrix expression "dbt-${{ (matrix.warehouse-type ==
'databricks_catalog' && 'databricks') || (matrix.warehouse-type == 'athena' &&
'athena-community') || matrix.warehouse-type }}" installs a specific, pinned
adapter package; ensure the workflow exposes or documents those version
variables and use them in the run command to guarantee reproducible runs.
🧹 Nitpick comments (1)
.github/workflows/cleanup-stale-schemas.yml (1)

52-54: Fail fast when CI_WAREHOUSE_SECRETS is missing.

Using || '' hides misconfiguration until later steps fail less clearly.

✅ Suggested improvement
       - name: Write dbt profiles
         env:
           CI_WAREHOUSE_SECRETS: ${{ secrets.CI_WAREHOUSE_SECRETS || '' }}
         run: |
+          if [ -z "$CI_WAREHOUSE_SECRETS" ]; then
+            echo "::error::Missing required secret: CI_WAREHOUSE_SECRETS"
+            exit 1
+          fi
           # The cleanup job doesn't create schemas, but generate_profiles.py
           # requires --schema-name.  Use a dummy value.
           python "${{ github.workspace }}/dbt-data-reliability/integration_tests/profiles/generate_profiles.py" \
             --template "${{ github.workspace }}/dbt-data-reliability/integration_tests/profiles/profiles.yml.j2" \
             --output ~/.dbt/profiles.yml \
             --schema-name "cleanup_placeholder"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/cleanup-stale-schemas.yml around lines 52 - 54, Remove the
fallback that masks a missing secret and make the job fail fast when
CI_WAREHOUSE_SECRETS is not provided: stop using the "|| ''" default for
CI_WAREHOUSE_SECRETS and add an explicit early check in the workflow (a small
run step that tests -n "$CI_WAREHOUSE_SECRETS" or similar and exits with a clear
error message) so the run aborts immediately when CI_WAREHOUSE_SECRETS is empty
or unset.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 5447b19 and d70c86b.

📒 Files selected for processing (1)
  • .github/workflows/cleanup-stale-schemas.yml

Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
@devin-ai-integration (bot) changed the title from "feat: add weekly cleanup workflow for stale CI schemas" to "feat: add daily cleanup workflow for stale CI schemas" on Feb 28, 2026
devin-ai-integration bot and others added 2 commits February 28, 2026 20:25
- Pin dbt-core and adapter versions to >=1.8,<1.10
- Validate max-age-hours input is a non-negative integer
- Fail fast when CI_WAREHOUSE_SECRETS secret is missing

Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/cleanup-stale-schemas.yml:
- Around line 45-49: The workflow step named "Install dbt" currently installs
unpinned packages ("dbt-core" and "dbt-${{ ... }}"); change these to pinned
versions to match other workflows (e.g., use "dbt-core==1.8.*" and pin the
adapter package similarly such as "dbt-databricks==1.8.*" or
"dbt-athena-community==1.8.*" depending on matrix.warehouse-type). Update the
run command that installs "dbt-core" and "dbt-${{ (matrix.warehouse-type... )
}}" so it injects the appropriate pinned adapter package for each matrix value,
mirroring the versioning pattern used by inputs.dbt-version in
test-warehouse.yml/test-github-action.yml.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between c208087 and a495427.

📒 Files selected for processing (1)
  • .github/workflows/cleanup-stale-schemas.yml

@haritamar haritamar enabled auto-merge (squash) February 28, 2026 20:52
@haritamar haritamar merged commit 9b6dae4 into master Feb 28, 2026
16 checks passed
@haritamar haritamar deleted the devin/1772278180-cleanup-stale-ci-schemas branch February 28, 2026 20:53
