Creating a backfill via Airflow API results in internal server error if SQLite is used #66726

@Prometheus3375

Description

Under which category would you file this issue?

Airflow Core

Apache Airflow version

3.2.1

Steps to reproduce

  1. Create Dockerfile with the contents below:
FROM apache/airflow:slim-3.2.1-python3.12

ENV AIRFLOW_HOME=/opt/airflow
ENV PYTHONUTF8=1
ENV PYTHONPATH=/opt/airflow

USER airflow

EXPOSE 8080

ENTRYPOINT ["bash", "-c", "airflow db migrate && exec airflow standalone"]
  2. Create a directory dags containing test.py with the following contents:
from datetime import timedelta
from typing import Unpack

from airflow import DAG
from airflow.sdk import Context, task_group
from airflow.sdk.definitions.decorators import task

with DAG(
    dag_id="test_dag",
    schedule=timedelta(days=1),
) as dag:
    @task
    def test_task1(**context: Unpack[Context]) -> list[dict[str, int]]:
        return [{'other': 1}, {'other': 2}, {'other': 3}]

    @task_group
    def test_group(data: int, other: int) -> None:

        @task
        def t1(data: int, other: int, **context: Unpack[Context]) -> list[int]:
            return [data, other]

        t1(data, other)

    g = test_group.partial(data=2).expand_kwargs(test_task1())
  3. Build the image from the Dockerfile: docker build -t test-airflow .
  4. Run the container: docker run --name airflow-test -p 8080:8080 -v ./dags:/opt/airflow/dags -d test-airflow. Note: on Windows use -v .\dags:/opt/airflow/dags instead.
  5. Find the password for the admin user in the container logs.
  6. Once Airflow has started, open any API client.
  7. Get an auth token:
POST http://localhost:8080/auth/token
{
  "username": "admin",
  "password": "password from logs"
}
  8. Create a backfill for test_dag. Do not forget to add the Bearer token to the request headers:
POST http://localhost:8080/api/v2/backfills
{
  "dag_id": "test_dag",
  "from_date": "2025-03-01",
  "to_date": "2025-04-01",
  "run_backwards": false,
  "dag_run_conf": {},
  "reprocess_behavior": "none",
  "max_active_runs": 3
}

Expected result

The request from step 8 returns JSON with info about the created backfill.

Actual result

The request from step 8 returns HTTP status code 500 (Internal Server Error).
However, the backfill is successfully created, as GET http://localhost:8080/api/v2/backfills?dag_id=test_dag shows.

Container logs about the error:

api-server | 2026-05-11T17:31:18.318524Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 1, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 1, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-01T00:00:00+00:00
api-server | 2026-05-11T17:31:18.419695Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 1, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 1, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 1, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:18.421520Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 2, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 2, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-02T00:00:00+00:00
api-server | 2026-05-11T17:31:18.497663Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 2, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 2, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 2, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:18.499467Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 3, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 3, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-03T00:00:00+00:00
api-server | 2026-05-11T17:31:18.661214Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 3, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 3, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 3, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:18.663197Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 4, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 4, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-04T00:00:00+00:00
api-server | 2026-05-11T17:31:19.260954Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 4, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 4, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 4, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:19.262727Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 5, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 5, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-05T00:00:00+00:00
api-server | 2026-05-11T17:31:19.962730Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 5, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 5, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 5, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:19.964579Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 6, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 6, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-06T00:00:00+00:00
api-server | 2026-05-11T17:31:20.098439Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 6, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 6, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 6, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:20.100228Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 7, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 7, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-07T00:00:00+00:00
api-server | 2026-05-11T17:31:20.238051Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 7, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 7, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 7, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:20.239764Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 8, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 8, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-08T00:00:00+00:00
api-server | 2026-05-11T17:31:20.311646Z [info     ] Created backfill Dag run.      [airflow.models.backfill] backfill_id=1 dag_id=test_dag info=DagRunInfo(run_after=DateTime(2025, 3, 8, 0, 0, 0, tzinfo=Timezone('UTC')), data_interval=DataInterval(start=DateTime(2025, 3, 8, 0, 0, 0, tzinfo=Timezone('UTC')), end=DateTime(2025, 3, 8, 0, 0, 0, tzinfo=Timezone('UTC'))), partition_date=None, partition_key=None) loc=backfill.py:690
api-server | 2026-05-11T17:31:20.313395Z [info     ] creating dag run               [airflow.serialization.definitions.dag] loc=dag.py:526 logical_date=DateTime(2025, 3, 9, 0, 0, 0, tzinfo=Timezone('UTC')) partition_key=None run_after=DateTime(2025, 3, 9, 0, 0, 0, tzinfo=Timezone('UTC')) run_id=backfill__2025-03-09T00:00:00+00:00
api-server | 2026-05-11T17:31:20.318924Z [info     ] request finished               [http.access] client_addr=172.17.0.1:37540 duration_us=2576580 loc=http_access_log.py:98 method=POST path=/api/v2/backfills query= status_code=500
api-server | 2026-05-11T17:31:20.319456Z [error    ] Exception in ASGI application
api-server | [uvicorn.error] loc=httptools_impl.py:425
api-server | Traceback (most recent call last):
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
api-server | self.dialect.do_execute(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
api-server | cursor.execute(statement, parameters)
api-server | sqlite3.OperationalError: database is locked
api-server | The above exception was the direct cause of the following exception:
api-server | Traceback (most recent call last):
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 420, in run_asgi
api-server | result = await app(  # type: ignore[func-returns-value]
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/applications.py", line 1159, in __call__
api-server | await super().__call__(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/applications.py", line 107, in __call__
api-server | await self.middleware_stack(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
api-server | raise exc
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
api-server | await self.app(scope, receive, _send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/api_fastapi/common/http_access_log.py", line 83, in __call__
api-server | await self.app(scope, receive, capture_send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/gzip.py", line 29, in __call__
api-server | await responder(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/gzip.py", line 130, in __call__
api-server | await super().__call__(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/gzip.py", line 46, in __call__
api-server | await self.app(scope, receive, self.send_with_compression)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/base.py", line 191, in __call__
api-server | with recv_stream, send_stream, collapse_excgroups():
api-server | ^^^^^^^^^^^^^^^^^^^^
api-server | File "/usr/python/lib/python3.12/contextlib.py", line 158, in __exit__
api-server | self.gen.throw(value)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
api-server | raise exc
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/base.py", line 193, in __call__
api-server | response = await self.dispatch_func(request, call_next)
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/api_fastapi/auth/middlewares/refresh_token.py", line 61, in dispatch
api-server | response = await call_next(request)
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/base.py", line 168, in call_next
api-server | raise app_exc from app_exc.__cause__ or app_exc.__context__
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/base.py", line 144, in coro
api-server | await self.app(scope, receive_or_disconnect, send_no_error)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 63, in __call__
api-server | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
api-server | raise exc
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
api-server | await app(scope, receive, sender)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
api-server | await self.app(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/routing.py", line 716, in __call__
api-server | await self.middleware_stack(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/routing.py", line 736, in app
api-server | await route.handle(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/routing.py", line 290, in handle
api-server | await self.app(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/routing.py", line 134, in app
api-server | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
api-server | raise exc
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
api-server | await app(scope, receive, sender)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/routing.py", line 120, in app
api-server | response = await f(request)
api-server | ^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/routing.py", line 674, in app
api-server | raw_response = await run_endpoint_function(
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/routing.py", line 330, in run_endpoint_function
api-server | return await run_in_threadpool(dependant.call, **values)
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/starlette/concurrency.py", line 32, in run_in_threadpool
api-server | return await anyio.to_thread.run_sync(func)
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/anyio/to_thread.py", line 63, in run_sync
api-server | return await get_async_backend().run_sync_in_worker_thread(
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 2518, in run_sync_in_worker_thread
api-server | return await future
api-server | ^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 1002, in run
api-server | result = context.run(func, *args)
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/api_fastapi/core_api/routes/public/backfills.py", line 233, in create_backfill
api-server | backfill_obj = _create_backfill(
api-server | ^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/backfill.py", line 627, in _create_backfill
api-server | _create_runs_non_partitioned(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/backfill.py", line 679, in _create_runs_non_partitioned
api-server | _create_backfill_dag_run_non_partitioned(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/backfill.py", line 377, in _create_backfill_dag_run_non_partitioned
api-server | dr = dag.create_dagrun(
api-server | ^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/session.py", line 98, in wrapper
api-server | return func(*args, **kwargs)
api-server | ^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/definitions/dag.py", line 572, in create_dagrun
api-server | orm_dagrun = _create_orm_dagrun(
api-server | ^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/session.py", line 98, in wrapper
api-server | return func(*args, **kwargs)
api-server | ^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/definitions/dag.py", line 1196, in _create_orm_dagrun
api-server | session.flush()
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4331, in flush
api-server | self._flush(objects)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4466, in _flush
api-server | with util.safe_reraise():
api-server | ^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py", line 121, in __exit__
api-server | raise exc_value.with_traceback(exc_tb)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4427, in _flush
api-server | flush_context.execute()
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/unitofwork.py", line 466, in execute
api-server | rec.execute(self)
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/unitofwork.py", line 642, in execute
api-server | util.preloaded.orm_persistence.save_obj(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/persistence.py", line 93, in save_obj
api-server | _emit_insert_statements(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/persistence.py", line 1233, in _emit_insert_statements
api-server | result = connection.execute(
api-server | ^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
api-server | return meth(
api-server | ^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 527, in _execute_on_connection
api-server | return connection._execute_clauseelement(
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1641, in _execute_clauseelement
api-server | ret = self._execute_context(
api-server | ^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
api-server | return self._exec_single_context(
api-server | ^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
api-server | self._handle_dbapi_exception(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2363, in _handle_dbapi_exception
api-server | raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
api-server | self.dialect.do_execute(
api-server | File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
api-server | cursor.execute(statement, parameters)
api-server | sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) database is locked
api-server | [SQL: INSERT INTO dag_run (dag_id, queued_at, logical_date, start_date, end_date, state, run_id, creating_job_id, run_type, triggered_by, triggering_user_name, conf, data_interval_start, data_interval_end, run_after, last_scheduling_decision, log_template_id, created_at, updated_at, clear_number, backfill_id, bundle_version, scheduled_by_job_id, context_carrier, created_dag_version_id, partition_key, partition_date) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) RETURNING id, span_status]
api-server | [parameters: ('test_dag', '2026-05-11 17:31:20.315260', '2025-03-09 00:00:00.000000', None, None, <DagRunState.QUEUED: 'queued'>, 'backfill__2025-03-09T00:00:00+00:00', None, <DagRunType.BACKFILL_JOB: 'backfill'>, 'BACKFILL', 'admin', '{}', '2025-03-09 00:00:00.000000', '2025-03-09 00:00:00.000000', '2025-03-09 00:00:00.000000', None, 2, '2026-05-11 17:31:20.316700', '2026-05-11 17:31:20.316707', 0, 1, None, None, '{"__var": {"traceparent": "00-50eac562494f9c159f72cbe7bba1e998-f2fd18493b34ae19-01"}, "__type": "dict"}', '019e1814a7a5788fbd9faf5352fc16cb', None, None)]
api-server | (Background on this error at: https://sqlalche.me/e/20/e3q8)

In short, the API fails to return the JSON response to the user, even though the backfill itself is created.
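For context, "database is locked" is SQLite's generic error when one connection attempts a write while another already holds the write lock. A minimal standalone illustration, independent of Airflow's internals:

```python
# Two connections to one SQLite database file; the second tries to start a
# write transaction while the first still holds the write lock.
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

# isolation_level=None: autocommit, so transactions are controlled explicitly.
# timeout=0: fail immediately instead of retrying until a busy timeout expires.
writer = sqlite3.connect(path, timeout=0, isolation_level=None)
other = sqlite3.connect(path, timeout=0, isolation_level=None)

writer.execute("CREATE TABLE t (x INTEGER)")

# The first connection takes the write lock and keeps its transaction open.
writer.execute("BEGIN IMMEDIATE")
writer.execute("INSERT INTO t VALUES (1)")

# A concurrent write attempt now fails the same way as the INSERT in the
# traceback above.
try:
    other.execute("BEGIN IMMEDIATE")
except sqlite3.OperationalError as exc:
    print(exc)  # database is locked
```

Raising the busy timeout only delays the error under concurrent writes, since SQLite serializes all writers; this is why SQLite is generally suitable only for local experimentation with Airflow.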

Operating System

Windows 10

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

    Labels

    area:API (Airflow's REST/HTTP API), area:backfill (specifically for backfill related), area:core, kind:bug (this is clearly a bug), needs-triage (label for new issues that we didn't triage yet)
