
Upgrading from version 2.1.4 to 2.2.0 fails during mysql db upgrade with error Can't DROP 'dag_id' #18942

@hossein-jazayeri

Description

Apache Airflow version

2.2.0 (latest released)

Operating System

Debian GNU/Linux 10 (buster)

Versions of Apache Airflow Providers

apache-airflow-providers-celery==2.1.0
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.3.0
apache-airflow-providers-sqlite==2.0.1

Deployment

Docker-Compose

Deployment details

docker-compose file:

version: "2"

services:
  airflow-webserver:
    build: .
    image: airflow
    command: airflow webserver
    ports:
      - "8080:8080"

  airflow-scheduler:
    image: airflow
    command: airflow scheduler

  airflow-flower:
    image: airflow
    command: airflow celery flower
    ports:
      - "5555:5555"
    depends_on:
      - airflow-celery
      - airflow-webserver
      - airflow-scheduler
      - airflow-worker
      - airflow-broker

  airflow-worker:
    image: airflow
    command: airflow celery worker

  airflow-celery:
    image: mysql:8.0.19
    environment:
      MYSQL_PASSWORD: ...
      MYSQL_USER: ...
      MYSQL_DATABASE: airflow
      MYSQL_HOST: airflow-celery

  airflow-broker:
    image: redis:5.0.7-alpine

volumes:
  dbdata:

Dockerfile:

FROM python:3.8

COPY requirements.txt .
RUN pip install -U pip
RUN pip install -r requirements.txt

requirements.txt:

apache-airflow[celery,postgres,slack,docker,redis,mysql,http]==2.2.0
kombu==4.6.10
python-dotenv
psycopg2-binary
...

What happened

After updating the requirements.txt file to use Airflow 2.2.0 instead of 2.1.4, I ran:

~/airflow $ docker-compose build --no-cache
~/airflow $ docker-compose up -d --force-recreate
~/airflow $ docker exec -it airflow_airflow-webserver_1 airflow db upgrade

This throws the following exception:

DB: mysql://airflow:***@airflow-celery/airflow
[2021-10-13 12:22:57,699] {db.py:823} INFO - Creating tables
INFO  [alembic.runtime.migration] Context impl MySQLImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade 142555e44c17 -> 7b2661a43ba3, TaskInstance keyed to DagRun
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/connections.py", line 259, in query
    _mysql.connection.query(self, query)
MySQLdb._exceptions.OperationalError: (1091, "Can't DROP 'dag_id'; check that column/key exists")

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/db_command.py", line 48, in upgradedb
    db.upgradedb()
  File "/usr/local/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
    return func(*args, session=session, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/utils/db.py", line 824, in upgradedb
    command.upgrade(config, 'heads')
  File "/usr/local/lib/python3.8/site-packages/alembic/command.py", line 320, in upgrade
    script.run_env()
  File "/usr/local/lib/python3.8/site-packages/alembic/script/base.py", line 563, in run_env
    util.load_python_file(self.dir, "env.py")
  File "/usr/local/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 92, in load_python_file
    module = load_module_py(module_id, path)
  File "/usr/local/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 108, in load_module_py
    spec.loader.exec_module(module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 848, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/usr/local/lib/python3.8/site-packages/airflow/migrations/env.py", line 107, in <module>
    run_migrations_online()
  File "/usr/local/lib/python3.8/site-packages/airflow/migrations/env.py", line 101, in run_migrations_online
    context.run_migrations()
  File "<string>", line 8, in run_migrations
  File "/usr/local/lib/python3.8/site-packages/alembic/runtime/environment.py", line 851, in run_migrations
    self.get_context().run_migrations(**kw)
  File "/usr/local/lib/python3.8/site-packages/alembic/runtime/migration.py", line 620, in run_migrations
    step.migration_fn(**kw)
  File "/usr/local/lib/python3.8/site-packages/airflow/migrations/versions/7b2661a43ba3_taskinstance_keyed_to_dagrun.py", line 140, in upgrade
    batch_op.create_unique_constraint('dag_run_dag_id_run_id_key', ['dag_id', 'run_id'])
  File "/usr/local/lib/python3.8/contextlib.py", line 120, in __exit__
    next(self.gen)
  File "/usr/local/lib/python3.8/site-packages/alembic/operations/base.py", line 374, in batch_alter_table
    impl.flush()
  File "/usr/local/lib/python3.8/site-packages/alembic/operations/batch.py", line 107, in flush
    fn(*arg, **kw)
  File "/usr/local/lib/python3.8/site-packages/alembic/ddl/mysql.py", line 150, in drop_constraint
    super(MySQLImpl, self).drop_constraint(const)
  File "/usr/local/lib/python3.8/site-packages/alembic/ddl/impl.py", line 340, in drop_constraint
    self._exec(schema.DropConstraint(const))
  File "/usr/local/lib/python3.8/site-packages/alembic/ddl/impl.py", line 197, in _exec
    return conn.execute(construct, multiparams)
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/sql/ddl.py", line 72, in _execute_on_connection
    return connection._execute_ddl(self, multiparams, params)
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1068, in _execute_ddl
    ret = self._execute_context(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
    self._handle_dbapi_exception(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
    util.raise_(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/connections.py", line 259, in query
    _mysql.connection.query(self, query)
sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) (1091, "Can't DROP 'dag_id'; check that column/key exists")
[SQL: ALTER TABLE dag_run DROP INDEX dag_id]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
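
The failing statement comes from the 7b2661a43ba3 ("TaskInstance keyed to DagRun") migration, which rewrites the unique index on dag_run inside a batch_alter_table block. Below is a minimal sketch of that pattern, not the actual Airflow migration file; the constraint name 'dag_id' is an assumption taken from the ALTER TABLE dag_run DROP INDEX dag_id statement in the traceback. MySQL raises error 1091 whenever no index with exactly that name exists on the table.

# Simplified sketch of the failing migration pattern (NOT the actual
# airflow/migrations/versions/7b2661a43ba3_taskinstance_keyed_to_dagrun.py):
# drop the old unique index on dag_run, then recreate it under a new name.
from alembic import op

def upgrade():
    with op.batch_alter_table('dag_run') as batch_op:
        # Assumes the existing unique index is literally named 'dag_id'; if the
        # database has it under a different name (e.g. left over from an older
        # schema), the generated DROP INDEX dag_id fails with error 1091.
        batch_op.drop_constraint('dag_id', type_='unique')
        batch_op.create_unique_constraint('dag_run_dag_id_run_id_key', ['dag_id', 'run_id'])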

Trying to drop the index manually gives the same error:

~/airflow $ docker exec -it airflow_airflow-celery_1 mysql
mysql> use airflow;
mysql> ALTER TABLE airflow.dag_run DROP INDEX dag_id;
ERROR 1091 (42000): Can't DROP 'dag_id'; check that column/key exists
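
One way to see which index names actually exist on dag_run (and compare them with the 'dag_id' name the migration tries to drop) is SHOW INDEX FROM airflow.dag_run in the mysql shell, or programmatically with SQLAlchemy's inspector. A minimal sketch, with placeholder credentials, assuming the same connection URL Airflow uses:

# List the indexes that actually exist on dag_run; the connection URL and
# password below are placeholders for the values used in the deployment.
from sqlalchemy import create_engine, inspect

engine = create_engine("mysql://airflow:PASSWORD@airflow-celery/airflow")
for idx in inspect(engine).get_indexes("dag_run"):
    print(idx["name"], idx["column_names"], "unique" if idx["unique"] else "non-unique")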

What you expected to happen

airflow db upgrade should complete without failing.

How to reproduce

  • Copy the provided docker-compose.yml, Dockerfile, and requirements.txt, starting with Airflow 2.1.4
  • Initialize the database
  • Build the Docker containers
  • All services should be up and running
  • Update requirements.txt to use Airflow 2.2.0
  • Build the Docker containers again
  • Run the airflow db upgrade command
  • The error appears in stdout and the worker service fails to run

Anything else

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

Labels

duplicate