Apache Airflow version
2.2.1 (latest released)
Operating System
Linux
Versions of Apache Airflow Providers
apache_airflow_providers_amazon-1.4.0-py3-none-any
apache_airflow_providers_ftp-2.0.1-py3-none-any
apache_airflow_providers_http-2.0.1-py3-none-any
apache_airflow_providers_imap-2.0.1-py3-none-any
apache_airflow_providers_postgres-1.0.2-py3-none-any
apache_airflow_providers_slack-4.1.0-py3-none-any
apache_airflow_providers_sqlite-2.0.1-py3-none-any
Deployment
Other Docker-based deployment
Deployment details
No response
What happened
We've been running Airflow 2.1.2 with the CeleryExecutor without issue. After upgrading to 2.2.1, workers started failing with:
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/__main__.py", line 48, in main
args.func(args)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/celery_command.py", line 188, in worker
_run_worker(options=options, skip_serve_logs=skip_serve_logs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/celery_command.py", line 94, in _run_worker
celery_app.worker_main(options)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/base.py", line 365, in worker_main
return instantiate(
File "/home/airflow/.local/lib/python3.8/site-packages/celery/bin/base.py", line 283, in execute_from_commandline
self.maybe_patch_concurrency(argv)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/bin/base.py", line 315, in maybe_patch_concurrency
maybe_patch_concurrency(argv, *pool_option)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/__init__.py", line 143, in maybe_patch_concurrency
pool = _find_option_with_arg(argv, short_opts, long_opts)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/__init__.py", line 95, in _find_option_with_arg
if arg.startswith('-'):
AttributeError: 'int' object has no attribute 'startswith'
What you expected to happen
Workers to be able to function as they did before.
How to reproduce
Changed the dependency to apache-airflow 2.2.1 and set the container entry point to run airflow db upgrade. No other functional changes besides swapping out dag_concurrency for max_active_tasks_per_dag in the config.
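For reference, the config rename amounts to the following airflow.cfg fragment (sketch; section and option names per the Airflow 2.2 release notes, value shown is illustrative):

```ini
[core]
# Airflow 2.2 renamed dag_concurrency to max_active_tasks_per_dag
max_active_tasks_per_dag = 16
```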
Anything else
No response
Are you willing to submit PR?
Code of Conduct