Under which category would you file this issue?
Task SDK
Apache Airflow version
3.2.1
What happened and how to reproduce it?
I'm running Airflow 3.2.1 on EKS with multi-team mode enabled, and using the EksPodOperator to execute DAGs on remote clusters. To do this, I have configured aws_default connections using AIRFLOW_CONN__<TEAM>___<CONN_ID> (e.g. AIRFLOW_CONN__TEAM_A___AWS_DEFAULT). However, the EksPodOperator is failing.
I think the issue lies in airflow.sdk.execution_time.context. In _get_connection, SecretCache.get_connection_uri(conn_id) supports passing team_name, but that value is not supplied when the connection info is retrieved from the environment.
Steps to Reproduce:
- Enable multi-team mode (core.multi_team = true)
- Set a team-scoped connection via env var:
AIRFLOW_CONN__TEAM_A___AWS_DEFAULT='{"conn_type":"aws","extra":{"role_arn":"..."}}'
- Assign a DAG bundle to team_a
- In a DAG, use an operator that looks up aws_default (e.g. EksPodOperator, which defaults to aws_conn_id="aws_default")
- Observe that the team-scoped env var is ignored
What you think should happen instead?
AIRFLOW_CONN__ values should be team-scoped, with team-scoped values (possibly) taking priority over global values.
Operating System
Debian GNU/Linux 12 (bookworm)
Deployment
Official Apache Airflow Helm Chart
Apache Airflow Provider(s)
No response
Versions of Apache Airflow Providers
apache-airflow==3.2.1
apache-airflow-providers-amazon==9.25.0
apache-airflow-providers-cncf-kubernetes==10.16.0
apache-airflow-providers-fab==3.6.1
Official Helm Chart version
1.21.0 (latest released)
Kubernetes Version
1.34
Helm Chart configuration
No response
Docker Image customizations
❯ cat requirements-frozen.txt
# This file was autogenerated by uv via the following command:
# uv pip compile pyproject.toml --no-deps --no-annotate
apache-airflow==3.2.1
apache-airflow-providers-amazon==9.25.0
apache-airflow-providers-cncf-kubernetes==10.16.0
apache-airflow-providers-fab==3.6.1
boto3==1.42.94
flask-appbuilder==5.2.1
mypy-boto3-dynamodb==1.42.73
- Okta integration (mostly following this) in $AIRFLOW_HOME/webserver_config.py
- Custom auth manager using FabAuthManager
Anything else?
No response
Are you willing to submit PR?
Code of Conduct