Airflow ECS Register Task Operator, error when passing env. variables #29504
-
It's a problem with the boto3 client: this parameter is not supported by your old boto3 version. Can you check which version you are using? Same for the Airflow AWS provider.
-
I have now solved it. If one looks at the boto3 API documentation, the environment variables have to be passed inside the containerDefinitions section, not at the top level. However, I now get a different error:

    AWS Connection (conn_id='aws-fargate', conn_type='aws') credentials retrieved from login and password.
    INFO - No connection ID provided. Fallback on boto3 credential strategy (region_name=None)
    botocore.exceptions.NoRegionError: You must specify a region.

I tried solving it, but so far nothing has worked: I have specified a connection_id and also a region inside the connection section in Airflow's web UI. Can someone maybe explain where this one comes from?
-
Hello, when I try to pass environment variables to my ECSRegisterTaskOperator, I get the following error:
Unknown parameter in input: "environment", must be one of: family, taskRoleArn, executionRoleArn, networkMode, containerDefinitions, volumes, placementConstraints, requiresCompatibilities, cpu, memory, tags, pidMode, ipcMode
However, according to the boto3 documentation for ECS, one has to pass environment variables via the environment dictionary.
(reference: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.register_task_definition)
The relevant part of my task looks like this:
register_task_kwargs={
    "cpu": "256",
    "memory": "512",
    "networkMode": "awsvpc",
    "environment": [
        {
            "name": "REDSHIFT_HOST",
            "value": redshift_cluster_test_connection.host,
        },
        {
            "name": "REDSHIFT_USER",
            "value": redshift_cluster_test_connection.user,
        },
        {
            "name": "REDSHIFT_PW",
            "value": redshift_cluster_test_connection.password,
        },
    ],
}