
KubernetesPodOperator namespace argument conflict when using pod_template_file #10037

@SilvrDuck

Description

Apache Airflow version: 1.10.10

Kubernetes version (if you are using kubernetes) (use kubectl version): v1.16.11-gke.5

Environment:

  • Cloud provider or hardware configuration: GKE
  • OS (e.g. from /etc/os-release): –
  • Kernel (e.g. uname -a): –
  • Install tools: helm
  • Others: –

What happened:

I tried to create a job in my DAG using the KubernetesPodOperator, configuring it with a .yaml file (via the pod_template_file argument).

When specifying the namespace argument for the KubernetesPodOperator, I got the following error:

[2020-07-28 13:27:02,689] {taskinstance.py:1150} ERROR - Pod Launching failed: Cannot configure pod and pass either `pod` or `pod_template_file`. Fields ['namespace'] passed.
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 291, in execute
    final_state, _, result = self.create_new_pod_for_operator(labels, launcher)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 338, in create_new_pod_for_operator
    pod = pod_generator.PodGenerator(
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/kubernetes/pod_generator.py", line 205, in __init__
    self.validate_pod_generator_args(locals())
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/kubernetes/pod_generator.py", line 553, in validate_pod_generator_args
    raise AirflowConfigException(
airflow.exceptions.AirflowConfigException: Cannot configure pod and pass either `pod` or `pod_template_file`. Fields ['namespace'] passed.

When not specifying the argument, on the other hand, I got the following error:

[2020-07-28 13:13:24,938] {taskinstance.py:1150} ERROR - Missing the required parameter `namespace` when calling `list_namespaced_pod`
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 271, in execute
    pod_list = client.list_namespaced_pod(self.namespace, label_selector=label_selector)
  File "/home/airflow/.local/lib/python3.8/site-packages/kubernetes/client/api/core_v1_api.py", line 12803, in list_namespaced_pod
    (data) = self.list_namespaced_pod_with_http_info(namespace, **kwargs)  # noqa: E501
  File "/home/airflow/.local/lib/python3.8/site-packages/kubernetes/client/api/core_v1_api.py", line 12850, in list_namespaced_pod_with_http_info
    raise ValueError("Missing the required parameter `namespace` when calling `list_namespaced_pod`")  # noqa: E501
ValueError: Missing the required parameter `namespace` when calling `list_namespaced_pod`

(Note that this error comes from the Kubernetes Python client, not from Airflow itself.)

Hence the two messages contradict each other: the namespace argument is rejected when I pass it, yet required when I omit it.
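To make the conflict concrete, here is a rough, standalone sketch of the two checks as I read them from the tracebacks above. It only paraphrases the error paths; it is not the actual Airflow or Kubernetes client source.

# Illustrative paraphrase of the two conflicting checks, based only on the
# tracebacks above; the function bodies are sketches, not real library code.

class AirflowConfigException(Exception):
    pass


def validate_pod_generator_args(namespace=None, pod_template_file=None):
    # Path 1 (airflow/kubernetes/pod_generator.py): extra fields such as
    # `namespace` are rejected once `pod` or `pod_template_file` is supplied.
    if pod_template_file is not None and namespace is not None:
        raise AirflowConfigException(
            "Cannot configure pod and pass either `pod` or `pod_template_file`. "
            "Fields ['namespace'] passed."
        )


def list_namespaced_pod(namespace=None, **kwargs):
    # Path 2 (kubernetes/client/api/core_v1_api.py): the operator later calls
    # client.list_namespaced_pod(self.namespace, ...), which requires a namespace.
    if namespace is None:
        raise ValueError(
            "Missing the required parameter `namespace` when calling `list_namespaced_pod`"
        )


# Passing the argument trips path 1; omitting it trips path 2.
try:
    validate_pod_generator_args(namespace="test", pod_template_file="minimal.yaml")
except AirflowConfigException as err:
    print(err)

try:
    list_namespaced_pod()
except ValueError as err:
    print(err)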

What you expected to happen:

I’d expect the job to be executed on the cluster, with the namespace specified in the .yaml file or as a KubernetesPodOperator argument (either would be fine if it worked); see the two variants sketched below.
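For illustration, these are the two configurations I would consider acceptable, placed inside the same with DAG(...) block and using the same imports and minimal.yaml as in the reproduction below (the task_ids here are just hypothetical labels). Neither works today:

# Variant A: the namespace comes only from the template file.
# Currently fails with "Missing the required parameter `namespace`".
KubernetesPodOperator(
    task_id="bug-report-task-a",
    in_cluster=True,
    get_logs=True,
    pod_template_file="path/to/dags/minimal.yaml",  # contains metadata.namespace: test
)

# Variant B: the namespace is passed as an operator argument.
# Currently fails with "Cannot configure pod and pass either `pod` or `pod_template_file`".
KubernetesPodOperator(
    task_id="bug-report-task-b",
    in_cluster=True,
    get_logs=True,
    namespace="test",
    pod_template_file="path/to/dags/minimal.yaml",
)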

How to reproduce it:

I tried to create a minimal example to show what I’m talking about.

I have minimal.py:

from datetime import datetime, timedelta
from airflow.models import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator


default_args = {
    "owner": "test",
    "start_date": datetime(2019, 10, 17),
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG("bug-report", schedule_interval=None, default_args=default_args) as dag:

    KubernetesPodOperator(
        task_id="bug-report-task",
        is_delete_operator_pod=False,
        in_cluster=True,
        get_logs=True,
        namespace="test",  # this is the part causing troubles
        pod_template_file="path/to/dags/minimal.yaml",
    )

And minimal.yaml, which is an example Job taken from the Kubernetes documentation:

apiVersion: batch/v1
kind: Job
metadata:
  namespace: test
  name: bug-report
spec:
  template:
    spec:
      containers:
      - name: pi
        image: perl
        command: ["perl",  "-Mbignum=bpi", "-wle", "print bpi(2000)"]
      restartPolicy: Never
  backoffLimit: 4

Triggering the DAG, I get the aforementioned errors.

Is that an actual bug or is there something I’m missing?

Best,
Thibault

Metadata

Labels:

  • kind:bug (This is clearly a bug)
  • priority:critical (Showstopper bug that should be patched immediately)
  • provider:cncf-kubernetes (Kubernetes (k8s) provider related issues)
