Closed as not planned
Labels
Can't Reproduce, area:providers, kind:bug, pending-response, provider:cncf-kubernetes, stale
Description
Apache Airflow Provider(s)
cncf-kubernetes
Versions of Apache Airflow Providers
apache-airflow-providers-cncf-kubernetes 8.3.1
Apache Airflow version
2.9.2
Operating System
Debian GNU/Linux 12 (bookworm)
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
What happened
If the driver pod is deleted (manually), the SparkKubernetesOperator task is marked 'success' even though the driver did not complete all the jobs of the SparkApplication.
What you think should happen instead
If the driver pod is deleted before it has finished its work, the SparkKubernetesOperator task should be marked 'failed'.
How to reproduce
Start a SparkApplication via a SparkKubernetesOperator task. Wait until the driver pod is created, then delete the driver pod while it is still doing work. The SparkApplication completes immediately (with status=COMPLETED) and the task is marked success, even though the driver did not finish its work. A minimal DAG sketch for reproducing this is shown below.
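A minimal sketch of a reproduction DAG, assuming a SparkApplication manifest at spark_pi.yaml and a spark namespace (both hypothetical placeholders):

```python
# Minimal repro sketch. "spark_pi.yaml" and the "spark" namespace are
# hypothetical; substitute your own SparkApplication manifest and namespace.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
    SparkKubernetesOperator,
)

with DAG(
    dag_id="spark_pi_repro",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Submits the SparkApplication and waits for it to complete.
    submit = SparkKubernetesOperator(
        task_id="spark_pi",
        namespace="spark",
        application_file="spark_pi.yaml",
    )
```

While the task is running, delete the driver pod, e.g. `kubectl delete pod <driver-pod-name> -n spark`. The expectation is that the task fails; instead it is marked success.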
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct