Airflow 1.10.10 + DAG SERIALIZATION = fails to start manually the DAG's operators #10155
Comments
Interesting one!
Can you share your DAG please, @ozw1z5rd?
And I would also strongly suggest upgrading to Python 3.
You must enable DAG serialisation to replicate my issue; without serialisation, there is no issue on the company's system. These are my settings (from the pilot installation):
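The reporter's exact settings did not survive in this copy of the thread. For context, on Airflow 1.10.10 DAG serialisation is typically switched on in `airflow.cfg` with options like the following (a sketch of the 1.10.x option names, not the reporter's actual values):

```ini
[core]
# Serialise DAGs into the metadata database; the webserver then reads
# DAGs from the DB instead of parsing the DAG files itself.
store_serialized_dags = True
# Minimum number of seconds between re-serialisations of a DAG.
min_serialized_dag_update_interval = 30
```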
Any DAG is affected; my tests were on this specific one:
I have to say that after the database migration I changed the database a bit:
Before these changes (mostly the one on rendered_task_instance_fields) I was unable to manually trigger the same task twice and get both executions completed without errors. One completed; the other was unable to insert into rendered_task_instance_fields:
After the change on execution_time everything worked fine.
Definitely, Python 3 is one of the best choices you can make now, @ozw1z5rd!
Yes, I agree. However, we need to convert our customisation code to Python 3, so for the next few months, whether we like it or not, Python 2.7 will stay with us.
I got this error too, and fixed it:
This has been fixed in 1.10.11: #8775
Apache Airflow version: 1.10.10
Kubernetes version (if you are using kubernetes) (use `kubectl version`):
Environment:

```
NAME="CentOS Linux"
VERSION="7 (Core)"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="7"
PRETTY_NAME="CentOS Linux 7 (Core)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:7"
HOME_URL="https://www.centos.org/"
BUG_REPORT_URL="https://bugs.centos.org/"
CENTOS_MANTISBT_PROJECT="CentOS-7"
CENTOS_MANTISBT_PROJECT_VERSION="7"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="7"
```

Kernel (e.g. `uname -a`): `Linux mid1-t029nifi-1 3.10.0-327.28.3.el7.x86_64 #1 SMP Thu Aug 18 19:05:49 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux`
Install tools: pip, yum
Others:
What happened:
When DAG serialisation is active and I start an operator manually, the first one works fine, but the next one fails with this error:
Could not queue task instance for execution, dependencies not met: Trigger Rule: Task's trigger rule 'all_success' requires all upstream tasks to have succeeded, but found 1 non-success(es). upstream_tasks_state={'skipped': Decimal('0'), 'successes': Decimal('0'), 'failed': Decimal('0'), 'upstream_failed': Decimal('0'), 'done': 0L, 'total': 1}, upstream_task_ids=set([u'query']
Setting DAG serialisation to false, the problem does not arise.
Please note: the scheduler works fine.
What you expected to happen:
I expected to be able to start all of the DAG's tasks manually, from the first to the last.
The code is not able to correctly find the status of the task that precedes the one I'm restarting.
If I start the first operator, everything works fine.
You can reproduce it following these steps, with two chained tasks:

```
op1 >> op2
```
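The error above shows the "all_success" trigger rule rejecting `op2` because its one upstream task (`query`) is counted as non-successful, even though it actually succeeded. This is not Airflow's implementation, but a minimal pure-Python stand-in for that dependency check makes the reported state easy to read (the `Decimal` values appear because MySQL aggregate queries return `Decimal`):

```python
from decimal import Decimal

def all_success_met(upstream_states):
    """Stand-in for the 'all_success' trigger rule: True only when
    every upstream task instance finished in the success state."""
    total = int(upstream_states["total"])
    successes = int(upstream_states["successes"])
    return successes == total

# The upstream_tasks_state reported in the error message: one upstream
# task, zero successes, so the downstream task is refused queuing.
reported = {
    "skipped": Decimal("0"),
    "successes": Decimal("0"),
    "failed": Decimal("0"),
    "upstream_failed": Decimal("0"),
    "done": 0,
    "total": 1,
}
print(all_success_met(reported))  # False: op2 is not queued
```

With serialisation enabled, the dependency check apparently fails to see the real state of the upstream task, so it reports zero successes and blocks the manual run.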
Anything else we need to know:
This happens every time.
MySQL 5.7.x, Python 2.7