Nested templated variables do not always render #13559
This also seems to be an issue in 2.0.
I don't know if this is a bug or by design, but there are many votes for this at https://stackoverflow.com/questions/44855949/make-custom-airflow-macros-expand-other-macros
I'm afraid this is not simple to fix. The main problem I see is that template fields are rendered in the order they are defined in `template_fields`.
One possibility here is using the previous state of each attribute and re-rendering each templated field until all fields are stable.
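The fixed-point approach suggested above can be sketched without Airflow. This is a toy substitution function, not real Jinja, and the field names are illustrative; the point is that repeated passes converge regardless of the order fields are declared in:

```python
# Toy sketch of fixed-point template rendering: keep re-rendering every
# templated field until a full pass produces no change. This is NOT
# Airflow's implementation; substitution here is plain string replacement.

def render_once(value: str, context: dict) -> str:
    """Single-pass '{{ key }}' substitution (toy stand-in for Jinja)."""
    for key, repl in context.items():
        value = value.replace("{{ " + key + " }}", str(repl))
    return value

def render_until_stable(fields: dict, context: dict, max_rounds: int = 10) -> dict:
    """Re-render all fields until stable, so declaration order stops mattering."""
    for _ in range(max_rounds):
        changed = False
        for name, value in fields.items():
            # Fields may reference each other, so merge them into the context.
            new = render_once(value, {**context, **fields})
            if new != value:
                fields[name] = new
                changed = True
        if not changed:
            return fields
    raise RuntimeError("fields did not stabilize (possible circular reference)")

# 'op_args' references 'kwargs', which references the 'ds' macro.
fields = {"op_args": "{{ kwargs }}", "kwargs": "{{ ds }}"}
print(render_until_stable(fields, {"ds": "2021-01-08"}))
# {'op_args': '2021-01-08', 'kwargs': '2021-01-08'}
```

The `max_rounds` cap is the simplest guard against the circular-reference problem raised later in this thread: a cycle never stabilizes, so bounded iteration turns it into an explicit error instead of an infinite loop.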
I agree with @turbaszek on the potential complexity of "fixing" this. What I'm thinking is: if we can offer a different method (other than templating) to achieve the same functionality, we don't have to bother with it. Correct me if I'm wrong, please.
I was having a similar problem: I wanted to pass a templated argument to a function, but it kept being passed as the raw template string instead of the rendered value. The solution is to evaluate the macro inside a PythonOperator and pass the data on via XComs.
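The workaround described above can be sketched without a live Airflow install by simulating the XCom backend with a plain dict. All names here are illustrative stand-ins, not Airflow's actual API:

```python
# Toy sketch of the PythonOperator + XCom workaround: render the macro
# explicitly in Python code, then hand the *already-rendered* value to the
# consumer, instead of relying on a template nested inside another template.
# 'xcom_store' is a stand-in for Airflow's XCom backend, not a real API.

xcom_store = {}

def render_once(template: str, context: dict) -> str:
    """Minimal '{{ key }}' substitution (toy stand-in for Jinja)."""
    for key, value in context.items():
        template = template.replace("{{ " + key + " }}", str(value))
    return template

def push_rendered_value(context: dict) -> None:
    # First task: evaluate the templated path in Python and push the result.
    xcom_store["path"] = render_once("{{ ds }}/data.csv", context)

def consume_value(context: dict) -> str:
    # Second task: pull the rendered value; no template string ever leaks.
    return xcom_store["path"]

context = {"ds": "2021-01-08"}
push_rendered_value(context)
print(consume_value(context))  # 2021-01-08/data.csv
```

The design point is that rendering happens eagerly in imperative code, so the consumer never sees an unexpanded `{{ ... }}` string no matter how rendering order is decided.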
If we support nested rendering we'd also need to worry about circular references, which would require a significantly more elaborate algorithm to handle properly. Dependency resolution is hell; don't go there if at all possible.
I looked at this and the problem turned out to be a bit different. True, we should not even attempt to solve recursive rendering of fields cross-referencing each other, but that was not the case here. It turned out this was a problem introduced by #8805, where instead of the original task we started to use a locked-down copy of it, while the context kept holding the original. This means that if someone used the `{{ task }}` variable, changes done by rendering templates were not visible in the task it referred to. I fixed that in #18516 by replacing the task in the context with the same locked-down copy. I'm not sure if this might have other side effects (but I think they could only be positive side effects :D).
With the change from #18516, I got this:
When we refer to `{{ task }}` in Jinja templates we can also refer to some of its fields, which are themselves templated. We cannot solve all the problems with such rendering (specifically, recursive rendering of fields used in Jinja templating might be problematic). Currently, whether you see the original or the rendered field depends solely on the order of `template_fields`.

However, that would not even explain the rendering problem described in apache#13559, where `kwargs` were defined after `op_args` and the rendering of `op_args` **should** work. It turned out the problem was a change introduced in apache#8805, which made the context effectively hold a DIFFERENT task than the current one. The context held the original task, while the current task was actually a locked copy of it (to allow resolving upstream args before locking). As a result, any changes done by rendering templates were not visible in the task accessed via the `{{ task }}` Jinja variable.

This change replaces the task stored in the context with the same copy that is used later during execution, so that at least "sequential" rendering works: templated fields that come earlier in `template_fields` can be used (and render correctly) in the following fields.

Fixes: apache#13559
`{{ var.value.get('const_' + dag_run.conf.get('arg1', '')) }}` works for me. Here I receive `arg1` as an argument in the DAG run, create a Variable at run time, and then access the Variable via `var`.
Apache Airflow version:
1.10.14 and 1.10.8.
Environment:
Python 3.6 and Airflow 1.10.14 on SQLite.
What happened:
Nested Jinja templates do not consistently render when running tasks. Rendering behavior during a `TaskInstance` run also differs from the Airflow UI and the `airflow render` CLI.
What you expected to happen:
Airflow should render nested Jinja templates consistently and completely across each interface. Coming from Airflow 1.8.2, this used to be the case.
This regression may have been introduced in 1.10.6 with a refactor of `BaseOperator` templating functionality (#5461).
Whether or not a nested layer renders seems to differ based on which arg is being templated in an operator, and perhaps on ordering. Furthermore, it seems the `airflow render` CLI and the Airflow UI each apply `TI.render_templates()` a second time, creating inconsistency in which nested templates get rendered.
There may be a bug in the way `BaseOperator.render_template()` observes/caches templated fields.
How to reproduce it:
From the most basic airflow setup
nested_template_bug.py
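The attached `nested_template_bug.py` is not included in this page, but the core symptom can be sketched without Airflow. With single-pass substitution (a toy stand-in for Jinja, with an illustrative `my_macro` name), whether a nested template expands depends entirely on substitution order, mirroring the inconsistency reported above:

```python
# Toy demonstration of order-dependent nested rendering: a single
# substitution pass either fully renders '{{ my_macro }}' or leaves the
# inner '{{ ds }}' raw, depending only on dict insertion order.

def render_once(value: str, context: dict) -> str:
    """Single-pass '{{ key }}' substitution (toy stand-in for Jinja)."""
    for key, repl in context.items():
        value = value.replace("{{ " + key + " }}", str(repl))
    return value

ctx_ds_first = {"ds": "2021-01-08", "my_macro": "{{ ds }}"}
ctx_macro_first = {"my_macro": "{{ ds }}", "ds": "2021-01-08"}

print(render_once("{{ my_macro }}", ctx_ds_first))     # {{ ds }}  (left raw)
print(render_once("{{ my_macro }}", ctx_macro_first))  # 2021-01-08 (fully rendered)
```

This also suggests why the UI and the render CLI can show different results than task execution: applying `render_once` a second time to the first (raw) output would finish the job, matching the report that those interfaces apply rendering twice.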