Checking the signature: `triggering_dataset_events` in airflow/utils/context.pyi is defined as `typing.Mapping[str, Collection[DatasetEvent | DatasetEventPydantic]]`. I assume the example, even after the correction, is still not correct; it rather needs to be (see airflow/models/dataset.py:275):
Both are correct.

Running:

results in:

(Granted, my test dag is `producer_dag` and not `load_snowflake_data` from the demo.) I'm open to either, but I like showing `source_dag_run` because it also hints at getting items like `.source_dag_run.data_interval_start`, which could be useful for the consumer dag.
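For illustration, the access pattern being discussed could be sketched roughly as below. This is not real Airflow code: `DagRun` and `DatasetEvent` here are simplified stand-ins for the Airflow models, and `producer_dag` / the dataset URI are hypothetical names; in a real task, `triggering_dataset_events` comes from the task context.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Collection, Mapping

# Simplified stand-ins for airflow.models.dagrun.DagRun and
# airflow.models.dataset.DatasetEvent -- the real classes have more fields.
@dataclass
class DagRun:
    dag_id: str
    data_interval_start: datetime

@dataclass
class DatasetEvent:
    source_dag_run: DagRun  # the relation modelled in airflow/models/dataset.py

def consumer_task(
    triggering_dataset_events: Mapping[str, Collection[DatasetEvent]],
) -> None:
    # Iterate events per dataset URI, as a consumer dag's task might.
    for uri, events in triggering_dataset_events.items():
        for event in events:
            run = event.source_dag_run
            print(uri, run.dag_id, run.data_interval_start)

# Hypothetical triggering events, keyed by dataset URI.
events = {
    "s3://bucket/key": [
        DatasetEvent(DagRun("producer_dag", datetime(2024, 1, 1)))
    ]
}
consumer_task(events)
```

Note that, as pointed out below, `source_dag_run` is only guaranteed on the ORM `DatasetEvent`; the sketch above ignores the `DatasetEventPydantic` case.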
Aaah, yes, you are right. I overlooked airflow/models/dataset.py:304, where the relation is modelled. The danger zone is that, per the typing, the value can also be of class `DatasetEventPydantic` if Airflow switches to using the internal API, and that class does not carry the relation attribute. The internal API is coming, so this adds a bit more API complexity here for the future benefit of multi-tenancy.