Send Slack alerts for DAG failures #249
Conversation
Just curious, to cut down the code duplication throughout, would it make sense to have a utility function/class and import it into these DAGs instead?
You might want to rebase this PR.
@josh-fell I thought about it, but wondered what would be a good place to keep such a util in the repo. Then thought to keep it in the DAG to make the DAG self-sufficient 🙂 Would creating a new dir `utils` under `airflow/include` and keeping it in there make sense?
@pankajkoti @josh-fell we could have a `monitoring` folder inside `airflow/include`. We should use the TaskFlow API, which calls this common method.
@sunank200 We're using `on_failure_callback` at the DAG level to leverage the Slack Notifier, which is already a parameter for the `dag` decorator. I don't think I understood what you mean by the TaskFlow API. Please explain.
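For context, the pattern under discussion attaches a notifier-style callback at the DAG level. Below is a minimal pure-Python sketch of that factory pattern; `send_slack_notification` here is a stub standing in for the real Slack Notifier from the Airflow Slack provider, and the message text and context keys are illustrative, not the actual Ask Astro code:

```python
def send_slack_notification(slack_conn_id: str, text: str):
    """Stub mimicking the Slack Notifier factory: it returns a callable
    that Airflow would invoke with the run context when a DAG run fails."""

    def _notify(context: dict) -> str:
        # The real notifier posts the rendered message to Slack via the
        # given connection; here we only render it for illustration.
        return f"[{slack_conn_id}] " + text.format(**context)

    return _notify


# Hypothetical wiring, mirroring the DAG-level `on_failure_callback`
# parameter of the `@dag` decorator:
on_failure_callback = send_slack_notification(
    slack_conn_id="slack_api_default",
    text="DAG {dag_id} failed on run {run_id}",
)
```

Airflow calls the returned callable with the failed run's context dict; the factory shape is what lets the same helper be shared across DAGs.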
@josh-fell Thanks for the suggestion. I created the required utility in

@sunank200 I have resolved the conflicts, but I did not understand the second point in your previous comment. Requesting re-reviews, please.
LGTM 👍
This approach looks good to me. I thought you would be using some other custom method for this implementation that would look for any error in the DAG run.

Have you tested this implementation? If yes, I think it's good to be merged.
LGTM
Yes, and I did paste a sample alert here: #249 (comment). @josh-fell I am merging the PR; appreciate your review, and please add more comments if you have any — I will address them in a subsequent PR.
@pankajkoti This is exactly what I was envisioning. Looks great!
Thanks @josh-fell, appreciate your review 🙏🏽
The PR adds `on_failure_callback` for all the Ask Astro DAGs to notify DAG failures to the desired alerts Slack channel by leveraging the Slack Notifier from the Apache Airflow Slack provider.

For the alerts to work, it will need a connection of type `slack` created in the deployment. If the connection ID for this connection is different than the default `slack_api_default`, then the connection ID needs to be set in the deployment as an environment variable named `ASK_ASTRO_ALERT_SLACK_CONN_ID`.

closes: #231
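The connection-ID override described above can be sketched as a small helper. This is a hedged illustration of the env-var fallback logic only — the actual utility in the Ask Astro repo may be structured differently:

```python
import os

# Default connection ID used by the Airflow Slack provider.
DEFAULT_SLACK_CONN_ID = "slack_api_default"


def resolve_slack_conn_id() -> str:
    """Use the deployment-level override when set, otherwise fall back
    to the Slack provider's default connection ID."""
    return os.environ.get("ASK_ASTRO_ALERT_SLACK_CONN_ID", DEFAULT_SLACK_CONN_ID)
```

The resolved ID would then be passed to the notifier attached as each DAG's `on_failure_callback`.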