[11912] [skip ci] Add how-to Guide for Azure operators #17355
Conversation
```
# :type options: dict
# :param azure_data_explorer_conn_id: Reference to the
# :ref:`Azure Data Explorer connection<howto/connection:adx>`.
# :type azure_data_explorer_conn_id: str
```
Maintaining the above parameters here would be problematic. For example, if a parameter is added, we may forget to update the parameter list here.
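A minimal sketch of the single-source-of-truth approach this comment argues for: document each parameter once, in the operator's own docstring, and let the docs render from there, so there is no second copy to forget. The operator name and fields below are hypothetical, not the provider's actual class.

```python
class AzureDataExplorerQueryOperator:
    """Run a KQL query against Azure Data Explorer (hypothetical sketch).

    :param query: KQL query to run.
    :type query: str
    :param options: Optional query properties passed to the client.
    :type options: dict
    :param azure_data_explorer_conn_id: Reference to the
        Azure Data Explorer connection.
    :type azure_data_explorer_conn_id: str
    """

    def __init__(
        self,
        query,
        options=None,
        azure_data_explorer_conn_id="azure_data_explorer_default",
    ):
        self.query = query
        self.options = options or {}
        self.azure_data_explorer_conn_id = azure_data_explorer_conn_id
```

Because the parameter documentation lives only in the docstring, tooling such as Sphinx autodoc can render it in the how-to guide without a hand-maintained duplicate.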
ephraimbuddy left a comment:
I think we should make each example have a system test too; that way, we can confirm that the example works.
Here's an example system test: https://github.com/apache/airflow/blob/main/tests/providers/microsoft/azure/operators/test_adls_delete_system.py
So I suggest we take this one example and one system test at a time.
I am back from a work-and-travel hiatus. If still acceptable, maintainers may re-assign this issue to me.
You’re still assigned to the issue; please feel free to continue.
@pyerbiz Do you need any help with this issue?
@kaxil Not right now. I'll try this weekend and seek help if I need it.
…Operator.
* created files
* added s3tosftp and examples
* vault var removed
* updated s3_to_sftp example and doc
* fixed directives
* ref error fixed
* ran static checks
@kaxil I had a question about the example DAG for adx.py. I am using GCP's BigQuery docs and the Postgres operator doc for reference. Based on them, a complete example should create a table, load data, query the data, and drop the table, and I have tried to do that in Kusto above. I am concerned about the connection_string_id used to connect to the ADX cluster. When I access ADX with Python on my local machine, I have to use the browser pop-up, credential-based login; other methods have caused me problems in the past. I'm going to try the AAD token method again and see if it works without pop-ups. Is there a preferred way for users to authenticate to ADX on Airflow? I ask because the other example DAG files for Azure all use a default "default_connection_id" string to show the example.
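On the authentication question: in Airflow, credentials normally live in a Connection rather than in the DAG itself, so an example DAG can simply reference a connection id and leave the auth method to that connection's extras, which lets maintainers use a non-interactive method such as an AAD application (service principal) instead of the browser pop-up. A standard-library sketch of building such an extras payload; the field names and values here are illustrative assumptions, not the ADX hook's confirmed schema:

```python
import json

# Hypothetical extras for an Azure Data Explorer connection using
# AAD application (service principal) auth, which needs no browser pop-up.
# Field names below are illustrative assumptions, not a confirmed schema.
extras = {
    "auth_method": "AAD_APP",
    "tenant": "<tenant-id>",  # placeholder; real values stay out of the DAG
}
extra_json = json.dumps(extras)
```

The example DAG then only needs `azure_data_explorer_conn_id="azure_data_explorer_default"` (or similar), and each user supplies whatever auth method their environment supports through the connection.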
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
Adding documentation and example dags for missing cases in Azure Providers.
closes: #11912
related: