
Snowflake destination: set query_tag in the UI #9467

Closed
pjleiwa6g opened this issue Jan 13, 2022 · 6 comments
pjleiwa6g commented Jan 13, 2022

Tell us about the problem you're trying to solve

I would like to know which queries were run by Airbyte in Snowflake. This can be used to determine costs and solve issues.

Describe the solution you’d like

In Snowflake, a session parameter called 'query_tag' can be set. At the moment, a fixed value ('query_tag=normalization') is set when using the basic normalization option. I would like this option to be extended:

  • The value for the query_tag should be set in the UI, in the config parameters of the Snowflake destination connector.
  • Multiple values should be entered like this: {"tool":"airbyte","datasource":"applicationX"}
  • It would be nice if the name of the source connector could be used as a variable, so it can be passed to the query_tag.
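As a sketch of what this could look like, the UI value could be serialized as JSON and applied to the session. The keys below mirror the example in this issue; the connector/config names are illustrative only, not Airbyte's actual implementation:

```python
import json

# Hypothetical: build a JSON query_tag from UI config values.
# "tool" and "datasource" are the example keys from this issue.
config = {"tool": "airbyte", "datasource": "applicationX"}
query_tag = json.dumps(config, separators=(",", ":"))

# The destination could then set the tag on its Snowflake session with:
statement = f"ALTER SESSION SET QUERY_TAG = '{query_tag}'"
print(statement)
```

`ALTER SESSION SET QUERY_TAG` is standard Snowflake syntax; everything around it here is an assumption about how the config could be wired up.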

Describe the alternative you’ve considered or used

An alternative is to set the parameter value at the user level. This means an extra user would have to be created, and if you want different query_tags per data source, multiple users would have to be created.
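For reference, the user-level alternative described above could look like this in Snowflake (the user name is a placeholder, one dedicated user per data source):

```sql
-- Hypothetical: a dedicated user per data source, each with its own tag.
ALTER USER airbyte_applicationx
  SET QUERY_TAG = '{"tool":"airbyte","datasource":"applicationX"}';
```

This works, but as noted it multiplies the number of users to manage, which is the drawback being raised here.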

Additional context

See screenshot for an example

[Screenshot: airbyte_query_tag]


@RogerDataNL

Being able to use query tags to dissect which connector / customer / source we are running Airbyte for is important to us too!

@alafanechere alafanechere changed the title Snowflake destination: Have the Snowflake query_tag set in the UI Snowflake destination: set query_tag in the UI Jan 13, 2022
@noahkawasakigoogle
Contributor

noahkawasakigoogle commented Jan 14, 2022

I'd actually suggest broadening this issue to be "Snowflake Destination/Source: Allow additional JDBC parameters" so users can input any additional JDBC parameters.

This would mean anything from Snowflake's JDBC Driver Parameters and Snowflake Parameters could work, which gives users far more power to customize their connections (and saves you from requests to add a UI input for every particular parameter someone might need). I tested with QUERY_TAG as a JDBC string parameter and it went through.
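As a sketch of what that test could look like, Snowflake's JDBC driver accepts session parameters as connection-string properties; the account, warehouse, and database names below are placeholders:

```
jdbc:snowflake://myaccount.snowflakecomputing.com/?warehouse=MY_WH&db=MY_DB&query_tag=airbyte
```

Any query run over such a connection would then carry `airbyte` as its QUERY_TAG in Snowflake's query history.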

This would support setting QUERY_TAG through an additional params input instead. There would just need to be a quick check here: https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/bases/base-normalization/normalization/transform_config/transform.py#L207 to first see if the config contains the QUERY_TAG key (case-insensitive) and only set Airbyte's default if it's empty.
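The suggested check could look roughly like this. This is only a sketch: the real logic lives in transform.py, and the function name, config shape, and default value here are assumptions:

```python
def resolve_query_tag(config: dict, default: str = "normalization") -> str:
    """Return a user-supplied QUERY_TAG if present (matched
    case-insensitively), otherwise fall back to Airbyte's default."""
    for key, value in config.items():
        if key.lower() == "query_tag" and value:
            return value
    return default

# A user-supplied tag wins; an empty or absent one falls back to the default.
print(resolve_query_tag({"QUERY_TAG": "my_tag"}))
print(resolve_query_tag({}))
```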

EDIT: Okay, I suggested this because I assumed Airbyte's Source/Destination objects had an input for adding additional parameters. After looking more carefully at some implementations, I found this is not true. I've linked another issue I found that would solve this one.

@noahkawasakigoogle
Contributor

Working on this here: https://github.com/airbytehq/airbyte/pull/9623/files

First time I've done something with connectors for Airbyte, so it might take a bit to make sure I'm doing everything needed.

@grishick
Contributor

Hey team! Please add your planning poker estimate with ZenHub @tuliren @edgao @subodh1810

@sashaNeshcheret
Contributor

It seems the issue was fixed by the PR. @pjleiwa6g, please confirm. @grishick, this issue may be closed as fixed.

@pjleiwa6g
Author

@sashaNeshcheret Yes, it's fixed by #9623.
