
Join PipeHistory queries with pipes for additional tags #16222

Merged: 16 commits into DataDog:master on Dec 7, 2023

Conversation

yogevyuval
Contributor

@yogevyuval yogevyuval commented Nov 15, 2023

What does this PR do?

This PR adds two additional tags, `pipe_schema` and `pipe_catalog`, to the PipeHistory query.

Motivation

Without these tags, these metrics are almost useless. Imagine the following:
Two schemas, one for a dev environment and one for production. Each schema has a pipe named “INPUT_PIPE” that ingests data into Snowflake.
Without these tags the metric is useless, because it aggregates across unrelated schemas.
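The motivation above can be sketched concretely. This is a hypothetical illustration of the joined query, not the PR's actual diff; the view and column names (`pipe_history`, `pipes`, `pipe_id`, `credits_used`) are assumptions modeled on Snowflake's ACCOUNT_USAGE views.

```python
# Hypothetical sketch of the joined query (not the PR's exact diff).
# View/column names are assumptions based on Snowflake's ACCOUNT_USAGE views.
PIPE_HISTORY_SQL = """
SELECT
    p.pipe_schema,
    p.pipe_catalog,
    ph.pipe_name,
    sum(ph.credits_used)
FROM pipe_history ph
JOIN pipes p ON ph.pipe_id = p.pipe_id
GROUP BY 1, 2, 3
"""
```

Grouping by schema and catalog keeps the dev `INPUT_PIPE` and the production `INPUT_PIPE` as separate series instead of one aggregated, meaningless metric.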

  • Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
  • Changelog entries must be created for modifications to shipped code
  • Add the qa/skip-qa label if the PR doesn't need to be tested during QA.

@yogevyuval yogevyuval requested a review from a team as a code owner November 15, 2023 20:06

codecov bot commented Nov 15, 2023

Codecov Report

Merging #16222 (1d0ebab) into master (a6de287) will increase coverage by 0.09%.
Report is 1 commit behind head on master.
The diff coverage is n/a.

Additional details and impacted files
Flag Coverage Δ
activemq ?
cassandra ?
hive ?
hivemq ?
ignite ?
jboss_wildfly ?
kafka ?
presto ?
snowflake 95.95% <ø> (ø)
solr ?
tomcat ?

Flags with carried forward coverage won't be shown.


github-actions bot commented Nov 15, 2023

Test Results

2 files, 2 suites, 12s ⏱️
58 tests: 58 ✔️ passed, 0 💤 skipped, 0 failed
60 runs: 58 ✔️ passed, 2 💤 skipped, 0 failed

Results for commit 1d0ebab.

♻️ This comment has been updated with latest results.

@yogevyuval yogevyuval requested a review from a team as a code owner November 15, 2023 20:15
@ghost ghost added the documentation label Nov 15, 2023
hestonhoffman
hestonhoffman previously approved these changes Nov 15, 2023
@yogevyuval
Contributor Author

@hestonhoffman Thanks!
First contribution for me here. How do I fix the changelog parts? Also, I'm not sure about the tests here: will they fail if the query is wrong?

@hestonhoffman
Contributor

@hestonhoffman Thanks! First contribution for me here. How do I fix the changelog parts? Also, I'm not sure about the tests here: will they fail if the query is wrong?

That's a good question for @DataDog/agent-integrations :)


***Added***:

* Added `pipe_schema` and `pipe_catalog` tags to snowpipe metrics. ([#16222](https://github.com/DataDog/integrations-core/pull/16222))
Contributor


@yogevyuval would you mind undoing this change, updating your ddev to the latest version, and running `ddev release changelog new`? We recently reworked the way we generate changelogs.

Contributor Author


@iliakur Done. Can you take a look again?
Also, can anyone help with validating and tests?

Contributor


Hmm, not sure why this file got added. Was this the result of running the ddev command?

Could you post the output of `git status` here? After we remove this file, the PR lgtm!

Contributor Author


Looks like a mistake; I removed it.
Is there any way to confirm my query doesn't break anything?

@iliakur iliakur removed the ddev label Nov 21, 2023
@yogevyuval
Contributor Author

@iliakur What's needed next here? Any way to validate the query?

@iliakur
Contributor

iliakur commented Nov 24, 2023

@yogevyuval sorry about the delay. I'm about to go on holiday, my colleague @alopezz will take over next week. Have a good weekend and thanks again for your patience!

Contributor

@alopezz alopezz left a comment


Sadly, we don't have an automated way of testing the integration against an actual instance of Snowflake. If you want, you can add a test with a mocked response such as those on https://github.com/DataDog/integrations-core/blob/master/snowflake/tests/test_queries.py, but since I don't see one already in place for this query, I'm not going to block the PR based on that.

I'm going to test the PR out manually to make sure the query is valid before approval.
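In the spirit of the mocked tests linked above, here is a minimal, self-contained sketch of that approach. It uses only `unittest.mock` with made-up row data and a stand-in function for the check's query path; the real tests in test_queries.py use the integration's own helpers, and the function and tag names here are illustrative.

```python
from unittest import mock

def collect_pipe_tags(execute_query, sql):
    """Stand-in for the check's query path: run the query, build tags per row."""
    tags = []
    for pipe_name, pipe_schema, pipe_catalog, _credits in execute_query(sql):
        tags.append(
            ["pipe:%s" % pipe_name,
             "pipe_schema:%s" % pipe_schema,
             "pipe_catalog:%s" % pipe_catalog]
        )
    return tags

# Mock the Snowflake response instead of querying a live instance.
fake_rows = [
    ("INPUT_PIPE", "DEV", "ANALYTICS", 1.5),
    ("INPUT_PIPE", "PROD", "ANALYTICS", 7.0),
]
execute_query = mock.Mock(return_value=fake_rows)
tags = collect_pipe_tags(execute_query, "SELECT ...")
# The two same-named pipes now produce distinct tag sets.
```

A test like this would catch regressions in the tag-building logic, though not an invalid SQL query, which still needs a real Snowflake connection to verify.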

Review comments on snowflake/datadog_checks/snowflake/queries.py (outdated, resolved)
yogevyuval and others added 2 commits November 27, 2023 10:48
Co-authored-by: Alex Lopez <alex.lopez.zorzano@gmail.com>
Co-authored-by: Alex Lopez <alex.lopez.zorzano@gmail.com>
@yogevyuval
Contributor Author

Sadly, we don't have an automated way of testing the integration against an actual instance of Snowflake. If you want, you can add a test with a mocked response such as those on https://github.com/DataDog/integrations-core/blob/master/snowflake/tests/test_queries.py, but since I don't see one already in place for this query, I'm not going to block the PR based on that.

I'm going to test the PR out manually to make sure the query is valid before approval.

Fixed the PR comments (and adjusted the changelog entry accordingly).
Appreciate you helping out with manual testing. Let me know if everything worked out or if additional changes are needed!

@yogevyuval
Contributor Author

@alopezz Did you have a chance to run this? Anything else from my side needed?

@alopezz
Contributor

alopezz commented Dec 5, 2023

Sorry for the delay. I was trying to get the necessary data into a test account to get some data from the modified query but I haven't been able to do so. The query itself is fine in that it doesn't produce an error, I'm simply not getting any results for extra confirmation.

Would you be able to share some example results of that query? Since you've found the need to improve the query I suppose you should be able to get some data from it. I can then do some final testing out of that.

@yogevyuval
Contributor Author

Sorry for the delay. I was trying to get the necessary data into a test account to get some data from the modified query but I haven't been able to do so. The query itself is fine in that it doesn't produce an error, I'm simply not getting any results for extra confirmation.

@alopezz
Yes, I ran this query on my account and it worked well. The reason I was a bit worried is actually things like whether to specify the database and schema before the table name: for example, the original code uses `FROM pipe_history` without any database or schema, while in the join I added I specified `snowflake.account_usage.pipes`.

So I wanted to make sure the query actually compiles, with no permission errors or other issues like that.

@alopezz
Contributor

alopezz commented Dec 7, 2023

@alopezz Yes, I ran this query on my account and it worked well. The reason I was a bit worried is actually things like whether to specify the database and schema before the table name: for example, the original code uses `FROM pipe_history` without any database or schema, while in the join I added I specified `snowflake.account_usage.pipes`.

So I wanted to make sure the query actually compiles, with no permission errors or other issues like that.

I see what you mean (I hadn't paid close enough attention to that); technically it shouldn't matter, because that query is always meant to be run against the SNOWFLAKE database and is only available when the schema is set to ACCOUNT_USAGE (see config), so both qualifying and not qualifying the table name work. You might want to remove the qualification from the reference to the pipes view for consistency with the rest of the existing queries; other than that, I consider this ready to merge.
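The point about qualification can be illustrated at the name-resolution level: with the session pinned to the SNOWFLAKE database and the ACCOUNT_USAGE schema, the short and fully qualified spellings name the same view. This is a toy, string-level sketch only, not Snowflake's actual resolver; the `resolve` helper is hypothetical.

```python
# Toy illustration of name resolution under a fixed session context.
# The resolve() helper is hypothetical, not part of the integration.
SESSION_DATABASE = "snowflake"
SESSION_SCHEMA = "account_usage"

def resolve(table_ref):
    """Expand a bare table name using the session's database and schema."""
    parts = table_ref.split(".")
    if len(parts) == 1:  # bare name -> prepend session database.schema
        return f"{SESSION_DATABASE}.{SESSION_SCHEMA}.{parts[0]}"
    return table_ref     # already fully qualified

# Both spellings point at the same object when the session is set correctly:
assert resolve("pipes") == resolve("snowflake.account_usage.pipes")
```

This is why dropping the qualification is purely a consistency choice here rather than a behavioral one.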

@yogevyuval
Contributor Author

@alopezz Yes, I ran this query on my account and it worked well. The reason I was a bit worried is actually things like whether to specify the database and schema before the table name: for example, the original code uses `FROM pipe_history` without any database or schema, while in the join I added I specified `snowflake.account_usage.pipes`.
So I wanted to make sure the query actually compiles, with no permission errors or other issues like that.

I see what you mean (I hadn't paid close enough attention to that); technically it shouldn't matter, because that query is always meant to be run against the SNOWFLAKE database and is only available when the schema is set to ACCOUNT_USAGE (see config), so both qualifying and not qualifying the table name work. You might want to remove the qualification from the reference to the pipes view for consistency with the rest of the existing queries; other than that, I consider this ready to merge.

@alopezz I removed the qualifications

Contributor

@alopezz alopezz left a comment


Nice, thanks for your patience!

@alopezz alopezz merged commit 563ef4d into DataDog:master Dec 7, 2023
34 checks passed
@yogevyuval yogevyuval mentioned this pull request Dec 26, 2023