
Snowflake and BigQuery destination V2 #29683

Closed
evantahler wants to merge 11 commits from the evan/bigquery-snowflake-v2 branch

Conversation

@evantahler (Contributor) commented Aug 21, 2023

@github-actions (bot) commented Aug 21, 2023

Before Merging a Connector Pull Request

Wow! What a great pull request you have here! 🎉

To merge this PR, ensure the following has been done/considered for each connector added or updated:

  • PR name follows PR naming conventions
  • Breaking changes are considered. If a Breaking Change is being introduced, ensure an Airbyte engineer has created a Breaking Change Plan.
  • Connector version has been incremented in the Dockerfile and metadata.yaml according to our Semantic Versioning for Connectors guidelines
  • You've updated the connector's metadata.yaml file with any other relevant changes, including a breakingChanges entry for major version bumps. See metadata.yaml docs
  • Secrets in the connector's spec are annotated with airbyte_secret
  • All documentation files are up to date. (README.md, bootstrap.md, docs.md, etc...)
  • Changelog updated in docs/integrations/<source or destination>/<name>.md with an entry for the new version. See changelog example
  • Migration guide updated in docs/integrations/<source or destination>/<name>-migrations.md with an entry for the new version, if the version is a breaking change. See migration guide example
  • If set, you've ensured the icon is present in the platform-internal repo. (Docs)

If the checklist is complete but the CI check is failing:

  1. Check for hidden checklists in your PR description

  2. Toggle the GitHub label checklist-action-run on/off to re-run the checklist CI.

Comment on lines 274 to 275
| 3.0.0 | 2023-08-27 | [xxx](https://github.com/airbytehq/airbyte/pull/xxx) | Destinations V2 |
| 2.0.0 | 2023-08-09 | [\#28894](https://github.com/airbytehq/airbyte/pull/29236) | Remove support for Snowflake GCS/S3 loading method in favor of Snowflake Internal staging |
@evantahler (Contributor, Author):

Of note: Snowflake + Destinations V2 is actually "v3.0.0"

@evantahler evantahler marked this pull request as ready for review August 21, 2023 21:09
@evantahler evantahler requested review from a team as code owners August 21, 2023 21:09
Comment on lines 12 to 16
 public static boolean isDestinationV2() {
-    return DestinationConfig.getInstance().getBooleanValue("use_1s1t_format");
+    //TODO: Refactor this whole class away
+    // return DestinationConfig.getInstance().getBooleanValue("use_1s1t_format");
+    return true;
 }
@evantahler (Contributor, Author):

This is the smallest possible change to "force" V2 destinations to be enabled.

@github-actions (bot)

destination-bigquery test report (commit a4657f1954) - ❌

⏲️ Total pipeline duration: 40mn53s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-bigquery test

@github-actions (bot)

destination-snowflake test report (commit a4657f1954) - ❌

⏲️ Total pipeline duration: 32mn23s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-snowflake test

@github-actions (bot)

destination-snowflake test report (commit b82ab8b780) - ❌

⏲️ Total pipeline duration: 24mn44s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-snowflake test

@edgao (Contributor) left a comment:

Do you also need to update run_with_normalization.sh to never run normalization? Or is that covered by the metadata change?

Also, I think a lot of DATs (destination acceptance tests) are going to fail here, since they assume normalized tables exist.

@github-actions (bot)

destination-bigquery test report (commit b82ab8b780) - ❌

⏲️ Total pipeline duration: 36mn57s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-bigquery test

@evantahler (Contributor, Author):

> Do you also need to update run_with_normalization.sh to never run normalization? Or is that covered by the metadata change?

Good point! The platform should be good about not running a normalization container now that the destinations no longer have a normalizationConfig entry in their metadata (https://github.com/airbytehq/airbyte-platform-internal/pull/8314), but the normal READ container might still get the use_1s1t_format option from an old connection's config file.

@edgao (Contributor)

edgao commented Aug 22, 2023

I just realized that we should make a second copy of that script if we want to run beta periods for other connectors (so we would have an always-DV2 script and a script that supports both modes). Problem for next week, though!

@evantahler (Contributor, Author)

evantahler commented Aug 22, 2023

You know what? The fine folks on Connector Ops already have us covered!

The build script only uses the /airbyte/run_with_normalization.sh entry point if context.connector.supports_normalization and DESTINATION_NORMALIZATION_BUILD_CONFIGURATION[context.connector.technical_name]["supports_in_connector_normalization"] are both true. These metadata updates no longer pass those checks (see the sketch below).
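
For illustration, a minimal sketch of that gating in Python (not the actual airbyte-ci source; the Connector class and the example configuration entry are hypothetical, and only the two condition names mirror the build logic quoted above):

from dataclasses import dataclass

# Hypothetical stand-in for airbyte-ci's per-connector build configuration.
DESTINATION_NORMALIZATION_BUILD_CONFIGURATION = {
    "destination-bigquery": {"supports_in_connector_normalization": True},
}

@dataclass
class Connector:
    technical_name: str
    supports_normalization: bool  # in airbyte-ci this reflects normalizationConfig in metadata.yaml

def uses_normalization_entrypoint(connector: Connector) -> bool:
    """Pick /airbyte/run_with_normalization.sh only when BOTH checks pass;
    otherwise the connector keeps its default entrypoint."""
    build_config = DESTINATION_NORMALIZATION_BUILD_CONFIGURATION.get(
        connector.technical_name, {}
    )
    return (
        connector.supports_normalization
        and build_config.get("supports_in_connector_normalization", False)
    )

# With normalizationConfig removed from metadata.yaml in this PR,
# supports_normalization becomes False, so the check fails:
print(uses_normalization_entrypoint(
    Connector("destination-bigquery", supports_normalization=False)))  # False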

@github-actions (bot)

destination-snowflake test report (commit 71475e9d00) - ❌

⏲️ Total pipeline duration: 27mn01s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-snowflake test

@github-actions (bot)

destination-bigquery test report (commit 71475e9d00) - ❌

⏲️ Total pipeline duration: 41mn11s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-bigquery test

@github-actions (bot)

destination-snowflake test report (commit 195ce56d4e) - ❌

⏲️ Total pipeline duration: 26mn18s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-snowflake test

@github-actions (bot)

destination-bigquery test report (commit 195ce56d4e) - ❌

⏲️ Total pipeline duration: 37mn08s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-bigquery test

@github-actions (bot)

destination-snowflake test report (commit 912e60c225) - ❌

⏲️ Total pipeline duration: 27mn42s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-snowflake test

@github-actions (bot)

destination-bigquery test report (commit 912e60c225) - ❌

⏲️ Total pipeline duration: 44mn41s

Steps:
Java Connector Unit Tests
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests only run on PRs marked ready for review. Set your PR to draft mode on follow-up commits to avoid flooding the CI engine and upstream services.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-bigquery test

@evantahler evantahler closed this Aug 24, 2023
@evantahler evantahler deleted the evan/bigquery-snowflake-v2 branch August 24, 2023 00:30
@evantahler (Contributor, Author)

evantahler commented Aug 24, 2023

Closed in favor of #29783

Linked issues: Dev images of v2.0.0 connectors to test

4 participants