
Conversation

@aa3pankaj
Contributor

  • Adding new transfer operator for Snowflake to S3
  • Adding on_error option in S3 to snowflake
  • Updates related to query_ids and execution_info


@boring-cyborg boring-cyborg bot added the area:providers and provider:snowflake (Issues related to Snowflake provider) labels Dec 22, 2021
@boring-cyborg

boring-cyborg bot commented Dec 22, 2021

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
Here are some useful points:

  • Pay attention to the quality of your code (flake8, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide and consider adding an example DAG that shows how users should use it.
  • Consider using Breeze environment for testing locally, it’s a heavy docker but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
Apache Airflow is a community-driven project and together we are making it better 🚀.
In case of doubts contact the developers at:
Mailing List: dev@airflow.apache.org
Slack: https://s.apache.org/airflow-slack

@aa3pankaj aa3pankaj changed the title from "Adding Snowflake to S3 transfer operator, Updates in S3 to snowflake" to "Adding snowflake_to_s3 transfer operator, Updates in s3_to_snowflake" Dec 22, 2021
@aa3pankaj aa3pankaj marked this pull request as ready for review December 22, 2021 10:33
@mik-laj
Member

mik-laj commented Dec 22, 2021

Why is just SnowflakeOperator not enough? The operator logic is quite simple, and I think we can expect the user to just copy the SQL query from the Snowflake documentation.
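For illustration, a minimal sketch of the alternative being suggested: running the COPY INTO statement from the Snowflake documentation directly with SnowflakeOperator. The connection id, stage, table, prefix, and file format below are placeholders, not taken from this PR.

```python
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Placeholder stage/table/prefix names; the COPY INTO syntax follows the
# Snowflake documentation for unloading table data into a stage.
unload_to_s3 = SnowflakeOperator(
    task_id="unload_to_s3",
    snowflake_conn_id="snowflake_default",
    sql="""
        COPY INTO @my_s3_stage/my_prefix/
        FROM my_table
        FILE_FORMAT = (TYPE = CSV)
        HEADER = TRUE
        OVERWRITE = TRUE
    """,
)
```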

@aa3pankaj
Contributor Author

aa3pankaj commented Dec 22, 2021

Why is just SnowflakeOperator not enough? The operator logic is quite simple, and I think we can expect the user to just copy the SQL query from the Snowflake documentation.

@mik-laj We need this for the same reason we have SnowflakeOperator even though SnowflakeHook can be used directly to run queries.

The S3ToSnowflake-related changes are needed because that operator is already merged, and it does not currently expose query_ids or execution_info.
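For context, a hedged sketch (not this PR's actual diff) of what exposing query_ids from the operator could look like, assuming SnowflakeHook records the ids of the statements it ran on hook.query_ids; copy_query is a placeholder for the statement the operator builds.

```python
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

def execute(self, context):
    # copy_query is the COPY INTO statement built by the operator (placeholder here).
    hook = SnowflakeHook(snowflake_conn_id=self.snowflake_conn_id)
    hook.run(copy_query, autocommit=self.autocommit)
    # Surface the statement ids on the operator instance so callers and
    # tests can inspect which Snowflake queries the task executed.
    self.query_ids = hook.query_ids
```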

@aa3pankaj
Contributor Author

@mik-laj @potiuk @turbaszek any update on this PR?
This also includes changes to the already merged S3ToSnowflake operator; could you please review the changes?

@JavierLopezT
Contributor

@aa3pankaj I already tried to merge the SnowflakeToS3Operator, and it seems you would have to create a generic SnowflakeToStorageOperator instead. You can read the comments on my already closed pull request: #14415

)

sql_parts = [
f"COPY INTO @{self.stage}/{self.prefix or ''}",
Member

Stage is the identifier of the object, so it can contain spaces. We should try to ensure that this identifier is safely passed to the query being built, e.g. the use of spaces in the identifier does not cause problems. See: https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html

Contributor Author

Here we expect the user to pass a valid identifier. Anyway, do you expect SnowflakeToS3Operator to validate the stage name with some regex?
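One hedged sketch of the identifier handling being discussed, following the quoting rules in the linked Snowflake identifier-syntax page (wrap the identifier in double quotes and double any embedded quote). The helper name is illustrative, not part of this PR.

```python
def quote_snowflake_identifier(name: str) -> str:
    """Wrap an identifier in double quotes so spaces and other special
    characters are passed to Snowflake safely."""
    escaped = name.replace('"', '""')
    return f'"{escaped}"'

# e.g. building the stage reference in the COPY INTO statement:
# f"COPY INTO @{quote_snowflake_identifier(self.stage)}/{self.prefix or ''}"
```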

@pytest.mark.parametrize("overwrite", [None, True, False])
@pytest.mark.parametrize("single", [None, True, False])
@mock.patch("airflow.providers.snowflake.hooks.snowflake.SnowflakeHook.run")
def test_execute(self, mock_run, schema, prefix, unload_sql, on_error, header, overwrite, single):
Member

I don't understand this test. It hardly tests anything and is mostly a duplicate of the operator logic. We should compare against a few copied generated statements instead of repeating the statement-generating code.
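A hedged sketch of the testing style being suggested: assert against a literal, pre-copied statement for a few representative parameter sets instead of rebuilding the SQL inside the test. The operator name, its arguments, and how the statement reaches SnowflakeHook.run are assumptions, since this PR was never merged.

```python
from unittest import mock

@mock.patch("airflow.providers.snowflake.hooks.snowflake.SnowflakeHook.run")
def test_execute_builds_expected_copy_statement(mock_run):
    # Hypothetical operator and arguments, loosely following this PR's test.
    operator = SnowflakeToS3Operator(
        task_id="unload",
        stage="my_stage",
        prefix="data/",
        table="my_table",
        file_format="(TYPE = CSV)",
    )
    operator.execute(None)

    # Compare against a literal expected fragment rather than re-running
    # the operator's statement-building logic in the test.
    expected_sql = "COPY INTO @my_stage/data/"
    args, kwargs = mock_run.call_args
    rendered_sql = str(kwargs.get("sql", args[0] if args else ""))
    assert expected_sql in rendered_sql
```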

@github-actions

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label (Stale PRs per the .github/workflows/stale.yml policy file) Feb 12, 2022
@aa3pankaj aa3pankaj closed this Feb 17, 2022