This is a PipelineWise compatible tap connector.
The recommended method of running this tap is to use it from PipelineWise. When running it from PipelineWise you don't need to configure this tap with JSON files, and most things are automated. Please check the related documentation at Tap Snowflake.
If you want to run this Singer Tap independently please read further.
It's recommended to use a virtualenv:
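A minimal sketch of setting up such a virtualenv and installing the tap (the PyPI package name `pipelinewise-tap-snowflake` is an assumption here):

```shell
# Create and activate an isolated virtualenv for the tap.
python3 -m venv venv
. venv/bin/activate

# Install the tap from PyPI (package name assumed to be pipelinewise-tap-snowflake).
pip install --upgrade pip
pip install pipelinewise-tap-snowflake
```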
- Create a `config.json` file with connection details to Snowflake; here is a sample config file.
`tables` is also a mandatory parameter, used to avoid a long-running catalog discovery process. Please specify fully qualified table and view names, and only the ones you need to extract; otherwise this tap's discovery mode can run for a very long time. Discovery mode analyses table structures, and Snowflake does not handle selecting many rows from `INFORMATION_SCHEMA` or running `SHOW` commands that return many rows well. Please be as specific as possible.
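For illustration, a `config.json` could look like the sketch below. The exact key names other than `tables` and the authentication keys are assumptions, inferred from the environment variables used by the tests later in this document:

```json
{
  "account": "rtxxxxx.eu-central-1",
  "dbname": "database_name",
  "user": "my_user",
  "password": "password",
  "warehouse": "my_virtual_warehouse",
  "tables": "db.schema.table1,db.schema.table2"
}
```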
- Run the tap in discovery mode to generate a `properties.json` (e.g. `tap-snowflake --config config.json --discover > properties.json`) and select the streams to replicate.
- Run the tap like any other Singer compatible tap:

```shell
tap-snowflake --config config.json --properties properties.json --state state.json
```
You can either use basic user/password authentication or key pair authentication.

To use basic authentication, set the `password` in the config.

To use key pair authentication, omit the `password` and instead provide the `private_key_path` to the unencrypted version of the private key and, optionally, the `private_key_passphrase`.
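A key pair authentication config might then look like this sketch (the `private_key_passphrase` key name is an assumption, inferred from the `TAP_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE` environment variable used by the tests; the remaining keys mirror the user/password sample):

```json
{
  "account": "rtxxxxx.eu-central-1",
  "dbname": "database_name",
  "user": "my_user",
  "private_key_path": "/path/to/rsa_key.p8",
  "private_key_passphrase": "my_passphrase",
  "warehouse": "my_virtual_warehouse",
  "tables": "db.schema.table1"
}
```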
The tap can be invoked in discovery mode to find the available tables and columns in the database:
```shell
$ tap-snowflake --config config.json --discover
```
A discovered catalog is output, with a JSON-schema description of each table. A source table directly corresponds to a Singer stream.
The two ways to replicate a given table are `FULL_TABLE` and `INCREMENTAL`:

- Full-table replication extracts all data from the source table each time the tap is invoked.
- Incremental replication works in conjunction with a state file to only extract new records each time the tap is invoked. This requires a replication key to be specified in the table's metadata as well.
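As a sketch of what selecting a stream for incremental replication can look like in the discovered catalog (the stream id and replication key column here are illustrative; the metadata keys follow standard Singer conventions):

```json
{
  "streams": [
    {
      "tap_stream_id": "db-schema-table1",
      "metadata": [
        {
          "breadcrumb": [],
          "metadata": {
            "selected": true,
            "replication-method": "INCREMENTAL",
            "replication-key": "updated_at"
          }
        }
      ]
    }
  ]
}
```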
- Define the environment variables required to run the tests:
```shell
export TAP_SNOWFLAKE_ACCOUNT=<snowflake-account-name>
export TAP_SNOWFLAKE_DBNAME=<snowflake-database-name>
export TAP_SNOWFLAKE_USER=<snowflake-user>
export TAP_SNOWFLAKE_PASSWORD=<snowflake-password>
export TAP_SNOWFLAKE_PRIVATE_KEY_PATH=<snowflake-pk-path>
export TAP_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE=<snowflake-passphrase>
export TAP_SNOWFLAKE_WAREHOUSE=<snowflake-warehouse>
```
- Install python dependencies
- To run unit tests:
PS: There are no unit tests at the time of writing this document
- To run integration tests:

```shell
make venv format pylint
```
Apache License Version 2.0
See LICENSE for the full text.