Description
Apache Airflow version:
1.10.10
Environment:
Docker-compose
What happened:
I used S3ToSnowflakeTransfer and it generated the following COPY statement:
COPY INTO ods.stg_users
FROM @s3_airflow_ods/
files=airflow/sta_tables/import_users/users_20200911T112003de.csv
file_format=REDSHIFT_UNLOAD_FILE_FORMAT;
This throws several errors. First, it reports:
SQL compilation error: syntax error line 3 at position 13 unexpected '/'. syntax error line 3 at position 13 unexpected '/'. syntax error line 3 at position 24 unexpected '/'. syntax error line 3 at position 37 unexpected '/'. syntax error line 3 at position 61 unexpected '.'. syntax error line 4 at position 0 unexpected 'file_format'.
Even if the syntax had been valid, the file_format would still have raised an error, because I need to specify both the database and the schema for it.
Furthermore, once the file_format problem is solved, Snowflake raises: Cannot perform operation. This session does not have a current schema
What you expected to happen:
The copy statement that works for me has been:
COPY INTO ods.stg_users
FROM @s3_airflow_ods/airflow/sta_tables/import_users/users_20200911T122850de.csv
file_format=dwh.dw.REDSHIFT_UNLOAD_FILE_FORMAT;
So for me, the copy statement had to be changed from:
COPY INTO {schema}.{table}
FROM @{stage}/
files={s3_keys}
file_format={file_format}
to
USE SCHEMA {schema};
COPY INTO {schema}.{table}
FROM @{stage}/{s3_keys}
file_format={database}.{file_format_schema}.{file_format}
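The proposed template could be sketched as a small Python helper. This is only an illustration of the string the operator would need to render; the function name and parameter names are hypothetical, not the actual signature of S3ToSnowflakeTransfer:

```python
def build_copy_statement(database, schema, table, stage, s3_key,
                         file_format_schema, file_format):
    """Render a fully qualified Snowflake COPY statement (sketch only).

    All names here are illustrative; the real operator's parameters
    may differ.
    """
    return (
        "USE SCHEMA {schema};\n"
        "COPY INTO {schema}.{table}\n"
        "FROM @{stage}/{s3_key}\n"
        "file_format={database}.{file_format_schema}.{file_format};"
    ).format(
        database=database,
        schema=schema,
        table=table,
        stage=stage,
        s3_key=s3_key,
        file_format_schema=file_format_schema,
        file_format=file_format,
    )

sql = build_copy_statement(
    database="dwh",
    schema="ods",
    table="stg_users",
    stage="s3_airflow_ods",
    s3_key="airflow/sta_tables/import_users/users_20200911T122850de.csv",
    file_format_schema="dw",
    file_format="REDSHIFT_UNLOAD_FILE_FORMAT",
)
print(sql)
```

With the values from my working statement above, this renders exactly the COPY command that succeeded for me.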
How to reproduce
Use a file_format that lives in a different schema than the table you are copying into.
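For illustration, the statement the operator currently renders can be approximated like this (a sketch of the rendered SQL, not the operator's actual code):

```python
# Approximation of the COPY statement S3ToSnowflakeTransfer renders in
# 1.10.10: the S3 key lands on its own "files=" line after a bare
# "@stage/", and file_format is not qualified with database/schema.
current_template = (
    "COPY INTO {schema}.{table}\n"
    "FROM @{stage}/\n"
    "files={s3_key}\n"
    "file_format={file_format};"
)

sql = current_template.format(
    schema="ods",
    table="stg_users",
    stage="s3_airflow_ods",
    s3_key="airflow/sta_tables/import_users/users_20200911T112003de.csv",
    file_format="REDSHIFT_UNLOAD_FILE_FORMAT",
)
print(sql)  # Snowflake rejects this with the syntax errors quoted above
```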
@feluelle Could you take a look at this, please? As soon as you (or someone else) confirm that this needs to (or can) be changed, I will start working on it.