Closed
Labels
bug, enhancement
Description
Is it possible to add an ability similar to glueContext.write_dynamic_frame.from_jdbc_conf(), as below?
datasink4 = glueContext.write_dynamic_frame.from_jdbc_conf(
    frame=datasource0,
    catalog_connection="test_red",
    connection_options={
        "preactions": "truncate table target_table;",
        "dbtable": "target_table",
        "database": "redshiftdb",
    },
    redshift_tmp_dir="s3://s3path",
    transformation_ctx="datasink4",
)
Currently, the way we do it is:
- Get the SQL from an S3 file and pass it into pandas.read_sql_athena()
- Use SQLAlchemy to execute the preactions SQL (in our case, a delete before load)
- Use SQLAlchemy and pandas.to_sql() to append the DataFrame into the Aurora table
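The workaround above can be sketched with SQLAlchemy and pandas. This is a minimal illustration, not the exact production code: an in-memory SQLite engine stands in for Aurora, the table name `target_table` and the preaction SQL are hypothetical, and the DataFrame is hard-coded where the real flow would call pandas.read_sql_athena():

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Engine for the target database. An Aurora MySQL URL would look like
# "mysql+pymysql://user:pass@host/db"; SQLite is used here for illustration.
engine = create_engine("sqlite://")

# Stand-in for the DataFrame that pandas.read_sql_athena() would return
df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})

# Preaction step: delete before load (table name is hypothetical)
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE IF NOT EXISTS target_table (id INTEGER, value TEXT)"))
    conn.execute(text("DELETE FROM target_table"))

# Append the DataFrame into the target table
df.to_sql("target_table", engine, if_exists="append", index=False)

result = pd.read_sql("SELECT COUNT(*) AS n FROM target_table", engine)
```

Running the preaction and the append in separate steps is the part a from_jdbc_conf-style "preactions" option would fold into a single call.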