Describe the bug
Not sure if this should be a bug or a feature request.
When calling to_parquet with partitions and a custom projection_storage_location_template, the template is ignored and files are always written to the default Hive-style path (.../partition_column=value/...).
How to Reproduce
wr.s3.to_parquet(
    df=data,
    path=s3_folder,  # 's3://bucket/prefix/'
    dataset=True,
    database=athena_db,
    table=table_name,
    partition_cols=['my_column'],
    mode='append',
    glue_table_settings={
        'regular_partitions': False,
    },
    athena_partition_projection_settings={  # currently not honored: the wrangler hardcodes the partition paths
        'projection_storage_location_template': f'{s3_folder}${{my_column}}/',
    },
)
Expected behavior
Files should be written to S3 following the provided template.
For the sample code above, that would be:
's3://bucket/prefix/VALUE/HASH.parquet'
as opposed to:
's3://bucket/prefix/my_column=VALUE/HASH.parquet'
Your project
No response
Screenshots
No response
OS
mac
Python version
3.11.4
AWS SDK for pandas version
3.5.2
Additional context
No response
Hi @bucefalog, the template is only used for Athena partition projection; unfortunately it does not affect how awswrangler lays out the written files. Only Hive-style partition paths are supported at the moment.
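To illustrate the gap: partition projection only changes the path Athena *resolves at query time* from the template, while the writer still emits Hive-style `column=value` prefixes. A minimal sketch of the two layouts, using a hypothetical `render_projection_path` helper (this is not an awswrangler API, just `string.Template` substitution mirroring the `${column}` placeholder syntax):

```python
import string


def render_projection_path(template: str, values: dict) -> str:
    """Fill ${column} placeholders in a
    projection_storage_location_template with concrete partition values."""
    return string.Template(template).substitute(values)


# Path Athena would resolve via partition projection:
projected = render_projection_path(
    "s3://bucket/prefix/${my_column}/", {"my_column": "VALUE"}
)
print(projected)  # s3://bucket/prefix/VALUE/

# Path awswrangler actually writes (Hive-style), regardless of the template:
hive_style = "s3://bucket/prefix/my_column=VALUE/"
print(hive_style)
```

Because the two paths differ, Athena queries against the projected table find no files unless the data is written to the templated location by some other means.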