S3: Should put_object automatically split files larger than 5GB? #1123
Comments
I love you, I love this tool, you make my life easy.
Guys, so according to you, the multipart upload is driven by boto3 automatically? An upload ID isn't required to gather the uploaded parts of the file in AWS S3? @jamesls I was following the documentation, and it seems like the multipart API in Python is only used for Glacier? @kopertop
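To the question above: `put_object` maps one-to-one onto the PutObject API call and never splits, but boto3's managed transfer method `upload_file` switches to multipart automatically above a size threshold and handles the upload ID and part bookkeeping itself. A minimal sketch, assuming boto3 is installed and credentials are configured; the bucket, key, and path names are placeholders:

```python
def part_count(total_size, chunk_size):
    """How many multipart parts an object of total_size bytes needs."""
    return -(-total_size // chunk_size)  # ceiling division

def upload_large_file(path, bucket, key):
    """Sketch: upload via boto3's managed transfer API, which performs a
    multipart upload automatically once the file exceeds
    multipart_threshold. No manual upload-ID handling is required."""
    import boto3  # imported here so part_count above stays dependency-free
    from boto3.s3.transfer import TransferConfig

    config = TransferConfig(
        multipart_threshold=8 * 1024 * 1024,   # multipart kicks in above 8 MB
        multipart_chunksize=64 * 1024 * 1024,  # size of each uploaded part
    )
    boto3.client("s3").upload_file(path, bucket, key, Config=config)
```

With a 64 MB chunk size, a 5 GiB file becomes `part_count(5 * 1024**3, 64 * 1024**2)` = 80 parts, each well under the per-request PutObject limit.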
`ClientError: An error occurred (EntityTooLarge) when calling the PutObject operation: Your proposed upload exceeds the maximum allowed size` — I am trying to download the COCO dataset to an S3 bucket and I get this error. How do I solve it? The code I've written:

```python
%%time
role = get_execution_role()
bucket = 'masterthesisvyvian'  # customize to your bucket
containers = {'us-west-2': '433757028032.dkr.us-west-2.amazonaws.com/image-classification:latest'}
training_image = containers[boto3.Session().region_name]
print(training_image)

import os

def download(url):
    ...

def upload_to_s3(channel, file):
    ...

download('http://images.cocodataset.org/zips/train2017.zip')
```
S3 doesn't allow you to PUT files larger than 5 GB in a single request. However, boto3 will allow me to run something like:
After a minute or so it throws:
botocore.exceptions.ClientError: An error occurred (EntityTooLarge) when calling the PutObject operation: Your proposed upload exceeds the maximum allowed size
I must use `split` in order to upload my large files! Should boto3 handle this for the user?
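What `split` plus separate uploads does by hand can also be sketched with S3's low-level multipart API (`create_multipart_upload` / `upload_part` / `complete_multipart_upload`), which is the mechanism the managed transfer layer uses internally. The function name and chunk size below are illustrative only, and note that every part except the last must be at least 5 MB:

```python
def iter_chunks(fileobj, chunk_size):
    """Yield successive chunks of a file, much like `split` would produce."""
    while True:
        data = fileobj.read(chunk_size)
        if not data:
            break
        yield data

def multipart_upload(path, bucket, key, chunk_size=64 * 1024 * 1024):
    """Sketch of a manual multipart upload with boto3's low-level client."""
    import boto3
    s3 = boto3.client("s3")
    # Start the upload; S3 returns the UploadId that ties the parts together.
    mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
    parts = []
    with open(path, "rb") as f:
        for number, chunk in enumerate(iter_chunks(f, chunk_size), start=1):
            resp = s3.upload_part(
                Bucket=bucket, Key=key, PartNumber=number,
                UploadId=mpu["UploadId"], Body=chunk,
            )
            parts.append({"PartNumber": number, "ETag": resp["ETag"]})
    # Tell S3 to assemble the parts into one object.
    s3.complete_multipart_upload(
        Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
        MultipartUpload={"Parts": parts},
    )
```

Since this is exactly the bookkeeping `upload_file` already automates, the practical answer is to use the managed transfer method rather than `put_object` for anything large.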