The request body is too large and exceeds the maximum permissible limit #106
Comments
Can you upgrade to 0.9.9.10 and reproduce the issue?
I tried with version 0.9.9.9 (the one available at this moment). The structure of the folder I would like to upload is the following, with each file around 100 GB:

/data/groups/folder/archive/file1

And the layout I would like on the storage is the following in this case:

Using --strip-components=3:

azure blobxfer parameters [v0.9.9.9]
platform: Linux-2.6.32-431.20.3.el6.x86_64-x86_64-with-redhat-6.5-Carbon
script start time: 2016-02-01 10:03:40
md5: lhD55kDeLW9uh4PQ==
xfer progress: [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 100.00% 793.60 blocks/min
102400.0 MiB transferred, elapsed 1935.4741931 sec. Throughput = 423.255449709 Mbit/sec

It works fine, at least for one file, but the structure on the storage is:

Using --strip-components=4, I obtain the following:

azure blobxfer parameters [v0.9.9.9]
platform: Linux-2.6.32-431.20.3.el6.x86_64-x86_64-with-redhat-6.5-Carbon
script start time: 2016-02-01 09:58:24
RequestBodyTooLarge
The request body is too large and exceeds the maximum permissible limit.
RequestId:c7662108-0001-0111-2ccf-5cb68d000000

Thanks and regards
Unfortunately, I cannot reproduce this error. Can you place this statement:
The output:

detected 0 empty files to upload
RequestBodyTooLarge
The request body is too large and exceeds the maximum permissible limit.
RequestId:3309da30-0001-0000-78be-5dc7c3000000
Thanks for running the script with the modification. According to the new debug lines, the data being sent to the Azure Python Storage SDK is consistent with the maximum allowable block size of 4MB. I have two suggestions:
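The 4 MiB-per-block constraint mentioned above can be illustrated with a minimal sketch. This is not blobxfer's actual code; the function and variable names are illustrative assumptions.

```python
# Minimal sketch (not blobxfer's actual code) of the constraint described
# above: every block sent to the Blob service must be at most 4 MiB.
MAX_BLOCK_SIZE = 4 * 1024 * 1024  # 4194304 bytes

def split_into_blocks(data, chunk_size=MAX_BLOCK_SIZE):
    """Yield consecutive chunks of at most chunk_size bytes."""
    for offset in range(0, len(data), chunk_size):
        yield data[offset:offset + chunk_size]

payload = b"x" * (10 * 1024 * 1024)  # stand-in for a 10 MiB file
blocks = list(split_into_blocks(payload))
print(len(blocks), max(len(b) for b in blocks))  # 3 4194304
```

Any block that exceeds MAX_BLOCK_SIZE would be rejected by the service with exactly the RequestBodyTooLarge error seen in this issue.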
Using the second option, passing --chunksizebytes 4194296 as a parameter works fine. I will also try using a SAS key, but I think that for me the best solution is to change the value of _MAX_BLOB_CHUNK_SIZE_BYTES to 4194296, as you suggested. If this error is related to the SDK, I will also try the next release of the SDK as soon as it is available. Thanks and regards
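The arithmetic behind the workaround can be sketched as follows: shrinking the chunk size 8 bytes below the 4 MiB ceiling barely changes the number of blocks needed for a file of the size reported here (the exact 100 GiB file size is an assumption taken from the log above).

```python
# Sketch of the workaround's arithmetic: 4194296 instead of the default
# 4194304 adds only a single extra block for a 100 GiB file.
import math

FILE_SIZE = 100 * 1024**3   # 100 GiB, per the report above (assumed exact)
DEFAULT_CHUNK = 4194304     # blobxfer default from the parameter dump
REDUCED_CHUNK = 4194296     # value passed via --chunksizebytes

print(math.ceil(FILE_SIZE / DEFAULT_CHUNK))  # 25600 blocks
print(math.ceil(FILE_SIZE / REDUCED_CHUNK))  # 25601 blocks
```

So the workaround costs essentially nothing in overhead while keeping each request safely under the service limit.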
Thanks for working through the issue. If you want, you can raise this issue directly on the Azure Python Storage SDK GitHub repo. I will close this issue.
Using the Python blobxfer, many times and with many large files (but always smaller than 190G), I receive the error "The request body is too large and exceeds the maximum permissible limit".
I can upload some large files, but it is not possible for other files, no matter how many retries or the number of workers.
Also, I have this problem when the local resource to upload is a folder (a folder containing one single file), but if I try to upload the same file specifying the file directly, the error does not appear.
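One possible reading of the "smaller than 190G" ceiling (an assumption on my part, not confirmed anywhere in this thread): a block blob allows at most 50,000 committed blocks, so at a 4 MiB chunk size the largest possible blob works out to roughly that figure.

```python
# Possible explanation (an assumption, not confirmed in this issue) for
# the ~190G ceiling: at most 50,000 committed blocks per block blob,
# at the 4 MiB chunk size used here.
MAX_BLOCKS = 50_000
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB

max_blob_bytes = MAX_BLOCKS * CHUNK_SIZE
print(max_blob_bytes / 1024**3)  # 195.3125 GiB
```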
I am using the following python packages:
pip freeze
azure==1.0.2
azure-common==1.0.0
azure-mgmt==0.20.1
azure-mgmt-common==0.20.0
azure-mgmt-compute==0.20.0
azure-mgmt-network==0.20.1
azure-mgmt-nspkg==1.0.0
azure-mgmt-resource==0.20.1
azure-mgmt-storage==0.20.0
azure-nspkg==1.0.0
azure-servicebus==0.20.1
azure-servicemanagement-legacy==0.20.1
azure-storage==0.20.2
elasticsearch==2.2.0
futures==3.0.3
python-dateutil==2.4.2
requests==2.9.1
six==1.10.0
urllib3==1.14
wheel==0.26.0
And blobxfer.py v0.9.9.5
As an example:
azure blobxfer parameters [v0.9.9.5]
subscription id: None
management cert: None
transfer direction: local->Azure
local resource: archive/
remote resource: None
max num of workers: 4
timeout: None
storage account: ---
use SAS: False
upload as page blob: False
auto vhd->page blob: False
container: ---
blob container URI: https://---.blob.core.windows.net/---
compute file MD5: True
skip on MD5 match: True
chunk size (bytes): 4194304
create container: True
keep mismatched MD5: False
recursive if dir: True
keep root dir on up: False
collate to: disabled
script start time: 2016-01-29 10:00:09
g--.tar.gz md5: lhD55kDeLW9uh4PXtJ7LhQ==
detected 0 empty files to upload
performing 25600 put blocks/blobs and 1 put block lists
xfer progress: [                              ]   0.00%       0.00 blocks/min
RequestBodyTooLarge
The request body is too large and exceeds the maximum permissible limit.
RequestId:a6af7e74-0001-00f6-0474-5ae0d5000000
Time:2016-01-29T09:06:25.4964043Z
ls -lah
-rw-r--r-- 1 root root 100G Jan 26 15:46 g--.tar.gz