413 Client Error: Payload Too Large when using upload_folder on a lot of files #918
Comments
CC @SBrandeis
Ah yes, we probably want to chunk client-side in this use case (you're probably hitting the 10MB POST size limit), and also enforce a reasonable total max size regardless of chunking (maybe 100MB). Note that this only applies to non-LFS files, so 100MB is more than reasonable IMO. Also cc @coyotte508 and @Pierrci for visibility
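A minimal sketch of what such a client-side total-size guard could look like (the helper name and the 100MB threshold are illustrative assumptions taken from this comment, not huggingface_hub API):

```python
from pathlib import Path

MAX_TOTAL_MB = 100  # assumed threshold from the discussion above

def total_size_mb(folder: str) -> float:
    """Sum the sizes of all regular files under `folder`, in megabytes."""
    return sum(
        p.stat().st_size for p in Path(folder).rglob("*") if p.is_file()
    ) / 1e6

# Hypothetical pre-upload check:
# if total_size_mb("my_dataset/") > MAX_TOTAL_MB:
#     raise ValueError("folder exceeds the upload size limit")
```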
I think the limit is already 100MB on the hub side. But since the Python library is sending in base64 (to be able to send files with non-UTF-8 characters), it's closer to 70-75MB max. What's the size of the files @nateraw ?
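The base64 overhead behind that estimate is the standard 4/3 inflation, which a quick check confirms:

```python
import base64

# Base64 encodes each 3 raw bytes as 4 ASCII characters, so the
# encoded payload is ~4/3 the size of the original data.
raw = bytes(90)
encoded = base64.b64encode(raw)
print(len(raw), len(encoded))  # 90 120

# Hence a 100MB request body limit leaves room for roughly
# 100 * 3/4 = 75MB of raw file data, matching the 70-75MB estimate.
print(100 * 3 / 4)  # 75.0
```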
@coyotte508 the 413 is thrown during the
Oh you should only send the first 512 bytes of data in the preupload call @SBrandeis |
In the web code:

```ts
const res = await fetch(apiUrl("preupload"), {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    files: [
      {
        size: selectedFile.size,
        // Base64 conversion of the first 512 bytes
        sample: await blobToBase64(selectedFile.slice(0, 512)),
        path: [path, encodeURIComponent(selectedFile.name)]
          .filter(Boolean)
          .join("/"),
      },
    ],
  } as PreuploadRequest),
});
```
Yes, we do that already:

```python
payload = {
    "files": [
        {
            "path": op.path_in_repo,
            "sample": base64.b64encode(op._upload_info().sample).decode("ascii"),
            "size": op._upload_info().size,
            "sha": op._upload_info().sha256.hex(),
        }
        for op in additions
    ]
}
```
Then it's likely that there are so many files that the 250kB limit is exceeded by the preupload call alone. Either the hub library should batch the preupload calls (in chunks of 250 files, for example), or we should allow a bigger body on the hub side
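A sketch of the batching idea (the `chunked` helper and batch size of 250 are illustrative, not part of huggingface_hub):

```python
def chunked(items, chunk_size=250):
    """Yield successive fixed-size batches from a list."""
    for i in range(0, len(items), chunk_size):
        yield items[i : i + chunk_size]

# Hypothetical usage: send one preupload request per batch of 250
# files, so each request body stays under the server-side size limit.
additions = list(range(959))  # stand-in for 959 file operations
batches = list(chunked(additions))
print([len(b) for b in batches])  # [250, 250, 250, 209]
```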
@SBrandeis I am having the same error when trying to upload a folder of files with a total size of 350MB. What is the proposed approach for such a case?
@fcakyon the issue should be mostly fixed with recent versions of the hub library. Are you on the latest version?
I am at the latest release
ok 🤔 Feel free to share more details about the error, e.g. the number of files in the folder, the request ID if present, or the detailed error message
I am trying to upload a folder of 959 .mp4 video files with a total size of 297MB using This is the error traceback:

```
2022-11-12 19:48:36.535 Uncaught app exception
Traceback (most recent call last):
  File "...\lib\site-packages\huggingface_hub\utils\_errors.py", line 213, in hf_raise_for_status
    response.raise_for_status()
  File "...\lib\site-packages\requests\models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 413 Client Error: Payload Too Large for url: https://huggingface.co/api/datasets/.../.../commit/main

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "...\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 559, in _run_script
    self._session_state.on_script_will_rerun(rerun_data.widget_states)
  File "...\lib\site-packages\streamlit\runtime\state\safe_session_state.py", line 72, in on_script_will_rerun
    self._state.on_script_will_rerun(latest_widget_states)
  File "...\lib\site-packages\streamlit\runtime\state\session_state.py", line 542, in on_script_will_rerun
    self._call_callbacks()
  File "...\lib\site-packages\streamlit\runtime\state\session_state.py", line 555, in _call_callbacks
    self._new_widget_state.call_callback(wid)
  File "...\lib\site-packages\streamlit\runtime\state\session_state.py", line 277, in call_callback
    callback(*args, **kwargs)
  File "...\st_utils.py", line 66, in st_upload_folder_to_repo
    upload_url = upload_folder_to_repo(**kwargs)
  File "...\hf_utils.py", line 49, in upload_folder_to_repo
    for folder_path in folder_paths:
  File "...\lib\site-packages\huggingface_hub\utils\_validators.py", line 94, in _inner_fn
    return fn(*args, **kwargs)
  File "...\lib\site-packages\huggingface_hub\hf_api.py", line 2384, in upload_folder
    commit_info = self.create_commit(
  File "...\lib\site-packages\huggingface_hub\utils\_validators.py", line 94, in _inner_fn
    return fn(*args, **kwargs)
  File "...\lib\site-packages\huggingface_hub\hf_api.py", line 2074, in create_commit
    hf_raise_for_status(commit_resp, endpoint_name="commit")
  File "...\lib\site-packages\huggingface_hub\utils\_errors.py", line 254, in hf_raise_for_status
    raise HfHubHTTPError(str(HTTPError), response=response) from e
huggingface_hub.utils._errors.HfHubHTTPError: <class 'requests.exceptions.HTTPError'> (Request ID: IEd7hJkhk5rcdQ777Idq3)
request entity too large
```
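A rough back-of-the-envelope estimate (the per-entry field sizes are assumptions based on the payload shape shown earlier) suggests why 959 files can blow past a 250kB body limit in a single preupload call:

```python
# Each preupload entry carries (approximate, assumed sizes):
#   base64 sample of the first 512 bytes: 512 * 4/3 ≈ 683 chars
#   sha256 hex digest: 64 chars
#   path, size, JSON keys and punctuation: ~90 chars
per_entry = 683 + 64 + 90
total_kb = 959 * per_entry / 1024
print(round(total_kb))  # 784 -> roughly 784 kB, far above a 250 kB limit
```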
@coyotte508 after updating to Do you have any ETA on releasing 0.11.0? It has been more than a month since the last huggingface-hub release.
It should be soon!! cc @Wauplin
@nateraw thanks a lot, the snippet is very clear and simple!
@fcakyon or use
@julien-c thank you, good to know there is a pre-release version available!
I'm closing this issue. Feel free to reopen it or open a new one if you still encounter a problem.
@Wauplin amazing news!
Describe the bug
When trying to commit a folder with many CSV files, I got the following error:

```
HTTPError: 413 Client Error: Payload Too Large for url: https://huggingface.co/api/datasets/nateraw/test-upload-folder-bug/preupload/main
```
I assume there is a limit to the total payload size when uploading a folder, and that I am going over it here. I confirmed it has nothing to do with the number of files, but rather the total size of the files being uploaded. It would be great in the short term if we could clearly document what this limit is in the `upload_folder` fn.

Reproduction
The following fails on the last line. I wrote it so you can run it yourself without updating the repo ID or anything...so if you're logged in, the below should work (assuming you have torchvision installed).
Logs