upload EntityTooSmall #1075
Comments
Hey @lackray, thanks a lot for the thorough description. I think the solution might be easier than we think. We have an env variable that controls the chunk size for large uploads on the UI: diffgram/shared/settings/settings.py, line 179 in 93e1b4b.
You can set this env variable to a larger size and hopefully this error will be gone.
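A quick way to confirm the value actually reached the running containers (setting it in the host shell alone is not enough) is to check the process environment from inside the container. A minimal sketch, assuming only that the variable name is the one referenced in settings.py; the fallback string is just illustrative:

```python
import os

# Print the chunk-size setting exactly as the Diffgram services would see it.
# If this prints "<not set>", the variable never reached the container environment.
print(os.environ.get("LARGE_API_CHUNK_SIZE", "<not set>"))
```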
It didn't work. I added 'LARGE_API_CHUNK_SIZE=15' to the environment config.
One more question: are you using the UI or the Python SDK for upload? If it's the SDK, can you share the code snippet? I'll take a look and get back to you with a solution.
I'm using the Python SDK for uploading the point cloud file, which is the only way to upload right now. It's the same code snippet as https://diffgram.readme.io/docs/3d-lidar-annotation-guide (upload_3d_file_to_diffgram); I'm using the example code. My Python console just returns 'internal error' for this kind of failure, so I have to check the docker-walrus console output, which I mentioned above.
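For readers who want to reproduce this: the scripts in that guide start from the standard SDK connection, roughly as in the sketch below. The host and credentials are placeholders, and the guide's upload_3d_file_to_diffgram helper itself is not reproduced here, since its body lives in the linked docs.

```python
from diffgram import Project

# Connect to a self-hosted Diffgram instance; all values below are placeholders.
project = Project(
    host="http://localhost:8085",
    project_string_id="your_project_string_id",
    client_id="your_client_id",
    client_secret="your_client_secret",
)

# The guide's upload_3d_file_to_diffgram helper would then be called with this
# project object and the path to the local point cloud file.
```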
Thanks for the info! I think the problem might be with the SDK. The previous version had a chunk size of 5 MB for 3D uploads by default. I have fixed this. Can you try updating the SDK and re-running the upload script?
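The exact snippet from that comment isn't preserved in this thread; assuming the SDK is the diffgram package on PyPI, pulling in the fixed version would typically look like:

```
pip install --upgrade diffgram
```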
Can you share your full script please? That way we can fully replicate the error. It would also help if you could check the final JSON to see whether there is any issue with the JSON being generated by the client.
here is my code:
Hey @lackray, thanks for all the details on this. I've identified a bug in the chunking process for MinIO specifically, so I deployed a new version.
After #1070, I can now upload. During the upload, another exception comes up. This is my console error:
So I go to check the docker-walrus container, and it gives me this:
There is no minimum size limitation for multipart uploads on the S3 API of the MinIO server. How can I change the code inside shared/data_tools_core_s3.py at line 153? @PJEstrada

Here is my temporary solution: I commented out the MultipartUpload-related code inside shared/data_tools_core_s3.py, lines 148-154. When I start to run the upload script, a MinIO server connection error occurs. This is weird, I don't know why. Is this a message queue jam issue? Sometimes when I start the upload script, the message queue doesn't handle it; MQ always does the heartbeat check but just misses the real job!
After several tries, I finally got the upload working. I didn't do anything except restart all the Diffgram Docker services.
Sadly there is always a 'but': here comes an error. Docker-hosted Diffgram view:
I went to check the MinIO buckets page, and there is no uploaded file. Does that mean my file upload failed? I don't know how to edit the code inside shared/data_tools_core_s3.py; it seems that my temporary solution leads to unexpected errors.
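One quick way to check whether the object actually landed in the bucket, independent of the MinIO console, is to list the bucket contents directly. A minimal sketch, reusing the same placeholder MinIO endpoint and credentials as above; the bucket name is hypothetical, since Diffgram's actual bucket name comes from its storage settings:

```python
import boto3

# Placeholder endpoint, credentials, and bucket name; adjust to your deployment.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="minio",
    aws_secret_access_key="minio123",
)

resp = s3.list_objects_v2(Bucket="diffgram-storage", MaxKeys=50)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```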