
Out of memory when uploading large files exceeding 1GB using --data-binary @ #14521

Closed
timyuer opened this issue Aug 13, 2024 · 1 comment

timyuer commented Aug 13, 2024

I did this

When I upload a large file to HDFS, curl reports an out-of-memory error. The file is 1.1 GB in size.

curl -sS -L -w '%{http_code}' -X PUT --data-binary '@/root/hbase.tar.gz' -H 'Content-Type: application/octet-stream' 'http://192.168.8.101:50070/webhdfs/v1/hdp/apps/3.3.2.0-008/hbase/hbase.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'

When I modify the following code

#define MAX_FILE2MEMORY (1024*1024*1024) /* big enough ? */

to

#define MAX_FILE2MEMORY (1024*1024*1024*2) /* big enough ? */

the upload runs successfully after recompilation.
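For context, --data-binary @file makes the curl tool read the whole file into memory before sending, and MAX_FILE2MEMORY caps how much it will read that way. Below is a minimal sketch, using a hypothetical helper name, of what such a capped read-into-memory step looks like; it is an illustration of the mechanism, not curl's actual source:

#include <stdio.h>
#include <stdlib.h>

#define MAX_FILE2MEMORY (1024*1024*1024) /* 1 GiB cap */

/* Hypothetical helper: read a whole file into a malloc'd buffer,
 * refusing anything larger than MAX_FILE2MEMORY. */
static char *load_file_capped(const char *path, size_t *out_len)
{
  FILE *f = fopen(path, "rb");
  char *buf = NULL;
  long size;

  *out_len = 0;
  if(!f)
    return NULL;
  if(fseek(f, 0, SEEK_END) || (size = ftell(f)) < 0 ||
     (size_t)size > MAX_FILE2MEMORY) {
    /* a 1.1 GB file is rejected here because it exceeds the cap */
    fclose(f);
    return NULL;
  }
  rewind(f);
  buf = malloc((size_t)size);
  if(buf)
    *out_len = fread(buf, 1, (size_t)size, f);
  fclose(f);
  return buf;
}

Raising the cap works around the limit, but the entire payload still has to fit in memory, which is why the streaming approach suggested in the reply below scales to arbitrarily large files.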

I expected the following

I expect to be able to upload files larger than 1 GB.

curl/libcurl version

curl 7.79.1. According to my testing, this issue has been present since version 7.73.0.

operating system

openEuler-22.03-LTS-SP4

bagder (Member) commented Aug 13, 2024

Or you upload it in a streaming fashion instead of loading the entire thing into memory before sending; then you can upload however large a file you want:


curl -sS -L -w '%{http_code}' -T '/root/hbase.tar.gz' -H 'Content-Type: application/octet-stream' 'http://192.168.8.101:50070/webhdfs/v1/hdp/apps/3.3.2.0-008/hbase/hbase.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'
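For completeness, the same streaming behaviour is available from libcurl via CURLOPT_UPLOAD and a read callback, so the file is read from disk in chunks rather than loaded into memory first. A minimal sketch, reusing the URL and file path from the command above, with error handling trimmed:

#include <stdio.h>
#include <curl/curl.h>

/* libcurl calls this repeatedly, so the upload is streamed from disk
 * instead of being buffered in memory. */
static size_t read_cb(char *buffer, size_t size, size_t nitems, void *userdata)
{
  return fread(buffer, size, nitems, (FILE *)userdata);
}

int main(void)
{
  const char *url =
    "http://192.168.8.101:50070/webhdfs/v1/hdp/apps/3.3.2.0-008/hbase/"
    "hbase.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444";
  FILE *src = fopen("/root/hbase.tar.gz", "rb");
  CURL *curl = curl_easy_init();
  struct curl_slist *hdrs = NULL;
  CURLcode res = CURLE_OK;

  if(src && curl) {
    hdrs = curl_slist_append(hdrs, "Content-Type: application/octet-stream");
    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L); /* like -L */
    curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);         /* PUT upload, like -T */
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
    curl_easy_setopt(curl, CURLOPT_READDATA, src);
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
    res = curl_easy_perform(curl);
  }

  curl_slist_free_all(hdrs);
  curl_easy_cleanup(curl);
  if(src)
    fclose(src);
  return (int)res;
}

Without a known size libcurl sends the request body chunked; if the server requires a Content-Length, also set CURLOPT_INFILESIZE_LARGE to the file's size.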
