Memory issues on large uploads #2

Closed
etherwvlf opened this issue May 26, 2022 · 14 comments

@etherwvlf

etherwvlf commented May 26, 2022

I am experiencing a strange issue where memory usage becomes monstrous when uploading large files.
It starts out normal, then process memory slowly but steadily grows to excessive amounts.

This, for example, is the result when uploading a 10 GB file, at around 80-90% of upload completion:
[screenshot: process memory usage]

Sometimes the memory load hovers around 8-10 GB, and sometimes it goes all the way up to:
[screenshot: process memory usage]

I am using the config below for testing, but I don't think it is caused by an error in the config, as everything works fine and dandy for small files.

{
	http_port 8888
	admin off
	order upload before file_server
}

:8888 {
	root .

	encode zstd gzip

	file_server /upload/* {
		browse upload.htm
	}

	file_server /* browse

	@mypost method POST
	upload @mypost {
		dest_dir upload
		max_filesize 15G
		response_template upload-resp.txt
	}

	log {
		output file access.log
	}
}

This can be reproduced with anything big that is uploaded continuously for long enough; I have tried files of different sizes and types. It doesn't matter whether I use my JS upload form or simply POST with curl, the result is the same.
After the upload completes, the process holds on to this memory for some time and eventually settles at around 400-900 MB, which also seems rather high to me for an idle process.

@git001
Owner

git001 commented May 27, 2022

Yep, this is because of the ReadAll call; it should be fixed with commit aa4681e.
Please use the latest image if you use the Docker Hub images.
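
For context, here is a minimal sketch (not the module's actual code; the function names are illustrative only) of why a ReadAll-based handler scales memory with the upload size, while streaming with io.Copy keeps it roughly constant:

package sketch

import (
	"io"
	"mime/multipart"
	"os"
)

// saveBuffered reads the whole part into memory first (roughly what a
// ReadAll-based handler does), so a 10 GB upload needs on the order of
// 10 GB of RAM.
func saveBuffered(part *multipart.Part, dst string) error {
	data, err := io.ReadAll(part)
	if err != nil {
		return err
	}
	return os.WriteFile(dst, data, 0o644)
}

// saveStreamed copies the part to disk in small chunks, so memory usage
// stays roughly constant regardless of the upload size.
func saveStreamed(part *multipart.Part, dst string) error {
	f, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer f.Close()
	_, err = io.Copy(f, part)
	return err
}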

@etherwvlf
Author

I don't use Docker, but I will build with xcaddy right away and run some tests.

@etherwvlf
Author

I am testing with the new build, but unfortunately the issue still persists.
This is from uploading a 10 GB file at around 70 percent complete:
[screenshot: process memory usage]

git001 closed this as completed in df18b5e May 27, 2022
@git001
Owner

git001 commented May 27, 2022

Please test the new version 0.4 and set the parameter max_form_buffer to limit the memory usage.

git001 reopened this May 27, 2022
@etherwvlf
Author

etherwvlf commented May 27, 2022

Sorry, I couldn't respond promptly as I had some business to attend to.
It seems this is still an issue:
[screenshot: process memory usage]

However, this time it builds up to around 5-6 GB, stays there for some time, and then memory usage drops to 200-300 MB for the rest of the upload.
I am using max_form_buffer set to 1GB and the same config as in my original post. I have tried different values to no avail.
Maybe it doesn't honor the max_form_buffer parameter?

@git001
Owner

git001 commented May 27, 2022

Well, it looks like deep inside the Go standard library there are some more copy steps:

https://cs.opensource.google/go/go/+/refs/tags/go1.18.2:src/net/http/request.go;l=1299;bpv=0;bpt=1

ParseMultipartForm
  -> ParseForm
    -> copyValues

These parts have no limits as far as I can see.
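
For reference, here is a minimal sketch of the standard-library call path above (the handler and the "file" field name are assumptions, not the module's actual code). The maxMemory argument passed to ParseMultipartForm only bounds how much of the file parts is kept in memory before spilling to temporary files; in Go 1.18 the surrounding parsing and value copying could still allocate beyond that value.

package sketch

import (
	"io"
	"net/http"
	"os"
	"path/filepath"
)

// 100 MB kept in memory; larger file parts spill to temp files.
const maxFormBuffer = 100 << 20

func uploadHandler(w http.ResponseWriter, r *http.Request) {
	// ParseMultipartForm eventually calls ReadForm(maxFormBuffer) and then
	// copies the parsed form values into r.Form (the copying linked above).
	if err := r.ParseMultipartForm(maxFormBuffer); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	file, hdr, err := r.FormFile("file") // "file" is a placeholder field name
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	defer file.Close()

	dst, err := os.Create(filepath.Join("upload", filepath.Base(hdr.Filename)))
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	defer dst.Close()

	// Stream from the (possibly temp-file-backed) part to the destination.
	if _, err := io.Copy(dst, file); err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	w.WriteHeader(http.StatusCreated)
}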

However this time it builds up to around 5-6GB, stays there for some time and then memory usage drops to 200-300MB throughout the rest of the upload process.

With this statement you mean that only when the upload starts is that high amount of memory used and after some time not more?

@etherwvlf
Author

I mean that when the upload starts, memory usage slowly and steadily climbs all the way up to 5-6 GB, holds that level for a time, and then suddenly drops to 200-300 MB, where it stays for the rest of the upload.

@etherwvlf
Author

etherwvlf commented May 27, 2022

I found something interesting:
If I set max_form_buffer to 100MB, memory usage climbs to around 700-800 MB, holds that level for some time, and then drops to 60-70 MB. Could it be that the size of this buffer parameter is interpreted incorrectly?
I will now test with max_form_buffer 50MB to see what happens.

@etherwvlf
Author

etherwvlf commented May 27, 2022

Yep, this must be it. It climbs to:
[screenshot: process memory usage]
And after a while:
[screenshot: process memory usage]
Quite a bit better, I must say.

@git001
Owner

git001 commented May 27, 2022

Well observed, thank you for the detailed information and for using the module 😄
It looks like max_form_buffer is multiplied by 7-8 times.

Please tell me when we can close this issue with a proper outcome.
I will then add the outcome to https://github.com/git001/caddyv2-upload#background-informations

@etherwvlf
Author

As far as I am concerned, it can be closed now.
Thank you for writing this module; it really has been a lifesaver and a much-needed addition to the plugin kingdom.

@git001
Owner

git001 commented May 28, 2022

@etherwvlf I have created https://github.com/git001/golang-multiparttest to verify the upload behavior. It looks like the memory usage is quite normal.
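
A streaming multipart test client along these lines can be used for such verification; here is a minimal sketch under that assumption (this is not the tool's actual code, and the URL and the "myFile" field name are placeholders):

package sketch

import (
	"io"
	"mime/multipart"
	"net/http"
	"os"
	"path/filepath"
)

// postLargeFile streams path as a single multipart part, so the client
// itself never buffers the whole file in memory.
func postLargeFile(url, path string) error {
	pr, pw := io.Pipe()
	mw := multipart.NewWriter(pw)

	go func() {
		defer pw.Close() // runs last: closes the pipe after the final boundary
		defer mw.Close() // runs first: writes the closing multipart boundary
		part, err := mw.CreateFormFile("myFile", filepath.Base(path))
		if err != nil {
			pw.CloseWithError(err)
			return
		}
		f, err := os.Open(path)
		if err != nil {
			pw.CloseWithError(err)
			return
		}
		defer f.Close()
		if _, err := io.Copy(part, f); err != nil {
			pw.CloseWithError(err)
		}
	}()

	resp, err := http.Post(url, mw.FormDataContentType(), pr)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	_, err = io.Copy(io.Discard, resp.Body)
	return err
}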

@etherwvlf
Author

etherwvlf commented May 29, 2022

I haven't used your tool yet, but here are some more tests on my part.
I have set max_form_buffer to 100MB.

When uploading a single 3 GB file: [screenshot]
When uploading 2 x 3 GB files: [screenshot]
When uploading 3 x 3 GB files: [screenshot]
When uploading 4 x 3 GB files: [screenshot]
When uploading 5 x 3 GB files: [screenshot]
When uploading 6 x 3 GB files: [screenshot]

It seems to be directly linked to the number of files uploaded at once, i.e. a multipart upload with multiple parts.
But no matter how high the initial memory load is, after a while it goes back to normal, so all is fine by me:
[screenshot: process memory usage]

@git001
Owner

git001 commented Jun 2, 2024

This issue should be solved by golang/go#58363.
