Best chunk-size dynamics/management method? #23416
Comments
really not seeing the duplicate.
Checklist
Question
As of right now, I have a bash script with http-chunk-size as a variable, so I can just do
script.sh download.url 300k
and if that is too slow, or if I think I can go faster with a larger chunk size, I just terminate the script and retry it with a different chunk size. After a few tries, I settle on a sweet spot.
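For concreteness, this is roughly the shape of that wrapper; it's only a sketch, the script name, argument order, and the 1M default are placeholders, and it just passes the value through to youtube-dl's --http-chunk-size option:

```bash
#!/usr/bin/env bash
# script.sh — pass a URL and a chunk size straight through to youtube-dl
# usage: ./script.sh <url> <chunk-size>   e.g. ./script.sh download.url 300k
url="$1"
chunk_size="${2:-1M}"   # fall back to 1M if no chunk size is given

youtube-dl --http-chunk-size "$chunk_size" "$url"
```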
There is no science to this, and the best chunk size always depends on multiple factors, like which server I am downloading from, the time of day, and so on.
I was thinking of going full-blown Rube Goldberg machine with a script that downloads with one chunk size for a few seconds and quits, logs the time/speed of the small bit it downloaded, stores that in a variable, then does the same thing again with a different size. Once I have test runs with different chunk sizes, I would feed those numbers into an equation and come up with a best guess for the chunk size (roughly along the lines of the sketch at the end of this post).
But I wanted to check here before I build all that. Thanks.
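Roughly what I had in mind, as a sketch only: the candidate chunk sizes, the 20-second probe window, and the trick of using timeout plus the on-disk size of the partial download as a speed measure are all my own assumptions, not anything built into youtube-dl. This simpler variant just keeps whichever chunk size moved the most data during the probe instead of interpolating between two runs:

```bash
#!/usr/bin/env bash
# Probe a few chunk sizes with short partial downloads, then do the real
# download with whichever size moved the most data in the probe window.
url="$1"
probe_seconds=20
best_size=""
best_bytes=0

for chunk in 300k 1M 5M 10M; do
    tmpdir=$(mktemp -d)
    # Let youtube-dl run for a fixed window, then kill it.
    timeout "$probe_seconds" youtube-dl --http-chunk-size "$chunk" \
        -o "$tmpdir/probe.%(ext)s" "$url" >/dev/null 2>&1
    # Measure how much landed on disk (the leftover .part file counts).
    bytes=$(du -sb "$tmpdir" | cut -f1)
    echo "chunk=$chunk downloaded=$bytes bytes in ${probe_seconds}s"
    if [ "$bytes" -gt "$best_bytes" ]; then
        best_bytes=$bytes
        best_size=$chunk
    fi
    rm -rf "$tmpdir"
done

echo "best guess: $best_size"
youtube-dl --http-chunk-size "$best_size" "$url"
```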