Speed drops for large queues? #591
Comments
danjames92 changed the title from "[Feature Request] Speed drops for large queues" to "[Feature Request] Avoid speed drops for large queues?" on Dec 30, 2018
hugbug changed the title from "[Feature Request] Avoid speed drops for large queues?" to "Speed drops for large queues?" on Dec 31, 2018
I'm impressed it works at all :) Just to be sure it's caused by queue size - have you tested with a small queue? You can make a backup of QueueDir when nzbget isn't running.
Haha, new heights here! Yes, I have tested with a small queue. With under 500 items in the queue it seems to fly, with anywhere from 80-100 MB/s achievable.
@hugbug Happy New Year! I did notice that I have both a […]. If I copied one of these elsewhere, would it lower the number of items in the queue, potentially making it faster?
I've made some changes which improved the speed in my tests.
added commits that referenced this issue on Jan 12, 2019
added commits to fedux/nzbget that referenced this issue on Jan 13, 2019
added a commit that referenced this issue on Jan 14, 2019
added a commit to fedux/nzbget that referenced this issue on Jan 15, 2019
hugbug added this to the v21 milestone on Jan 18, 2019
hugbug added the improvement label on Jan 18, 2019
I did tests with a queue consisting of over 6000 items, mainly videos, 39 TB total size. Performance before and after the improvements:

For comparison: speed with a small queue (1 item) is 400 MB/s.

NOTE: the performance tests were made with option SkipWrite=yes, so downloaded data were not written to disk. This helps to find bottlenecks not related to writing of downloaded data.

All changes address saving the queue to disk. The faster the download speed, the more often the queue is saved (after every downloaded rar-file), and the larger the queue, the more data is written on each save. That has been optimised in several ways and brought big improvements.

Real use case: another user also reported decreased download speed with a large queue. He uses a low-power ARM-based device with QueueDir located on a USB stick. With a queue of 1000 items the speed was 0.5 MB/s; his normal speed (small queue) is 7 MB/s. After the improvements the speed is 7-9 MB/s with the large queue.

@danjames92, do you want to test the new version?
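To illustrate the kind of optimisation being described (this is a minimal sketch of the general idea, not nzbget's actual code or file format): if each queue entry is persisted to its own file and only changed entries are rewritten, a save after a completed rar-file costs O(changed entries) instead of O(queue size).

```python
# Sketch (hypothetical format, not nzbget's): persist queue entries as
# individual files and track a dirty set, so a save rewrites only what changed.
import json
import os
import tempfile

class QueueStore:
    """Persist queue entries as individual JSON files; rewrite only dirty ones."""

    def __init__(self, queue_dir):
        self.queue_dir = queue_dir
        self.entries = {}   # entry_id -> entry data
        self.dirty = set()  # ids changed since the last save

    def update(self, entry_id, data):
        self.entries[entry_id] = data
        self.dirty.add(entry_id)

    def save(self):
        """Write only dirty entries; return how many files were rewritten."""
        written = 0
        for entry_id in self.dirty:
            path = os.path.join(self.queue_dir, f"{entry_id}.json")
            with open(path, "w") as f:
                json.dump(self.entries[entry_id], f)
            written += 1
        self.dirty.clear()
        return written

with tempfile.TemporaryDirectory() as d:
    store = QueueStore(d)
    for i in range(6000):  # large queue, saved once up front
        store.update(i, {"name": f"item{i}", "done": False})
    store.save()
    store.update(42, {"name": "item42", "done": True})  # one rar-file finished
    print(store.save())  # only 1 file rewritten, not 6000
```

With a monolithic queue file, the same completed rar-file would force re-serializing all 6000 entries; the per-entry layout makes save cost independent of queue size.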
Sorry for not replying sooner. Thanks for these improvements, sounds great! I'm a bit of a noob when it comes to this sort of testing, but I'll try if it's possible as part of my current setup. I'm currently using NZBGet in the Suitarr docker container, which I think (not 100% sure yet) lets me pass a URL to download nzbget (and the test release?). I'm using Ubuntu 18.04 with an E3-1245 v6.
@danjames92: send me a note at nzbget@gmail.com and I'll provide you with a development version of the nzbget installer for Linux. I don't know about that docker container or how nzbget is packaged there. Many docker containers use the NZBGet installer; if that's the case with yours, I think you can update nzbget by logging into the container terminal and running the new installer there.
@hotio any idea if what hugbug is saying here is possible inside of suitarr?
itouch5000 commented Jan 19, 2019
I think you can use this "Installing a different version" option.
I'm using suitarr from hotio directly, not djzeratul's fork. If hugbug could provide a link like the testing releases, with a script like the example below, I could easily run a test release with the parameter used below.
I am very much a noob in regards to doing any of this, so I might have to leave it to someone else to test if I can't work it out.
Here is the installer if you want to try it (nzbget-21.0-testing-r2277-bin-linux.run): https://drive.google.com/file/d/1PE3OQ_sENXHqgui873FnpdkSsYMQ9Qhh/view?usp=sharing
I don't think you can use this URL in docker directly, as it opens a sharing page where you need to click the download button first. Feel free to upload the file somewhere else where you can obtain a direct download link.
Thanks for that, hugbug. Got it installed; just grabbing some new files to see if I can spot the improvements. Will let you know in the coming days. Thank you!
Assuming it works great now and closing the issue.
danjames92 commented Dec 30, 2018 (edited)
I currently find myself with 16,000 items in my queue. On a 1 Gbit connection I only reach peaks of 30 MB/s (usually 10-15 MB/s sustained), and I believe this is due to my ridiculous queue size.
I already satisfy the disk-speed requirements with 2 x 500 GB NVMe drives in RAID 0, enough to max out my line speed.
Running the latest testing release, 21.0-testing-r2220.
I've seen there were some improvements made in #438
Can any further improvements in this area be made so I don't struggle to get out of this hole without deleting stuff from the queue? :)
PS. thanks for this awesome software, it is a joy to use.
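A rough way to see why a 16,000-item queue could throttle downloads: if the whole queue were naively re-serialized after every completed file, the bytes written per save grow linearly with queue size. This sketch uses a hypothetical entry layout (not nzbget's format) just to show the scaling:

```python
# Back-of-the-envelope sketch with made-up entry fields: bytes written per
# full-queue save grow linearly with the number of queue items.
import json

def queue_bytes(n_items):
    """Size in bytes of a naively serialized queue with n_items entries."""
    queue = [{"id": i, "name": f"file{i:05}.rar", "done": False}
             for i in range(n_items)]
    return len(json.dumps(queue))

small = queue_bytes(500)    # the "fast" queue size reported above
large = queue_bytes(16000)  # the queue size in this issue
# Each completed rar-file triggers a save, so the large queue writes roughly
# 30x more data per save than the small one under this naive scheme.
print(large / small)
```

Under this assumption, every completed file on the 16,000-item queue costs about thirty times the save I/O of the 500-item case, which is consistent with the sustained-speed drop described above and with the save-path optimisations hugbug made.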