Unpack and par-check failures on Unraid/Docker #558

Closed
woble opened this issue Jun 20, 2018 · 57 comments

@woble commented Jun 20, 2018

It seems that nzbget recursively unpacks archives that fail with a CRC Failed error. I double-checked using 7z to confirm the CRC error. I've been having this issue for quite a while now with some archives, both prior to and in v20. Because of that the pp-queue piles up, regardless of the configured post-processing strategy.

To reproduce you would obviously need an archive that returns a CRC Failed error.

Expected behavior - download should be marked as bad.

Running on Unraid as the binhex-nzbget Docker container.

@hugbug (Member) commented Jun 20, 2018

If possible please send me the nzb to nzbget@gmail.com.

@woble (Author) commented Jun 21, 2018

I normally remove these downloads as soon as I notice them, so the clogged-up queue doesn't confuse other apps. I will keep an eye on it and will send one when I experience it again, although I think it might also be related to how fresh the NZBs are and their propagation across the various Usenet servers. Downloading the same NZB at a later point might result in normal behavior.

@hugbug (Member) commented Jun 21, 2018

I've tried to reproduce the issue with a manually constructed corrupted download. The download completed correctly with failure status.

Maybe you at least have the log file?

Do you remember if these were rar- or 7z-archives?

@woble (Author) commented Jun 22, 2018

I suspect it's somehow related to nzbToMedia. I've removed those scripts for now and will report back once I've tested it for a day or so. As I said earlier, it seems to happen with fresh NZBs (usually in the morning in my case) that fail because they haven't fully propagated yet.

@hugbug (Member) commented Jun 22, 2018

that fail because they haven't fully propagated yet.

Downloads with missing articles are usually not unpacked at all if they have par2-files. CRC errors may happen in two cases:

  • the download doesn't have a par2-file, so nzbget was not able to determine file correctness; in that case an unpack is attempted anyway;
  • the archive files were damaged before the par2-files were created. This is possible but should be a rare case, as such posts fail for everyone and the poster should learn about it pretty soon.

If the issue happens again please save the log-file of the download before deleting the download.

hugbug added a commit that referenced this issue Jun 22, 2018
@woble (Author) commented Jun 22, 2018

Happened again on a fairly fresh nzb. Emailed both nzb and log. Tried downloading again, it failed correctly. Very odd.

Edit: downloaded again with success.

@hugbug (Member) commented Jun 22, 2018

The log doesn't show a recursive unpack (if by recursive you mean the unpack going on and on).

First it tried a direct unpack during downloading. That failed. Then it tried to unpack after downloading, and this is where the log ends. Interesting fact: in both unpack attempts there is no output from the unrar process, as if the process could not start. This is suspicious.

Can you please send me the global nzbget log? It contains more info which may be useful.

hugbug added the support label Jun 22, 2018

@woble (Author) commented Jun 23, 2018

(if by recursive you mean the unpack going on and on).

Yeah, when I observe the file being unpacked inside the respective folder, I see its file size increase to what looks like its full size; presumably it fails silently and starts again from 0 bytes. Hence why it seems recursive.

@woble (Author) commented Jun 26, 2018

I keep experiencing the issue even with archives that seem to be OK, but not all the time. There's nothing in the log other than that it's unpacking and where it's unpacking. When I look inside the _unpack folder I see the file growing in size, then starting from 0 again. Extracting it manually using the 7z CLI works just fine.

Can't confirm 100% yet, but restarting the docker seems to solve the problem temporarily, until the next time it gets stuck on some NZB for whatever reason.

Any chance it could somehow be related to DirectUnpack? Or perhaps the unrar process?

@hugbug (Member) commented Jun 26, 2018

In the log you sent me there are exactly two unpack attempts: one during direct unpack, where unrar reports a failure and direct unpack is cancelled; later, when everything is downloaded, another unpack attempt is made during post-processing. That attempt also fails with unrar reporting an error. No further attempts are made. That's not a recursive unpack or an unpack loop, but rather normal behaviour.

Do you have downloads for which more unpack attempts are made? If so can you please send me their logs?

hugbug changed the title from "Recursive unpacking on CRC error" to "Unpack and par-check failures on Unraid/Docker" Jun 28, 2018

@hugbug (Member) commented Jun 28, 2018

The logs exhibit very similar behaviour to what another user reported on the forum in the topic "I'm getting numerous PAR: FAILURE errors":

  • no errors are reported during downloading;
  • unrar fails to start properly (it hangs), or unrar reports a checksum error;
  • the first par-check reports the files are OK and no repair is necessary; that first par-check is made in quick mode (that's OK) and relies on file checksums calculated during downloading; the successful par-check confirms the files were downloaded correctly;
  • another unpack is made (after the par-check), which fails again;
  • a second par-check is performed, this time in full mode, where all downloaded files are read from disk and their checksums are calculated again. This par-check reports errors in multiple files. That means the downloaded files were not properly written to disk; otherwise their checksums would have failed already in the first, quick par-check;
  • par-repair is performed: at the end of the repair the repaired files are checked and they don't have correct checksums; the par-repair fails with the message "repair completed but the data files still appear to be damaged". Once again that means the repaired files computed during repair were not written properly to disk, as subsequently reading the files from disk and recomputing their checksums fails.
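As a side note, the same "full" check can be reproduced by hand to tell write corruption apart from download problems. The sketch below is only illustrative: it assumes par2cmdline is available in the container, and the directory and par2 file name are placeholders.

```
# Placeholder path of the stuck download's intermediate files
cd /path/to/intermediate/stuck-download
# Full verification: reads every data file back from disk and recomputes checksums
par2 verify example.par2
# Damaged files here, despite a clean quick par-check (which reuses CRCs from
# the download phase), point at corruption on write or read, not at the download.
```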

What do @woble and ChapeL (the forum user) have in common? It's Unraid with Docker.

  • What media is used for InterDir: is this a software RAID? Can you put InterDir on a simple non-RAID disk (formatted as EXT4, which doesn't use any fancy Unraid drivers or software)?
  • Since when has the issue been happening? Were there updates to the Unraid software shortly before that?
  • Is there any way to check the disks for errors?
@woble (Author) commented Jun 29, 2018

What media is used for InterDir: is this a software RAID? Can you put InterDir on a simple non-RAID disk (formatted as EXT4, which doesn't use any fancy Unraid drivers or software)?

6 equally sized and branded SSDs in btrfs RAID10.

Since when has the issue been happening? Were there updates to the Unraid software shortly before that?

Difficult to tell, but it could be around unRAID 6.5.x or maybe even 6.4.x.

Is there any way to check the disks for errors?

A btrfs scrub can be done, but it's unlikely to be related to that. Will run a scrub to see if there are any corruptions. Edit: ran the scrub, it did not report any errors.

I had the issue again this morning (it seems to be every morning or early in the day; at least, that's when nzbget is doing most of the work). With the queue stuck behind one NZB that wasn't unpacking correctly, I restarted nzbget and, lo and behold, everything unpacked as expected. So could it be that the unrar process somehow hangs or gets stuck in a loop and doesn't communicate properly back to nzbget about it?

@hugbug (Member) commented Jul 1, 2018

So could it be that the unrar process somehow hangs or gets stuck in a loop

Yes, unrar does get stuck sometimes (according to the log), but I don't know whether that's an unrar issue or a system issue (unrar can't read the files).

You are using unrar 5.60 beta, although the nzbget installer package includes unrar 5.50. It seems the docker maintainer put a separate unrar into the image instead of using the unrar included with nzbget.

For a test, change UnrarCmd to ${AppDir}/unrar (the default value).
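
For reference, that is a single line in nzbget.conf (the same option also appears in the unpack section of the web UI settings):

```
# Use the unrar binary shipped with nzbget instead of the one from the image
UnrarCmd=${AppDir}/unrar
```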

To rule out any docker issues I suggest installing nzbget without docker, using the installer from the nzbget home page (follow the installation manual linked on the download page). When installed this way, all files go into one (application) directory, which can easily be deleted later. It will not affect the system in any way.

@woble (Author) commented Jul 1, 2018

I've switched to @linuxserver docker for now to see if it experiences the same issue. I am not sure installing nzbget directly on unraid is as straightforward as expected. Alternatively running it inside a VM could be an option.

Perhaps @binhex can be of some help in regards to unrar version in his docker.

@binhex commented Jul 2, 2018

For Arch Linux, unrar is not packaged as part of nzbget, it's an optional package; however, I did bundle unrar with my base image because a lot of applications require it, and it's true that the version installed is a beta (Arch Linux targets bleeding-edge versions). So I have now pinned a specific version of unrar to be installed, which should force this up to 5.6 stable. @woble, not sure if you are up to testing this again; I understand if you have moved over to LSIO and don't want the hassle of moving back, but I would love the feedback to at least know it's now fixed (image just built).

@woble (Author) commented Jul 2, 2018

@binhex, sure thing. I still have the docker just not running it. I can switch back to your docker for a few days to test whatever needs testing.

Edit: switched back to the latest binhex-nzbget which uses unrar 5.6 stable. Will report back as soon as possible.

@woble (Author) commented Jul 3, 2018

@binhex
Experienced the same issue again. Restarting the docker made unpacking of the stuck item work, however.

Also, last time this happened I checked the processes in the docker and the unrar process was running. I guess something makes it not communicate with nzbget, or perhaps the process itself is stuck on something. It could be an unrar issue, unless we can test 5.6 in another docker to rule that out.

@binhex commented Jul 3, 2018

OK, keep me updated. If you see it in a stuck condition again then please post here; I can roll this back to 5.5 stable if required. Sounds a little bit suspect though, tbh, I wouldn't have thought unrar 5.6 stable would be so flaky, hmmm.

@hugbug (Member) commented Jul 3, 2018

I repeat the important part:

  • no errors are reported during downloading and quick par-check (which doesn't read files but uses CRCs calculated during downloading) reports files are OK;
  • an unpack attempt is made;
  • a full par-check (which reads files) reports errors.

That means:

  • either the files are not written properly into disk;
  • or unrar corrupts the files.

I think it's very unlikely that unrar (even beta) can have such a fatal bug.

It would be really best to install nzbget directly and test for a while to rule out possible issue with docker system.

Also (even if you continue using the docker version) try this:

  • set option ParCheck=always;
  • set option ParQuick=false.

With these settings the full par-check will be performed before the unpack attempt and we will see if the files were OK before the unpack. NOTE: These settings slow down post-processing and are not recommended for normal use.
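
For reference, these are the two diagnostic lines in nzbget.conf, with exactly the values suggested above (revert them once the testing is done):

```
# Force a full verification before any unpack attempt (diagnostic only)
ParCheck=always
ParQuick=false
```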

When the issue occurs again please send me the nzb-log and the full log.

@woble (Author) commented Jul 3, 2018

either the files are not written properly into disk;

Files appear to be written just fine because if I do a manual unpack using 7z (haven't tried unrar) via command line, the file unpacks just fine.

or unrar corrupts the files.

As I mentioned before, it works the moment I restart the docker; nzbget picks up where it left off, which is unpacking, except this time it works as expected.

Will try the settings you mentioned and report back as soon as I can.

It would be really best to install nzbget directly and test for a while to rule out possible issue with docker system.

Will try that as the last thing. Otherwise linuxserver docker seemed to have worked just fine.

@hugbug (Member) commented Jul 3, 2018

Files appear to be written just fine because if I do a manual unpack using 7z (haven't tried unrar) via command line, the file unpacks just fine.

A full par-check is made during the process. If there are enough par2-files, the files are repaired during the full par-check. Unpacking with 7z after that doesn't mean the files were OK at the first stage. The files must have been damaged, since the full par-check reported errors in multiple files. There is a chance that nzbget (and unrar too) can't read the files properly (and therefore unrar fails and nzbget's full par-check reports errors) but 7z can.

Do you run 7z within the same docker container?

@woble (Author) commented Jul 3, 2018

7z is installed directly inside unraid.

@woble (Author) commented Jul 4, 2018

LOG emailed.

  • SSHed into the docker (the binhex one).
  • Ran top to see the processes; unrar (5.6) runs for several minutes while unpacking, then restarts, presumably when done unpacking.
  • Tried using kill to stop the unrar process; it restarts immediately and keeps running in a loop.
  • Used unrar x inside the docker to unpack; it unpacked successfully inside the docker.

Will set up nzbget directly on unraid and run that to see how it performs in the same configuration.

@binhex commented Jul 4, 2018

@woble thanks for doing more digging into the issue. FYI, I will be creating a test-tagged docker image later on today that will include unrar 5.5.0; if that seems to 'fix' the issue then perhaps we have an incompatibility in the interaction between nzbget and unrar 5.6.0. Let's find out more later on.

@hugbug I'm sure you're a busy person, but any chance you could test the latest stable nzbget with unrar 5.6.0? Preferably running on Linux using mono to closely simulate the problem? It doesn't have to be docker, as I don't believe the issue is related to docker.

@woble (Author) commented Jul 4, 2018

@binhex
Let me know when you have it and I will test it.

@hugbug
Had some problems running nzbget outside docker, mainly permission issues between nzbget and other tools running inside docker. I think running it as nobody:users would solve it, but I haven't had enough time to test it yet.

@binhex commented Jul 4, 2018

@woble it's there now. Add the tag named 'test' to the end of the repository name in the Unraid web UI for the docker container config. If you are unsure how to do this, look here, Q12:

https://lime-technology.com/forums/topic/44108-support-binhex-general/?tab=comments#comment-433612
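
For anyone doing the same from a shell rather than the Unraid UI, pulling the test-tagged image explicitly would look roughly like this (assuming the repository is binhex/arch-nzbget, as quoted later in the thread):

```
# Pull the 'test'-tagged image instead of 'latest'
docker pull binhex/arch-nzbget:test
```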

@binhex commented Jul 5, 2018

@woble how are you getting on with the test image? Is unpacking more reliable now?

hugbug added a commit that referenced this issue Jul 6, 2018
@woble (Author) commented Jul 7, 2018

So I am back on the linuxserver.io docker and so far I have had no issues other than the occasional genuinely bad download. @binhex, perhaps the bleeding-edge nature of Arch somehow conflicts with the rest of the configuration (both hardware and software)? 😕 If you need something tested, let me know; I can switch between your docker and linuxserver fairly easily.

@hugbug, feel free to close the issue as I feel it's not really related to nzbget directly. We can still continue the discussion here if that's alright.

@PeteBa commented Jul 7, 2018

@binhex, same symptoms as @woble, and using your latest docker image.

The web UI shows the unpack has hung, netdata shows the docker container continuing to use 40% CPU, and in the terminal I can see the files being written but then reverting to zero length and cycling again. A restart of the docker will complete the unpack fine.

Currently on unRAID 6.5.3. I upgraded from 6.5.1 and to the latest docker image when I first noticed this issue. Happy to help try to debug this if useful; let me know the best forum.

@hugbug (Member) commented Jul 7, 2018

@PeteBa

When the issue happens again, please try this: cancel post-processing of the current nzb in the nzbget web UI by clicking on the nzb item in the queue and then using the actions menu (button Actions). The processing of the nzb should cancel and it goes straight to history without further processing (no par-check etc.). Do not restart the docker. After that, try to unpack the files in the terminal using unrar and make two tests: once using the unrar from the docker image and then using the unrar from the nzbget package (next to the nzbget binary). Post the results here.

After that please make another test: set the option UnrarCmd to ${AppDir}/unrar to use the unrar included with the nzbget installer and see how it works.
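
A rough sketch of the two unpack tests from a shell inside the container; every path and file name below is illustrative and depends on where the image actually keeps the binaries and the intermediate files:

```
# Placeholder location of the cancelled download's files
cd /data/intermediate/stuck-download

# Test 1: the unrar that ships with the docker image (path is an assumption)
/usr/bin/unrar x archive.part01.rar

# Test 2: the unrar next to the nzbget binary from the installer (path is an assumption)
/path/to/nzbget-install/unrar x archive.part01.rar
```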

@PeteBa commented Jul 9, 2018

OK. So I had a "stuck at unpacking" download.

I canceled post-processing from the UI. Without restarting the docker, I went into the console:

  • top shows "unrar" process still running
  • find / -name "unrar" -print locates only one "unrar" executable alongside "nzbget"
  • unrar x *.rar successfully unpacked the file with no errors
  • "unrar" is version 5.6.0

As it happens, I have another download still stuck in the unpacking stage, so let me know if there's anything else I can do to help track this down.

@hugbug (Member) commented Jul 10, 2018

I thought the binhex docker image uses the nzbget installer like the linuxserver image does, but now I see that isn't the case. The binhex image uses nzbget from the Arch Linux distribution.

@binhex: I'm not a docker specialist, so maybe you can tell me. All programs inside docker use the host kernel. When you put a program compiled for Arch Linux, which is known to be a bleeding-edge Linux distribution, into a docker container which runs on a Linux host with a much older kernel than Arch, can't this cause trouble?

NOTE: the nzbget installer from the nzbget home page is statically linked and purposely targets a very old Linux kernel (2.6.something). This binary works on most (all?) Linux systems.

@PeteBa
As an experiment, can you test whether nzbget and unrar from the nzbget installer work any better in the same container? For that you can install nzbget via the installer into any directory (inside the container or on the host). Then take the binaries nzbget and unrar from the installation directory and put them into the nzbget docker container.

After that you can delete the installation directory created by the installer. The installer puts all files into this directory and doesn't change any other files on your system. Therefore, after removing the installation directory there will be no traces of nzbget left.
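
A hedged sketch of that experiment; the installer file name, the container name, and the in-container paths below are placeholders to adapt (check the nzbget download page for the current installer name and options):

```
# Unpack the installer into a throw-away directory on the host
sh nzbget-latest-bin-linux.run --destdir /tmp/nzbget-test

# Copy the two binaries into the running nzbget container
docker cp /tmp/nzbget-test/nzbget binhex-nzbget:/usr/bin/nzbget
docker cp /tmp/nzbget-test/unrar  binhex-nzbget:/usr/bin/unrar

# The installer touches nothing outside its own directory, so cleanup is just:
rm -rf /tmp/nzbget-test
```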

@PeteBa commented Jul 13, 2018

So I gave this what I could. I subbed the nzbget version of unrar into the docker and had the same issues. I then swapped in the stock nzbget executable and didn't see the issue repeated, albeit over a limited test period.

I've now gone with the linuxserver version to keep it simple, and over the last day and a half I've had no issues.

@binhex commented Jul 16, 2018

@hugbug, replying to your question:

When you put a program compiled for Arch Linux, which is known to be a bleeding-edge Linux distribution, into a docker container which runs on a Linux host with a much older kernel than Arch, can't this cause trouble?

As far as I know there are seldom links between applications and specific kernel versions. Yes, you are right that Arch Linux does tend to go for the latest stable or, in some cases, even beta releases; however, in practice I have seen (to date) no issues with an older (or in some cases newer) kernel with Arch Linux running inside a docker container (unraid runs the latest stable kernel at the time of release).

But yes, I guess if there were packages that target particular kernel versions then theoretically it is possible that this might cause problems. Keep in mind, though, that docker only runs what you tell it (no systemd), so the number of processes running is very small, thus the potential for any issue is fairly remote (near zero).

I'm going to take another poke around and see if I can find out what the hell is going on. I think for now I'm fairly confident the issue is not related directly to the version of unrar.

@binhex commented Jul 17, 2018

OK, so I've now generated a new docker image targeting the installer, so it no longer compiles from source. @woble @PeteBa, if you have time please pull down the new 'test'-tagged image and let me know how you get on.

@woble (Author) commented Jul 17, 2018

@binhex will try tomorrow.

@PeteBa commented Jul 17, 2018

@binhex , I must be doing something wrong.

From the UnRaid docker tab, I changed the repo to binhex/arch-nzbget:test and applied. Clicked on WebUI but got a 404 Not Found on http://server:6789/#. Nothing obvious in the logs.

From console, I can see the app is running. Looking at /config/nzbget.conf shows port 6789 is configured. Test connection from Sonarr is successful.

Swapping back to the latest tag and the webui comes up no problem.

[update]

Having said all of the above, I have used Sonarr to download 3 series of 10 episodes each without any issues. So the original "stuck at unpacking" seems good so far.

[... and some more ...]

So it looks like the latest tag uses WebDir=/usr/share/webui. But this directory doesn't exist in the test tag build. Editing WebDir=/usr/local/bin/nzbget/webui and up pops the webui. Great.
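
For anyone hitting the same 404, that edit is a single line in /config/nzbget.conf (the path below is the one that worked here for the test-tagged image):

```
# Point the web UI at the installer's webui directory
WebDir=/usr/local/bin/nzbget/webui
```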

Will keep you posted on whether this sorts the unpacking problem.

@woble (Author) commented Jul 18, 2018

Fails as before on my end.

@binhex commented Jul 19, 2018

@woble OK, thanks for checking. I'm out of ideas then; I guess for now you either roll back to 19.0 or use the LSIO version. Maybe I will have more luck with 21.0 whenever that's released.

@hugbug (Member) commented Jul 19, 2018

Fails as before on my end.

How exactly does it fail? It seems two different issues were reported:

  1. CRC errors during unpack. Manual unpack works without errors.
  2. Unpack hangs; no unpack progress is seen in nzbget; and:

    when I observe the file being unpacked inside the respective folder, I see its file size increase to what looks like its full size; presumably it fails silently and starts again from 0 bytes.

Which of these issues do you observe with the new docker which uses the nzbget installer?
Can you please post the nzb-log for that item (via pastebin.com or via email to me)? Thanks.

@PeteBa commented Jul 19, 2018

@hugbug, good challenge to get to the root cause of the problem:

  1. I never experienced the CRC error issues during unpack, so I can't comment on that.
  2. I was getting unpacking hangs every day or two using the latest-tag docker; however,
  • using the latest tag with the stock unrar from the nzbget site still resulted in unrar hangs;
  • using the latest tag with stock unrar & stock nzbget had no problems, albeit for a limited period;
  • using the test tag, which has stock unrar & nzbget, has been running without problems for 2 days now.

[updated]
So after three days, I found the UI showing a hanging unpack using the test-tagged docker. Nothing specific in the logs, but the unpack is frozen. So I guess the problem still exists.

@hugbug (Member) commented Jul 26, 2018

@PeteBa
Can you run a test with nzbget installed on the host (not in docker), using the nzbget installer from the nzbget home page?

BTW, when you edit a post no notifications are sent, so your additions to the post go unnoticed.

@binhex commented Jul 27, 2018

@PeteBa @woble I have refreshed the base docker image and rebased nzbget from it; it's still currently using the nzbget installer to install. Please can you guys give it a whirl and see if things stabilise.

NOTE: this is currently tagged as 'latest', NOT 'test'.

P.S. Personal thanks to @hugbug for continuing to help try to debug this, I appreciate it.

@woble (Author) commented Jul 29, 2018

@binhex
No issues so far, but I'll let it run for a few more days just to be sure.

@binhex commented Jul 29, 2018

@woble ta

@woble (Author) commented Aug 1, 2018

@binhex I have been running into the same issue as before for the past two days. I'll revert to the LSIO docker for now, and if I get the time I will try a fresh install of yours.

@hugbug (Member) commented Jan 11, 2019

Closing this with the hope the issue is resolved. Feel free to reopen or add more posts if that's not the case and if I can somehow help.

hugbug closed this Jan 11, 2019

@CaptainMalarkey commented Feb 27, 2019

Hi all. I came searching for a solution to this issue as it seems to still be around. I'm running unRAID 6.6.6 and the latest version of the binhex docker.

It happens to me out of the blue a couple of times a week: I go to have a look and there is a queue of files at the 'pp-queued' stage stuck behind a file that is 'unpacking'. All it takes to clear it is a simple restart and it is all systems go until the next time. All the files complete successfully after that.

@woble (Author) commented Mar 2, 2019

@CaptainMalarkey Consider switching to the linuxserver.io docker. I haven't had that issue on that one since I switched.

@CaptainMalarkey commented Mar 4, 2019

@woble After reading through the thread here I have switched, to test how the linuxserver.io version goes. I was very happy that my exported settings from the binhex version loaded up and worked perfectly in the ls.io docker; I was not looking forward to setting all that up again!

I mainly posted to let people know that it doesn't seem to be fixed as of February 2019, in case others make their way here looking for information like I did.

@joshstrange commented Mar 6, 2019

I've been dealing with this issue for a very long time and thought I was going crazy. A Reddit post worried me due to a comment about possibly failing drives, so I thought this was all behind me when I bought new SSDs around December.

And... same issue. I finally was able to work around it by adjusting my settings so that one stuck download didn't block the entire queue, but it still spiked my CPU load and would cause all sorts of hangs on my server. I will switch to Linuxserver's docker, as the docker image appears to be the issue, but I never would have guessed that, so THANK YOU ALL for digging until you found a solution.

I especially want to thank the NZBGet team, your response to this issue was top notch. You could have easily brushed away the original issue because of all the special cases (binhex, docker, unraid) but instead you all went above and beyond. I just wanted to say thank you, not for finding the issue and saving me from pulling my hair out (I mean, thank you for that too) but for how you all responded and for making the open source world better for being a part of it. You all rock!

Edit: If you are like me coming from Binhex's dockers you might run into the same issues I did:

Loading configuration failed
Could not load template configuration file nzbget.conf. The file contains descriptions of options and is needed to display settings page. The file comes with NZBGet distribution archive and is usually automatically installed into the right location. This seems not to be a case with your installation though.

Please edit your configuration file (/config/nzbget.conf) in a text editor and set the option ConfigTemplate to point to the template configuration file nzbget.conf.

I had to fix 2 things:

  1. binhex uses /data while linuxserver uses /downloads
  2. binhex uses the source build while linuxserver uses the installer

Problem 1 is as easy as editing your nzbget.conf file and replacing the MainDir line with MainDir=/downloads. Problem 2 can be solved the same way (TL;DR: set the following two keys to these exact values: ConfigTemplate=${AppDir}/webui/nzbget.conf.template and WebDir=${AppDir}/webui).
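
Putting those edits together, the relevant lines in /config/nzbget.conf end up as below (values exactly as quoted above; adjust MainDir if your download path differs):

```
MainDir=/downloads
ConfigTemplate=${AppDir}/webui/nzbget.conf.template
WebDir=${AppDir}/webui
```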

@rpj2 commented Mar 22, 2019

@binhex I'm seeing this same issue. I'm using your sonarr docker also. If sonarr initiates the download, then it continually unpacks. If I go to nzbget and add the nzb file manually, it completes and updates sonarr.

@ku8475 commented Apr 17, 2019

So this was never solved, I take it? Binhex just stopped supporting it and everyone is switching to linuxserver's version instead? I would rather not, but I guess I can. Is there a way to manually update unrar in NZBGet so I can just try to fix it myself?

@binhex commented Apr 18, 2019

I've not stopped supporting it, I just don't know how to fix it, simple as that really. If anybody has any ideas I'm all ears.
