Stalled torrent detection #958

Open
ghost opened this Issue Nov 24, 2015 · 19 comments

ghost commented Nov 24, 2015

Hi all,
I understand there is a rule that a torrent has to have at least 1 seed for Sonarr to fetch it.

But I often find there are no seeds at all when I look the torrent up in Transmission.

So I suggest a setting where we can choose how many seeds we want.

And since that is not visible on all trackers, we should also have a rule that if nothing happens in, say, 15-30 minutes, the torrent is deleted and blacklisted and a new one is downloaded.

Thanks in advance and thanks for a great application.

kayone (Member) commented Nov 26, 2015

I think there are two separate issues here:

  1. Add support for configuring a minimum number of seeds. Very reasonable; the question is what the scope should be. I want to say a global setting, but maybe there are reasons to do it per tracker?
  2. Torrent failure cases. When should a torrent be considered failed? Slow speed? Over what period? Etc. This one is more complicated, both from a requirements perspective and in implementation.

ghost commented Nov 27, 2015

I would say global is OK; per tracker is overkill, I guess.

The second one is, I understand, the most complex, but it would also give the most to users.
I would say just a setting where I set X minutes and, if the torrent is not done, delete and block it, because if I set it to 60 minutes and it isn't done, I won't get it at all even if I wait longer. I don't need the episode that second, so I can wait for a while before it terminates the torrent.

Those are my thoughts on the matter. I'm not too picky, I just want some functionality that brings me close to no manual work at all :)
And I would say torrents are the only thing that makes me do manual work right now :)

markus101 (Member) commented Nov 30, 2015

> I would say just a setting where I set X minutes and, if the torrent is not done, delete and block it, because if I set it to 60 minutes and it isn't done, I won't get it at all even if I wait longer.

That might work for a seedbox with very fast download speeds, but a user shouldn't have to guess how long it will take for a download to complete. What if it's a 2160p release of a full season vs. a 480p single episode? Those wouldn't have the same download time.

One consideration is the internet being down: if Sonarr and the client can communicate, but the client can't connect to the internet, Sonarr can't just start blacklisting every incomplete torrent.

There will almost always be some level of manual work required with Sonarr, especially when it comes to complex issues such as this.

markus101 changed the title from "Torrent, remove and blacklist torrent without seeds" to "Stalled torrent detection" Nov 30, 2015

markus101 added the proposal label Nov 30, 2015

ghost commented Nov 30, 2015

Yeah, I see the problems. Just wishing we had something to help make it more automatic :)

tristaoeast commented Mar 1, 2016

Sorry for "re-opening" this thread, but I believe that at least the solution proposed for problem 1 would be a very reasonable way of avoiding stalled torrents. Like @spyvingen said, I also believe per tracker might be a bit of an overkill, but that also depends on how it would impact Sonarr's overall performance and how costly each solution would be to implement.

Regarding issue 2, I believe a better workaround would be the ability to select how Automatic Search prioritizes the results it gets from the trackers. I know someone suggested this in another thread, but they were suggesting making it configurable in a fine-grained manner, which could be too complex and produce undesired results. Instead, I would suggest creating three or four different priority "profiles" for the torrents, similarly to what you have done with qualities, from which the user could select the one best suiting their needs. Right now I'm having a problem with older seasons where, within the same quality (an episode of Marvel's Agent Carter from Season 1 at 720p is an example I had), the torrent with the lowest number of seeds (just one seed, which eventually disappears and the download stalls) is getting selected, while there are other torrents with the same quality, equally acceptable by the profile and with a lot more peers, that are not being downloaded. So one profile suggestion would be, for example, to sort by number of seeds/peers within the same quality, which isn't happening from what I'm seeing when I use Manual Search and order the results according to Auto-Search criteria.

Sorry for the long post, and thanks in advance for your time. Please let me know what you think ;)

markus101 (Member) commented Mar 1, 2016

@tristaoeast that's really more about better prioritization when choosing results, not detecting stalled torrents, and it is something we're working on.

tristaoeast commented Mar 2, 2016

@markus101 I know it's not really about detecting stalled torrents; I was thinking of it more as a workaround. But it's great to hear that you are working on improving prioritization :) Can I help with that in any way? Is there a discussion anywhere on how you are planning to improve it? :)

Keep up the excellent work :D

ntcong commented Apr 4, 2016

Are there any updates on the improvement? I think a globally configured minimum number of seeds is enough, or per indexer. Per tracker is overkill IMHO.

rux616 commented Oct 12, 2016

Something needs to be done about this issue. I am finding that Sonarr is not grabbing episodes because the RSS feed (eztv.ag in this case) has an issue and populates the seeds field with 0s.

At the very least, there should be an option to turn this feature off, with a warning triangle next to it saying something about possibly causing stalled downloads, etc. Additionally, items with 0 seeds should still be last in priority, that is, used only if nothing else is available.

kiwixz commented Mar 22, 2017

Yes, we really need an option to ignore the seeder count!

limetime commented Apr 3, 2017

Hi, just wanted to add some support to this thread.

My thoughts on improving this would be to work with Sonarr's existing way of working rather than to create more on top of it. By this I mean:

  1. Through my profile settings I can already choose both torrents and Usenet; I add a show and search.
  2. If a torrent gets picked up but it's really slow, let me have a parameter that globally forces another search after a certain amount of elapsed time. This would then obey the same rules as normal, in my case preferring NZBs and then torrents as a backup (with more seeds than the currently downloading file). If there isn't anything new to pick up, carry on; if there is something better out there, give it a go. (A rough sketch of what triggering such a re-search could look like is at the end of this comment.)

This would be a good way of making sure the files I'm downloading are the best available, rather than having to manually search over and over again when I notice that a torrent has stalled.

Hope that makes sense and thanks for listening :)
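
As a rough illustration of the forced re-search idea in point 2, here is a minimal external sketch against the Sonarr v2 API. This is not an existing Sonarr feature; the queue field names (trackedDownloadStatus, .episode.id) and the EpisodeSearch command payload are assumptions about the API, and whether Sonarr would actually replace a same-quality grab is a separate question.

#!/bin/bash
# Hypothetical helper: ask Sonarr to search again for every queue item that is
# flagged with a warning (e.g. a stalled torrent). Normal decision rules apply.

apikey=YOUR_API_KEY
host=localhost:8989

# Collect episode ids of queue items whose tracked download is in a Warning state.
ids=$(curl -s "$host/api/queue?apikey=$apikey" \
    | jq -r '.[] | select(.trackedDownloadStatus == "Warning") | .episode.id')

for id in $ids; do
    # Trigger a fresh search for that episode via the command endpoint.
    curl -s -X POST "$host/api/command?apikey=$apikey" \
        -H "Content-Type: application/json" \
        -d "{\"name\": \"EpisodeSearch\", \"episodeIds\": [$id]}"
done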

SonarrBot referenced this issue Jun 11, 2017: Torrent Support - Part 2 #468 (closed, 10 of 18 tasks complete)

bengalih commented Sep 3, 2017

If this is still being considered, I would vote for a speed check or some type of calculation based on the torrent's estimated time remaining vs. download size. This would probably have to be combined with a bandwidth number provided by the user (or computed via an under-the-hood speed test).

So an 8 GB movie on a 100 Mbit connection should take what... 10-12 minutes, assuming all bandwidth is available?
So a tolerance value would need to be provided by the user for how long they are willing to wait. I'm not sure how best to express this value, e.g. it would start at 100% (since it is silly to expect it to finish quicker than my bandwidth allows). So if I specify 1000% (or maybe just call it a tolerance of 10), I would be willing to wait 100-120 minutes for that torrent to download.
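
To make the arithmetic concrete, here is a minimal sketch of that expected-time-plus-tolerance calculation; the numbers and the tolerance knob are just the ones suggested above, not an existing Sonarr setting.

#!/bin/bash
# Rough "how long should this take" estimate, plus a user tolerance multiplier.
size_gb=8           # release size in GB (the example above)
bandwidth_mbit=100  # user-provided line speed in Mbit/s
tolerance=10        # willing to wait 10x the ideal transfer time

# 1 GB is roughly 8000 Mbit in decimal units; ideal time assumes the full line is free.
ideal_seconds=$(( size_gb * 8000 / bandwidth_mbit ))   # 640 s, roughly 10-11 minutes
giveup_seconds=$(( ideal_seconds * tolerance ))        # 6400 s, roughly 107 minutes
echo "ideal: $(( ideal_seconds / 60 )) min, give up after: $(( giveup_seconds / 60 )) min"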

Controlling this could get quite a bit more complicated, and I think there may need to be different profiles as well. For instance, I expect most users simply want their new episodes as quickly as possible, whereas others who want to archive may not care that it takes 2 weeks to download an entire season in 1080p+.

There is a can of worms to open here, because there might be network issues/congestion for many reasons, so this might have to be measured over a range of time. I have two other suggestions that offer similar functionality through a different approach:

  1. Set a global tolerance for downloads of each quality. This can be based on the already-existing profile sizes and a bandwidth figure the user provides (which allows tweaking from both ends). If a download isn't completed within this time (or this time plus a configurable multiplier), start a new download of the next option. Whichever one completes first, cancel/delete the other one.

  2. Add a better visual indicator to the Activity queue based on the data already there. Some sort of exclamation point that clearly says "Hey, this thing is taking too long for what it should be" and allows a one-click option to either cancel/blacklist and go with the next in line, or to continue with the behavior I mention in #1 (start a second download and see which wins).

mccorkled commented Feb 8, 2018

Any news on the implementation of this?

D4rkSl4ve commented Jun 21, 2018

This would be an awesome way to reduce maintenance. For instance, I have to look into my Deluge every several days, as there are many torrents just sitting there at 0 kB or otherwise stalled/unfinished. So I have to log in to the @rr software, go to Activity/Queue, and click the 'X', which removes it from Deluge and quickly searches for a new torrent. It would help if there were a global setting along the lines of: if an item in Activity/Queue is at 0 kB for longer than 'X' time, remove it, blacklist it, and search again; or if a torrent is at 0 download speed for 'X' time, remove it, blacklist it, and search for the next torrent. That would minimize the maintenance of having to check the downloader, whichever it is, since most commonly used clients have APIs that can supply such info back, just as the @rr software reads them to update the Activity/Queue screen.

Pretty please...

CoderKiwi commented Jul 9, 2018

At a bare minimum, it may be nice to have a sort of timeout feature: if the torrent isn't downloaded within, say, a week or a few days of being added, then remove it, blacklist it, and search for another.

That circumvents the "what if Sonarr isn't connected to the internet and it starts blacklisting everything" issue. And it solves the edge case of torrents that are just never completed.
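
A minimal sketch of such a timeout as an external cron script, since nothing like it is built in: it records when it first saw each queue id in a small state file and removes/blacklists anything that has been sitting in the queue longer than the cutoff. The /api/queue calls are the same ones used by the script later in this thread; the state file and its path are made up for the sketch, because the queue API is not assumed to expose an "added" date.

#!/bin/bash
# Remove and blacklist queue items that have been stuck for longer than $max_age.
# First-seen times are kept in a state file ("<id> <epoch>" per line). Run from cron.

apikey=YOUR_API_KEY
host=localhost:8989
state=/var/tmp/sonarr-queue-first-seen
max_age=$(( 7 * 24 * 3600 ))   # one week
touch "$state"

now=$(date +%s)
ids=$(curl -s "$host/api/queue?apikey=$apikey" | jq -r '.[].id')

for id in $ids; do
    first=$(awk -v id="$id" '$1 == id { print $2 }' "$state")
    if [ -z "$first" ]; then
        echo "$id $now" >> "$state"                    # first sighting of this item
    elif [ $(( now - first )) -gt "$max_age" ]; then
        # Still in the queue after a week: remove it and blacklist the release.
        curl -s -X DELETE "$host/api/queue/$id?apikey=$apikey&blacklist=true"
    fi
done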

untoreh commented Nov 7, 2018

Using the Sonarr API it is pretty easy; there's not much logic needed here, maybe a strike count > 3 to trigger the deletion.
A timeout seems kind of meh to me... if a torrent doesn't have seeds, what are the chances of it getting reseeded? You could wait forever...

#!/bin/bash
# Poll the Sonarr queue and remove (and blacklist) any item whose status line
# contains "Warning", e.g. a stalled or failed torrent.

apikey=4f3a98b538594bd8a58f162fb4cce2b5   # Sonarr API key (Settings -> General)
host=localhost:8989                       # Sonarr host:port
sleep=600                                 # seconds to wait between checks
blacklist=true                            # also blacklist the release when removing it

while :; do
    # Reduce each queue item to "<id> <status>" and keep only the flagged ones.
    queue=$(curl -s -X GET "$host/api/queue?apikey=$apikey" | jq -r '.[] | "\(.id) \(.status)"')
    warning=$(grep "Warning" <<< "$queue")

    IFS=$'\n'
    for t in $warning; do
        id=${t/ *}   # strip everything after the first space, leaving the id
        curl -s -X DELETE "$host/api/queue/$id?apikey=$apikey&blacklist=$blacklist"
    done
    sleep $sleep
done
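
A possible variation on the script above, sketching the strike idea mentioned at the top of this comment: only delete once an item has been flagged on several consecutive passes, which also softens the "internet is down, don't blacklist everything" concern raised earlier. The queue and DELETE calls mirror the script above; the strike threshold and the bash associative array are choices made for the sketch.

#!/bin/bash
# Strike-based variant: an item is only removed after it has shown "Warning"
# on more than $max_strikes consecutive checks (requires bash 4+ for the array).

apikey=YOUR_API_KEY
host=localhost:8989
sleep=600
max_strikes=3
declare -A strikes   # queue id -> consecutive warning count

while :; do
    queue=$(curl -s "$host/api/queue?apikey=$apikey" | jq -r '.[] | "\(.id) \(.status)"')

    IFS=$'\n'
    for t in $queue; do
        id=${t/ *}
        if grep -q "Warning" <<< "$t"; then
            strikes[$id]=$(( ${strikes[$id]:-0} + 1 ))
            if [ "${strikes[$id]}" -gt "$max_strikes" ]; then
                curl -s -X DELETE "$host/api/queue/$id?apikey=$apikey&blacklist=true"
                unset "strikes[$id]"
            fi
        else
            unset "strikes[$id]"   # back to normal, reset the count
        fi
    done
    sleep $sleep
done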

rcdailey commented Nov 29, 2018

How many more years must we wait?

D4rkSl4ve commented Nov 29, 2018

> How many more years must we wait?

@markus101 does the v3 alpha have this?

markus101 (Member) commented Nov 29, 2018

No, if it did, this issue would be closed and in the v3 milestone.
