Sonarr handles failed downloads from newsgroups well; however, failed download handling isn’t implemented for torrents.
I often get lots of different torrent results for a file, even after setting a minimum seed count. Usually a few work, but many are incomplete and stall during download.
This means I have to manually go into Sonarr, blacklist the torrent, and search again. For some files this can take 20+ attempts before Sonarr finds one it can download completely in a reasonable time.
What I tried:
I understand a torrent could come alive at a later time, but in my experience, if a torrent has never been seen complete after an hour or so, it will usually never finish. So I have been trying the following:
- I wrote a script in Deluge to identify these torrents and remove them. Sonarr notices the torrent was removed (it drops it from the queue), but doesn’t add it to the blacklist or grab the next release from the list (like it does with NZBs).
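For context, this is roughly the kind of cleanup script I mean. It's a minimal sketch using the third-party `deluge-client` RPC library; the host, port, credentials, and the one-hour threshold are placeholders for my setup, not something Sonarr or Deluge prescribes.

```python
# Sketch: remove torrents from Deluge that have been stuck for too long.
# Assumes the `deluge-client` package and a reachable Deluge daemon.
import time

STALL_AGE_SECONDS = 60 * 60  # my ~1 hour cutoff; tune to taste


def is_stalled(progress, state, time_added, now=None):
    """True when a torrent is older than the cutoff and still not finished."""
    now = time.time() if now is None else now
    age = now - time_added
    return age > STALL_AGE_SECONDS and progress < 100.0 and state != "Seeding"


def remove_stalled(client):
    """Remove every stalled torrent (and its data) from Deluge."""
    torrents = client.call(
        "core.get_torrents_status", {}, ["progress", "state", "time_added"]
    )
    for torrent_id, status in torrents.items():
        if is_stalled(
            status[b"progress"], status[b"state"].decode(), status[b"time_added"]
        ):
            client.call("core.remove_torrent", torrent_id, True)


if __name__ == "__main__":
    from deluge_client import DelugeRPCClient  # pip install deluge-client

    client = DelugeRPCClient("localhost", 58846, "user", "password")
    client.connect()
    remove_stalled(client)
```

The decision logic is kept in `is_stalled` so the removal policy can be tweaked (e.g. also treating a long "Checking" state as stalled) without touching the RPC code.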
- I also wrote a small test script so that when the torrent client deletes a torrent, a signal is sent to Sonarr to re-search. This works, however:
  - Sonarr usually picks up the same torrent again (as it was never blacklisted)
  - Ideally a new search wouldn’t be needed at all, since the results from the previous search were already available.
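For reference, my trigger script essentially just asks Sonarr to run an `EpisodeSearch` command via its v3 API. This is a sketch: the base URL, API key, and episode IDs are placeholders for your own instance.

```python
# Sketch: ask Sonarr (v3 API) to re-search for specific episodes.
import json
import urllib.request


def build_search_command(episode_ids):
    """Payload for Sonarr's /api/v3/command endpoint."""
    return {"name": "EpisodeSearch", "episodeIds": list(episode_ids)}


def trigger_search(base_url, api_key, episode_ids):
    """POST the search command; returns the queued command as a dict."""
    req = urllib.request.Request(
        f"{base_url}/api/v3/command",
        data=json.dumps(build_search_command(episode_ids)).encode(),
        headers={"X-Api-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Placeholder URL, key, and episode ID.
    trigger_search("http://localhost:8989", "your-api-key", [123])
```

This forces a fresh search, which is exactly the inefficiency mentioned above: the episode IDs have to be looked up again and the indexers re-queried, even though Sonarr already had usable results from the previous search.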
To avoid this manual work (and the scripts), would it be an idea to add an option along the lines of: ‘When a torrent is removed from the client and Sonarr can’t find it, add it to the blacklist and grab the next option’?