The logic for finding episodes

Version: 2.0.0.5338
OS: Windows 10

I was wondering if someone could help me understand the LOGIC (decision-making process) on how Sonarr goes out to find healthy/active links to episodes.

I’m using several torrent indexers with Jackett.

The other day, I added a new TV series to Sonarr. All the seasons and episodes were listed, but Sonarr seemingly never went out to find any of the new episodes, even though I chose to monitor ALL.

After giving Sonarr two days to find the episodes automatically, I gave in. Using DuckieTV, which uses the same indexers as Sonarr, I was able to do manual searches for complete seasons and individual episodes and download them. Sonarr never found anything, and seemingly never handed off any downloads to the download manager.

I’m just curious to know whether the logic Sonarr is using is flawed, or whether I need to use a different selection when monitoring episodes: ALL vs. (say) MISSING.

ALSO, I’ve been noticing that Sonarr is handing off downloads to my download manager that immediately go to a STALLED status. Again, if I go to DuckieTV and search manually, I can find a better download source than Sonarr can. It just seems odd that I’m getting better results doing manual searches than using Sonarr.

Thanks in advance…

Unless you tell Sonarr to search when you add the series, it’s just going to monitor RSS feeds for new releases; this is covered in the FAQ.

This can happen if your indexer reports that there are more seeders than actually exist. By default Sonarr requires at least one seeder, but you can increase this.
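The seeder check described above boils down to a simple threshold. A minimal sketch of that idea (illustrative names only, not Sonarr's actual code):

```python
MIN_SEEDERS = 1  # Sonarr's default minimum; raise it per-indexer if seeder counts are inflated

def passes_seeder_check(release, minimum=MIN_SEEDERS):
    """Reject a torrent release whose reported seeder count is below the minimum."""
    return release["seeders"] >= minimum

# A release reporting zero seeders would be rejected at the default setting:
passes_seeder_check({"title": "Show.S01E01.1080p", "seeders": 0})  # False
```

Raising the minimum helps when an indexer routinely over-reports, at the cost of skipping some genuinely grabbable releases.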

That’s because you’re human and can work out what’s crap and what isn’t. Sonarr just picks the highest quality, highest preferred-word score, smallest file size, most seeders (and possibly some other factors).
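The ranking described above can be sketched as a multi-key sort. This is a hypothetical illustration of that kind of logic, not Sonarr's actual code, and the tie-break order is an assumption:

```python
from dataclasses import dataclass

@dataclass
class Release:
    title: str
    quality: int      # higher = better (e.g. 1080 beats 720)
    word_score: int   # preferred-word score from the release title
    size_gb: float
    seeders: int

def rank(releases):
    # Best first: quality desc, word score desc, size asc, seeders desc.
    return sorted(
        releases,
        key=lambda r: (-r.quality, -r.word_score, r.size_gb, -r.seeders),
    )

candidates = [
    Release("Show.S01E01.720p",         720,  0,  1.2, 50),
    Release("Show.S01E01.1080p",        1080, 10, 2.5, 5),
    Release("Show.S01E01.1080p.PROPER", 1080, 10, 2.2, 3),
]
best = rank(candidates)[0]  # the smaller 1080p release wins despite fewer seeders
```

Note that seeders come last here, which is why a release with 3 seeders can outrank one with 50 — exactly the situation where a human would choose differently.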

Most public sites are pretty pathetic at updating the seeder/leecher numbers, so you typically have to manually search and add all the other variants in the hope that at least one will come down.

Do a manual search in Sonarr: the order the results come back in is the order Sonarr would download them, so you can see the results of its logic/code there.

It might be a nice feature request if you could specify a stall time (presuming the clients have something in their API that can be used for that), after which Sonarr would blacklist and (optionally) remove the job, then try to download the next variant.
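The requested feature might look roughly like this: poll the download client, and any job that has been stalled past a configurable limit gets blacklisted and removed so the next variant can be tried. A hypothetical sketch (all names are illustrative; this is not Sonarr or any client's real API):

```python
STALL_LIMIT_SECS = 30 * 60  # the configurable stall time from the feature request

def check_stalled(jobs, now, blacklist):
    """Return the jobs to keep; move anything stalled too long onto the blacklist.

    Each job is assumed to be a dict with 'status', 'stalled_since' (epoch
    seconds), and 'release' keys, as reported by the download client's API.
    """
    keep = []
    for job in jobs:
        stalled_too_long = (
            job["status"] == "stalled"
            and now - job["stalled_since"] > STALL_LIMIT_SECS
        )
        if stalled_too_long:
            blacklist.append(job["release"])  # never grab this release again
        else:
            keep.append(job)
    return keep
```

In a real implementation this would run on a timer, and the blacklisted release would trigger a fresh search so the next-ranked variant gets sent to the client.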


Only one disagreement… “might” >> “would”

Thanks for all the feedback.
I appreciate it…