Sonarr not automatically downloading new episodes

Sonarr version (exact version): 2.0.0.4949
Mono version (if Sonarr is not running on Windows): Using Windows
OS: Windows 10 Pro, Version 1709, Build 16299.19
Debug logs: https://pastebin.com/raw/5SK7SnkV
Description of issue: New episodes are not downloading, even after several days or weeks. If I search manually, I can find them.

This is also happening to me right now. If I go into the Wanted tab, check everything, and click Search Selected, it finds and downloads them, but it isn’t doing so automatically.

I’m also getting “Indexers unavailable due to failures: Rarbg, EZTV” under System Health. That’s probably related.

The logs have expired (though unless they were debug logs of an RSS sync where a wanted episode was rejected, they wouldn’t say much anyway).

How Sonarr finds episodes to download is covered in the ((FAQ)). Is RSS running, finding results and processing them?
Are the episodes on the calendar? If so, what colour are they?

Definitely. Why are they failing? (Check the logs.)

I must have accidentally set the paste to expire in 1 day instead of 1 week.

I checked just now and there were episodes for 10 different shows that should have downloaded but didn’t, and a manual check confirmed that the episodes were available. I also forced the RSS sync, but it didn’t find the episodes. Here’s the output:

RssSyncService	RSS Sync Completed. Reports found: 100, Reports grabbed: 0	11:40pm
DownloadDecisionMaker	Processing 100 releases	11:40pm
RssSyncService	Starting RSS Sync	11:40pm

All those 100 releases were listed as “[Permanent] Unknown Series” and none of the shows I have set up were found at all, which seems odd.

I should note that I’m using NZBHydra, in case that matters. Everything was fine until a few weeks ago, though, so I’m not sure why it would be related to NZBHydra, as I haven’t changed any settings anywhere.

Anything you need from me to help diagnose this?

Thank you.

RSS won’t find anything unless the releases were recently posted and still in the RSS feed; typically they’re there for a few hours, depending on how many items are being posted.

That’s completely normal; the RSS feed is the latest 100 releases, which could be any TV show.

It could; only 100 results for an RSS sync seems low unless there is only one indexer still working in NZBHydra.

Debug logs of an RSS sync where a release you expect to be grabbed is included. It’s going to take some combing through the logs to find that, I’m sure, but checking a few hours after a release aired should have it in a recent log file.

I have quite a few indexers listed in NZBHydra with a mix of free and paid, and more than 10 are not showing any sort of errors, so I guess I’m OK as far as that goes.

Here’s a log with Big Bang Theory included. It looks like it’s not finding an SD version, which is why nothing is downloaded. However, if I run a manual search I find the SD version.

I went through a few logs, and an earlier one showed the same thing for Supergirl: rejected as there was no SD, but a manual search found the SD version.

https://hastebin.com/xivivucaku.sql

What is Sonarr’s RSS Sync interval set to?
Is Sonarr running 24/7 or does that system go to sleep?
Do you have any warning in the logs similar to 17-10-24 15:43:43.0|Warn|Newznab|Indexer Usenet Crawler rss sync didn't cover the period between 16/07/2017 01:42:24 and 16/07/2017 14:02:26 UTC. Search may be required.?

30 minutes

Yes, the machine sleeps overnight.

No, I searched all the log files (which cover about 1.5 days) and didn’t find anything similar. It seems like an entry like that would make sense, though, seeing as the machine sleeps overnight, no?

Also, the RSS service always finds 100 reports, never more, never less. Is that normal?

AFAIK, Sonarr only requests the most recent 100 (the default first page of results) when it does an RSS sync, unless it determines that it hasn’t synced for a period (I can’t remember how long), in which case it asks for additional items.

I came across instances where, very rarely, a flood of releases would push a release I wanted off the most recent 100 even when performing a sync every 15 minutes. The indexer was nzbplanet.

I discussed it on their forum to see if the site op could increase the default (Sonarr makes a default request) to, say, the most recent 300, but nothing ever came of it. So I put together a Perl script running under lighttpd and pointed Sonarr at it for nzbplanet requests. The script determines whether Sonarr is doing a regular RSS sync and, if so, performs additional requests to fetch the most recent 1000 releases, compiles them into a single RSS stream, and passes that back to Sonarr.
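The original was a Perl script, but the aggregation step it describes (fetch several feed pages, splice their items into one stream) can be sketched in Python; this is an illustration, not the actual script, and it assumes a standard RSS 2.0 layout with `<guid>` elements:

```python
import xml.etree.ElementTree as ET

def merge_rss_pages(pages):
    """Merge several RSS page documents (XML strings) into one feed,
    de-duplicating items by their <guid> so overlapping pages are safe."""
    merged = ET.fromstring(pages[0])
    channel = merged.find("channel")
    seen = {item.findtext("guid") for item in channel.findall("item")}
    for page in pages[1:]:
        for item in ET.fromstring(page).find("channel").findall("item"):
            guid = item.findtext("guid")
            if guid not in seen:
                seen.add(guid)
                channel.append(item)
    return ET.tostring(merged, encoding="unicode")
```

A proxy would call this on the pages it fetched from the indexer and return the merged document to Sonarr as the RSS response.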

While I was working out what was going on, I set up a cron job that would periodically trigger Sonarr to do a missing-items search.
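A minimal sketch of such a trigger, assuming Sonarr v2’s `/api/command` endpoint and the `missingEpisodeSearch` command name (check your version’s API documentation before relying on either):

```python
import json
import urllib.request

def build_missing_search_request(base_url, api_key):
    """Build the POST request that asks Sonarr to run a missing-episode
    search (Sonarr v2 'command' API; the exact command name may vary)."""
    url = base_url.rstrip("/") + "/api/command"
    headers = {"X-Api-Key": api_key, "Content-Type": "application/json"}
    body = json.dumps({"name": "missingEpisodeSearch"}).encode()
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# To actually fire it (e.g. from a script a cron entry runs hourly):
# req = build_missing_search_request("http://localhost:8989", "YOUR_API_KEY")
# urllib.request.urlopen(req)
```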

Usenet Crawler is just too flaky for me, so I have it only being used for searches, not RSS.

I think the issue is that Sonarr is offline overnight and NZBHydra doesn’t seem to page back through results when they are missed in an RSS sync. Adding a couple of indexers to Sonarr directly will probably help.

Sonarr explicitly requests 100 results, and it will automatically page back if there is a gap in the RSS feed (it didn’t always do that). 1000 results for RSS is overkill unless Sonarr has been offline for an extended period of time. With a 15-minute RSS sync interval I don’t think I’ve seen 100 not be enough (some indexers that limited it to 25 would frequently hit the limit, though).
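This isn’t Sonarr’s actual implementation, but the gap-aware paging behaviour described above can be sketched as: fetch the newest page, and keep paging back until the feed overlaps the last successful sync.

```python
def rss_sync(fetch_page, last_sync, page_size=100, max_pages=10):
    """Sketch of gap-aware RSS paging: request the newest `page_size`
    releases, then keep paging back until a release older than the last
    successful sync is seen, so nothing falls into the gap.
    `fetch_page(offset, limit)` returns items newest-first, each with a
    'posted' timestamp (hypothetical shape for illustration)."""
    releases = []
    for page in range(max_pages):
        items = fetch_page(offset=page * page_size, limit=page_size)
        releases.extend(items)
        if not items or items[-1]["posted"] <= last_sync:
            break  # feed now overlaps the previous sync; gap is covered
    return releases
```

With a 15- or 30-minute interval and a machine that stays awake, the first page is almost always enough and the loop exits after one request.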

Well… since setting it up to pull the latest 1000 releases hourly instead of the default every 15 minutes, I’ve not come across a single instance of a release not being picked up from the nzbplanet RSS feed. Make of that what you will.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.