RSS Sync hung/died?

There have been a couple of posts lately about issues like episodes not getting downloaded, which are, sadly, a little vague on details. I just hit an issue that would give similar symptoms: the RSS sync started but never finished, and never ran again until I restarted. So while I also don’t have as much detail as we’d all like, it’s a small step towards figuring out what might be going on.

Everything was working fine:

16-2-1 14:16:42.8|Info|RssSyncService|Starting RSS Sync
16-2-1 14:16:49.3|Info|DownloadDecisionMaker|Processing 1376 releases
16-2-1 14:17:19.3|Info|RssSyncService|RSS Sync Completed. Reports found: 1376, Reports grabbed: 0
16-2-1 14:33:44.0|Info|RssSyncService|Starting RSS Sync
16-2-1 15:59:17.3|Info|SceneMappingService|Updating Scene mappings
16-2-1 16:24:18.3|Info|SceneMappingService|Updating Scene mappings

And that was the last RSS Sync entry until I restarted. Before the restart, a manual search just spun its wheels (not even returning a lack of results), which is what prompted me to look more closely and restart. (Yes, I cheekily added the sickbeard feed. Shut up.)

16-2-2 13:00:05.0|Info|SceneMappingService|Updating Scene mappings
16-2-2 15:26:15.5|Info|NzbSearchService|Searching 13 indexers for [The X-Files : S10E03]
16-2-2 15:26:16.5|Warn|HttpClient|HTTP Error - Res: [GET] http://lolo.sickbeard.com/api?t=tvsearch&cat=5030,5040&extended=1&offset=0&limit=100&rid=6312&season=10&ep=3 : 503.ServiceUnavailable [...html snipped...]
16-2-2 15:26:17.3|Warn|Newznab|Sickbeard feed HTTP request failed: [503:ServiceUnavailable] [GET] at [http://lolo.sickbeard.com/api?t=tvsearch&cat=5030,5040&extended=1&offset=0&limit=100&rid=6312&season=10&ep=3]
16-2-2 15:33:25.4|Info|LifecycleService|Restart requested.
16-2-2 15:33:27.5|Info|Bootstrap|Starting Sonarr - /Applications/Sonarr.app/Contents/MacOS/NzbDrone.exe - Version 2.0.0.3732
[...]
16-2-2 15:34:06.9|Info|RssSyncService|Starting RSS Sync
16-2-2 15:34:08.2|Warn|KickassTorrents|Indexer KickassTorrents rss sync didn't cover the period between 2/1/2016 8:07:35 PM and 2/2/2016 9:36:56 AM UTC. Search may be required.
16-2-2 15:34:08.2|Warn|Wombles|Indexer Wombles rss sync didn't cover the period between 2/1/2016 8:04:19 PM and 2/2/2016 8:54:54 AM UTC. Search may be required.
16-2-2 15:34:08.7|Warn|Rarbg|Indexer Rarbg rss sync didn't cover the period between 2/1/2016 8:33:02 PM and 2/1/2016 11:15:10 PM UTC. Search may be required.
16-2-2 15:34:10.2|Warn|TorrentRssIndexer|Indexer Torrentz Verified rss sync didn't cover the period between 2/1/2016 7:45:07 PM and 2/2/2016 2:10:01 AM UTC. Search may be required.
16-2-2 15:34:10.5|Warn|TorrentRssIndexer|Indexer EZTV.ag rss sync didn't cover the period between 2/1/2016 6:39:23 PM and 2/2/2016 1:49:10 AM UTC. Search may be required.
16-2-2 15:34:10.7|Warn|Newznab|Indexer Nzb.su rss sync didn't cover the period between 2/1/2016 8:08:24 PM and 2/2/2016 5:24:53 PM UTC. Search may be required.
16-2-2 15:34:37.2|Info|DownloadDecisionMaker|Processing 4471 releases

I’m UTC -0600, so the start of the missing period lines up, but I notice that not all indexers are mentioned as having that gap?

Sonarr Version 2.0.0.3732
Mono JIT compiler version 4.2.1 (explicit/6dd2d0d Fri Nov 6 12:25:19 EST 2015)
Running on OS X 10.7.5

To figure out if Sonarr was processing anything, you’d need to take a look at trace logs or debug logs to see if it processed anything, or if it got hung up on a particular item.

Different indexers have different paging limits: KAT only allows 25 items, whereas Newznab indexers allow 100 items, and Sonarr will only page back so far (10 pages, I believe).
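The paging behaviour described above can be sketched roughly like this (a hypothetical Python illustration, not Sonarr's actual C# code; `fetch_page`, the page cap, and the release shape are all assumptions for the example):

```python
# Hypothetical sketch of indexer RSS paging: fetch newest-first pages until
# the gap since the last sync is covered, the feed runs out, or a maximum
# page count is hit. With a small page size (e.g. KAT's 25) a long gap may
# not be covered, producing the "Search may be required" warning.

MAX_PAGES = 10  # Sonarr reportedly only pages back this far


def rss_sync(fetch_page, page_size, last_sync_time):
    """fetch_page(offset=, limit=) -> list of releases, newest first.

    Returns (releases, covered) where covered is True if the fetched
    releases reach back past last_sync_time.
    """
    releases = []
    for page in range(MAX_PAGES):
        batch = fetch_page(offset=page * page_size, limit=page_size)
        releases.extend(batch)
        if not batch or batch[-1].published < last_sync_time:
            return releases, True   # gap fully covered (or feed exhausted)
    return releases, False          # gap NOT covered -> warn, search needed
```

With a 25-item page size and 10 pages, only the newest 250 releases are reachable; a 100-item Newznab indexer can reach back four times further in the same number of pages, which is why only some indexers logged the gap warning.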

Understood. I was only logging info at the time; debug logging is switched on now, in case this occurs again.

Sure… but 6box, althub, dognzb, drunkenslug, nzbcat, nzbgeek and usenet-crawler aren’t mentioned, along with the torrent RSS feeds showrss and demonoid. I’m wondering if this helps narrow anything down. Say, were the ones with gaps checked first, and then the next one hung, preventing the rest? Are they checked in a specific order, or just downloaded in parallel?

I know there’s not a lot of information to go on here; mostly this is to confirm that there is a problem that can crop up, even if it’s a rare set of circumstances. ¯\_(ツ)_/¯

Those are all Newznab-based; it’s highly probable that they were able to catch up. A quick check of a different indexer shows almost 24 hours of history.

No idea how far back these ones page.

This doesn’t really tell us anything beyond that some were able to catch up almost 24 hours. They are queried in parallel.
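To illustrate why parallel queries still need some protection, here is a hedged sketch of parallel indexer fetches with a per-request timeout, so one hung indexer can't stall the whole sync. This is a hypothetical Python example, not Sonarr's actual implementation; `fetch` and the timeout value are made up for the illustration:

```python
# Hypothetical illustration: query all indexers in parallel, but bound how
# long we wait on each result. Without a timeout, a single hung request
# could block the sync from ever completing.
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout


def sync_all(indexers, fetch, timeout=60.0):
    """fetch(name) -> list of releases. Returns {name: releases or None}."""
    results = {}
    with ThreadPoolExecutor(max_workers=len(indexers)) as pool:
        futures = {name: pool.submit(fetch, name) for name in indexers}
        for name, future in futures.items():
            try:
                results[name] = future.result(timeout=timeout)
            except FutureTimeout:
                results[name] = None  # hung indexer: give up on its result
            except Exception:
                results[name] = None  # e.g. an HTTP 503, logged as a warning
    return results
```

If the symptoms in this thread came from a hung request, something like the timeout branch above is what would have been missing or not firing.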

Which version of mono do you have installed?

Ah, I was thinking of those messages as describing the state before downloading, not after. Seems odd that .su doesn’t appear to page, but whatever.

Sonarr Version 2.0.0.3732
Mono JIT compiler version 4.2.1 (explicit/6dd2d0d Fri Nov 6 12:25:19 EST 2015)
Running on OS X 10.7.5

It’s been happening to me also.

A restart of Sonarr seems to fix the problem until it happens again.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.