Sonarr not grabbing new episodes

**Sonarr version**: 2.0.0.5163
**Mono version**: 5.4.1.6
**OS**: LibreELEC

Hi All,

New user here having a few issues with Sonarr, having come over from SickRage. I understand that Sonarr uses RSS feeds and will only grab new episodes announced on the indexer. The issue is that I have set the profile on my shows to ANY and have set the minimum and maximum size limits on my quality definitions. When I manually start an RSS sync I can see the client searching the indexer and finding 20 matches for a particular show, but it then does not grab anything and just continues scanning. When I do a manual search I can see a lot of available files and there is no exclamation mark to say they haven’t met the profile criteria, so why isn’t it pulling a file down?

What am I missing here?

Many Thanks in advance.

JaK

You’ll need to look at debug logs of RSS syncs that have releases you expect to be grabbed, but aren’t. Enable debug logging, let it run until something wanted shows up and see what the issue was. Every release will log whether it was accepted (Release Accepted) or rejected (Release Rejected, plus the reason(s) it was rejected).

Thanks for the reply, Markus. I didn’t realise there was a debug option, so I was just interrogating the standard log, which wasn’t giving me any reason for the episodes not being grabbed. I’ll check again tomorrow when it should attempt another grab.

I’ll update once I have some more info.

Hi Markus/all,

I’ve managed to grab some logs this morning and it appears that the RSS feed being pulled down doesn’t have the latest NZBs that are available under the category I am searching (TV HD, 5040). If I check the site I can see the shows I want to grab are there, but the RSS feed doesn’t pick them up?

See log https://pastebin.com/YRc3HTbY

Any help would be greatly appreciated.

Joe

Did you check previous log files as well? RSS is only going to capture the last 100 items.

Are you using a VIP Usenet Crawler account or the free trial they offer? If you’re using the free trial you will hit the API limits after a few hours and RSS syncs won’t get any results, so things will be missed.
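
If you want to sanity-check the feed yourself, here’s a minimal sketch (the URL, API key and category parameter are placeholders, not the exact request Sonarr sends) that fetches the indexer’s newznab RSS feed and shows how many items come back and how recent they are:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

# Placeholder URL and API key -- adjust to your indexer. This is just a way to
# eyeball what the feed returns, not a reproduction of Sonarr's own request.
FEED_URL = "https://www.usenet-crawler.com/rss?t=5040&dl=1&r=YOUR_API_KEY"

with urlopen(FEED_URL) as resp:
    root = ET.parse(resp).getroot()

items = root.findall("./channel/item")
print(f"{len(items)} items returned")   # feeds are typically capped at ~100 items
for item in items[:5]:
    print(item.findtext("pubDate"), "-", item.findtext("title"))
```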

Hi Markus,

No, I haven’t, as I stupidly deleted the log due to not understanding how the RSS works. I will check tonight’s logs and see. I am a VIP user, so that shouldn’t be an issue, by the way.

I don’t understand why, when I click on the show in the calendar and then do an automatic search, it pulls it down straight away. Surely if there was an issue with the profile that would fail too?

Thanks

Yeah, if the profile didn’t allow it the automatic search would also fail. Is the episode red on the calendar? If it’s red then Sonarr should grab it (as long as it saw it), but if it’s grey it’s unmonitored and Sonarr won’t grab it.

Yes, that’s what I thought. In the calendar it’s definitely red, so it should be pulling it down. I’ll check the logs properly tomorrow and let you know.

Thanks again for your help.

Hey Markus,

So I’ve looked at the logs from 1.30am this morning and it looks like the RSS feed it’s pulling down is exactly the same as the one from yesterday at 17:00. If I go onto the indexer’s site and click the URL for the HD feed I can see it’s regularly updating, so for some reason the client isn’t pulling the latest feed down?

Thanks

Just looked at their RSS feed and they’re returning items from 2012, probably from around the time they started the site. Looks like their RSS feed generation is broken and returning the first results, not the latest results. This is something they’ll need to look into (I assume they have a contact form of some sort).

It’s also a good idea to add a second indexer, so one being down doesn’t break your setup.

That’s strange, as I can see on the site that there is a current feed which is up to date. I’ve posted on their forum to see if anyone can advise. Silly question, but how could you tell it was pulling from that date?

<item>
	<title>V.2009.X64.WEB-DL.S02E10-StreamTeam</title>
	<guid isPermaLink="true">https://www.usenet-crawler.com/details/27e91f46b6e9482e8604bde80fe7b370</guid>
	<link>https://www.usenet-crawler.com/getnzb/27e91f46b6e9482e8604bde80fe7b370.nzb&amp;i=6098&amp;r=xyz</link>
	<comments>https://www.usenet-crawler.com/details/27e91f46b6e9482e8604bde80fe7b370#comments</comments> 	
	<pubDate>Sun, 16 Dec 2012 19:12:33 +0100</pubDate> 
	<category>TV &gt; SD</category> 	
	<description>V.2009.X64.WEB-DL.S02E10-StreamTeam</description>
	<enclosure url="https://www.usenet-crawler.com/getnzb/27e91f46b6e9482e8604bde80fe7b370.nzb&amp;i=6098&amp;r=xyz" length="1580171776" type="application/x-nzb" />

	<newznab:attr name="category" value="5000" />
	<newznab:attr name="category" value="5030" />
	<newznab:attr name="size" value="1580171776" />
	<newznab:attr name="guid" value="27e91f46b6e9482e8604bde80fe7b370" />
	<newznab:attr name="files" value="38" />
	<newznab:attr name="poster" value="user@iamnot.com (PowerPost)" />
	<newznab:attr name="season" value="S02" />
	<newznab:attr name="episode" value="E10" />

	<newznab:attr name="grabs" value="59" />
	<newznab:attr name="comments" value="0" />
	<newznab:attr name="password" value="0" />
	<newznab:attr name="usenetdate" value="Sun, 20 Mar 2011 01:49:15 +0100" />
	<newznab:attr name="group" value="alt.binaries.x264" />

</item>

The pubDate is when the item was published.
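
As a quick check (a minimal sketch using the date format from the item above; treat the exact parsing as one possible approach), you can compare a pubDate against the current time to see how stale the feed is:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# pubDate copied from the item above (RFC 2822 format, standard for RSS).
published = parsedate_to_datetime("Sun, 16 Dec 2012 19:12:33 +0100")
age_days = (datetime.now(timezone.utc) - published).days
print(f"Published {published:%Y-%m-%d}, {age_days} days ago")
# On a healthy feed the newest items are hours old, not years.
```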

Thanks Markus, you’re a star. Still waiting to hear back on the forum. Hopefully someone can shed some light on it.

Hi Markus,

Sorry to keep pestering you.

Can you have a look at this and let me know if this could be the reason for it pulling down feeds from 2010 onwards?

https://pastebin.com/1cpsFmFm

Thanks

Joe

Sonarr is requesting the RSS feed normally; there are no date ranges supplied in that request, so it’s 100% on the indexer’s side to return the latest content (I’ve never seen a case where an indexer returned old data like we’re seeing here). If the indexer is expecting Sonarr to add extra parameters, that would be non-standard and not something Sonarr would send unless you added them to the additional parameters field in the indexer settings.
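
For illustration only (the exact parameter names and values are placeholders, not necessarily the set Sonarr sends), a plain newznab request looks something like this, with no date-range parameter anywhere, so returning the newest items first is entirely the indexer’s responsibility:

```python
from urllib.parse import urlencode

# Placeholder parameters -- illustrative, not a reproduction of Sonarr's request.
base = "https://www.usenet-crawler.com/api"
params = {
    "t": "tvsearch",      # newznab TV search endpoint
    "cat": "5030,5040",   # TV > SD and TV > HD categories
    "limit": 100,         # how many items to return
    "apikey": "YOUR_API_KEY",
}
print(f"{base}?{urlencode(params)}")
# Note: no "from"/"to" dates -- the indexer decides which items count as "latest".
```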


Hey Markus,

Thanks for all your help on this, but I’m unable to progress any further as no-one is replying on the indexer’s forum, so I’m still not sure whether it’s their RSS feed at fault or Sonarr. I will therefore, reluctantly, have to go back to SickRage.

Thanks again

You can take the URL that Sonarr reports in the debug logs and put it in your browser (and put the API key back in) and see that they’re returning the wrong data. I know you’re hesitant to take my word for it, but this issue is 100% on Usenet Crawler’s side.

Usenet Crawler has been declining for a while now; there are a ton of alternatives that do the same thing, but better. Even a free trial on one would be good to see how it should work (assuming you don’t burn through your API calls).

Thanks Markus, you were right, it was the indexer. I’ve moved over to another one and it’s working correctly.

Cheers for being patient and for the help.

