Can Sonarr grab NZBs quicker than Sickbeard?

It's getting harder and harder to grab stuff from usenet before it's taken down. Many times Sickbeard will find an NZB, send it to SAB, and of course the download will fail because it's already been taken down. SB has an option called "Search Frequency", set to 60 minutes by default. That seems reasonable; searching every 30 min would seem to anger the NZB websites. But it's the backlog searches that I think need to happen more often. SB does these once per day (with no option to change it), and most of the failures happen on those downloads.

Can (or should) Sonarr do a backlog search more often?

The default RSS Interval in Sonarr is 15 minutes.
Sonarr doesn't do automatic backlog searches (except in some very special cases), mainly because it's pointless: if the RSS feed doesn't show something new, a backlog search wouldn't either.
This means that, if nothing gets downloaded, we hit the indexer 96 times per day for the RSS feed alone.
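As a sanity check on that 96-per-day figure, the arithmetic is just minutes-per-day divided by the sync interval (this is purely illustrative, not Sonarr code):

```python
# Illustrative arithmetic only: how many RSS sync requests per day
# a given polling interval produces.
MINUTES_PER_DAY = 24 * 60

def rss_hits_per_day(interval_minutes: int) -> int:
    """Number of RSS sync requests per day for a given interval."""
    return MINUTES_PER_DAY // interval_minutes

print(rss_hits_per_day(15))  # 96 hits/day at Sonarr's 15-minute default
print(rss_hits_per_day(60))  # 24 hits/day at Sickbeard's 60-minute default
```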

Another option is to add some (or more) block accounts, and you will have a lot fewer problems with takedowns…

Do you know if that's different from how Sickbeard does its searches? Do they both just monitor RSS? I know SB every once in a while will find a load of NZBs that weren't recently posted. I assume that's the backlog search.

I'm pretty sure I'd like to switch, so I guess it doesn't really matter, I'll find out soon enough. :smile:

SB does do automatic backlog searches, but Sonarr doesn't. If Sonarr is running all the time there is no sense in doing so: it wastes API calls (extra load on indexers), and after one search it's unlikely to find anything new the next time.

You can search within Sonarr and can even do a “manual” search which shows you all the results and you can pick the one you want.

Are you sure it's getting removed from usenet before you download it, and not just failing because the post hasn't been fully populated yet?

I personally haven't had a show fail to download unless it's an older show. It has never failed for a show that was recently released.

But yeah, the backlog ones will fail, those are the ones that have been removed. But luckily Sonarr has torrents now. It’s much easier to fill those backlogged shows through torrents. I went from 8 pages of missing episodes down to 3 pages within 24 hours of upgrading to the developer branch and adding torrents to my Sonarr.

Oh, it's completely gone, often within 24 hours. Not a trace of it left. May I ask who your usenet provider is?

I also RARELY run into takedown issues. The only time I have problems is, like kelmino, on older items or very obscure items that just haven’t been uploaded.

I’m with thundernews, for no particular reason other than they were cheap on Black Friday 2013. I don’t use a block/backup account.

Edit: actually, that's not quite true. I do use a backup account… but it is also thundernews and I don't pay extra for it. I just define their US server as my primary and their EU server as backup and give each 25 connections (they allow 50, but when I used more than 25 in this capacity I was constantly getting errors). I see from my logs that the EU server gets quite a bit of use, so maybe those were articles that had been taken down on the US server? Beats me.

I highly recommend using a Dutch provider and then one of the next-generation aggregate providers for a backup block account (like the South African one). I don’t like throwing names around on the internet but you can do your own research.

I use supernews, I used to use usenetserver but switched because supernews was a little cheaper.

I also use their US server with their EU server as a backup, so maybe that's helping, but I'd be more inclined to think it hasn't propagated down to your usenet servers yet. It might actually help to add a little delay on your end and see if that helps. Maybe something like 15-30 minutes? It's quite possible that you're actually searching too often and pulling releases down before they've been properly propagated. It's almost impossible for a TV studio to make a takedown request, the usenet provider to receive it, and then remove the post from their servers in the time it takes for you to pull down a 500MB-2GB file.

I know Sonarr can delay NZB files, but I'm not sure if Sickbeard can. I switched from Sickbeard to Sonarr a few months ago and couldn't be happier. It's so much smoother, works very nicely, and is much more stable than my Sickbeard ever was. Though I was on their developer branch, so that may have had something to do with it (I'm also on the Sonarr developer branch).
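The delay idea boils down to a simple age check: don't grab a release until it's old enough to have propagated to your provider. Here's a minimal sketch of that logic; the 30-minute threshold and the function name are illustrative, not Sonarr's actual implementation:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical sketch of the "grab delay" discussed above: a release is
# only eligible once it has aged past a propagation window, so it has
# had time to spread to your usenet provider's servers.
PROPAGATION_DELAY = timedelta(minutes=30)  # illustrative threshold

def old_enough_to_grab(posted_at: datetime,
                       now: Optional[datetime] = None) -> bool:
    """Return True once a release has aged past the propagation delay."""
    now = now or datetime.now(timezone.utc)
    return now - posted_at >= PROPAGATION_DELAY
```

With a 30-minute delay, a release posted 10 minutes ago would be skipped this cycle and picked up on a later RSS sync once it has aged past the threshold.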

I've never missed a single aired show with Sonarr. There's only one usenet service that takes a show down in under 60 minutes. Otherwise there's another issue at play.