I recently switched to Sonarr and nzbget and really love them both. That said, after doing this stuff for years and years, I'm a newb again.
I seem to be running afoul of nzbget deleting/hiding duplicate downloads. When this happens, Sonarr doesn't understand what's going on and considers the episode to still be in the download queue.
Checking my history, the first attempt to download this release did fail.
Why is Sonarr re-sending what nzbget considers (per the log) the 'exact same content'? I assume I'm not understanding how this should work, or the config options around blacklisting.
How do I get both functions to play nice? I do want dupe checking enabled in nzbget, because I have other, more manual workflows that depend on this feature.
What am I not understanding about how Sonarr chooses NZBs here?
Sonarr won't automatically send the same NZB after it fails if you have failed download handling enabled (if it's off, the blacklist check is skipped and nothing gets blacklisted).
What are your download handling settings set to (screenshot)?
If you do a manual search in Sonarr and queue the same nzb then I would expect this to happen.
It chooses them based on many rules: it follows the qualities set in your profile and checks that the same release hasn't failed before, based on the name and age of the NZB. Since the age of the NZB is usually tied to when the content was uploaded, this shouldn't normally trigger a dupe check (unless you have an indexer that exposes the index date rather than the upload date).
Thanks for the speedy reply. These are happening when I kick off an automatic search for the episode(s).
I note that the release name is the same in the nzbget logs.
Here’s a screen shot:
EDIT:
I should probably mention that I'm getting this nagging feeling that I messed with those settings when I first installed Sonarr without REALLY understanding what would happen. And of course I don't know what the defaults are, so I'm thinking I caused this.
Nothing too serious; those settings will work as expected: failed downloads will be handled and, by extension, releases will be blacklisted and the blacklist respected.
Are those duplicated items blacklisted?
If you look at the activity for the episode are there multiple grabs with the same name?
If you look at manual search via Sonarr are there multiple items with the same name, but different ages?
When I check the activity, I do see a bunch of grabs for a release with the same name but with different ages (one day's difference in some cases). EDIT: It's slightly awkward to tell the 'age when grabbed' in activity since I'm comparing different days, but they're all in the same close range.
Yes, when I look at the blacklist, I see one entry, from the day the first failure happened.
When I do a manual search from within Sonarr, I do see different ages for the same name. It looks like just two different ages though, 623d and 624d. Multiple entries because of multiple indexers, of course.
Yeah, I want to expose the underlying date as well.
In the manual search results are the others rejected because of the blacklisted item?
Do multiple indexers have multiple of the same title on different days?
Currently things are treated as blacklisted if they're within a four-day window (plus or minus two days), so releases at 623 and 624 days should both be blacklisted if either one was.
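To illustrate the rule as described above (this is just a rough sketch, not Sonarr's actual code; the function name and data shapes are made up):

```python
# Hypothetical sketch of the blacklist rule described above: a release
# matches a blacklisted entry if the title is the same and its age is
# within plus-or-minus two days of the blacklisted entry's age.

AGE_WINDOW_DAYS = 2  # the "4 day window (plus or minus two days)"

def is_blacklisted(title: str, age_days: int, blacklist: list[tuple[str, int]]) -> bool:
    """Return True if (title, age_days) matches any blacklisted (title, age) pair."""
    return any(
        title == bl_title and abs(age_days - bl_age) <= AGE_WINDOW_DAYS
        for bl_title, bl_age in blacklist
    )

blacklist = [("Show.Name.S01E05.720p", 623)]
print(is_blacklisted("Show.Name.S01E05.720p", 624, blacklist))  # True: inside the window
print(is_blacklisted("Show.Name.S01E05.720p", 630, blacklist))  # False: outside the window
```

So one blacklisted grab at 623 days would also catch the 624-day copy, which matches what you're seeing.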
When I search manually, there are 8 NZBs that have the same release name.
(Actually, 6 have the exact same name, and 2 have the same name but without periods.)
Here’s the summary:
623 days / Usenet-crawler / 2.2 GB / Blacklisted (no periods in name)
625 days / Usenet-crawler / 2.3 GB / Blacklisted (no periods in name)
623 days / PFmonkey / 2.5 GB / Retried too many times
623 days / nzbs.org / 2.5 GB / Retried too many times
623 days / dognzb / 2.5 GB / Retried too many times
625 days / dognzb / 2.5 GB / Retried too many times
625 days / pfmonkey / 2.5 GB / Retried too many times
625 days / nzbs.org / 2.5 GB / Retried too many times
Ahh, this makes sense: one of the two without periods was tried first, it failed, and both were blacklisted because Sonarr treated them as the same release (one of them might actually be okay).
The ones with periods are being treated as the same release as each other, but not the same as the releases without periods (which makes sense, since the titles are now different); they're all treated as the same release because the dates are close (again, one might be okay).
Usenet-crawler seems to strip periods and replace them with spaces; this might be an option on their end (it was back in the day on nzbmatrix). If there isn't an option, you may want to disable that indexer, since it's going to keep causing this issue (assuming this wasn't a one-off).
Because the content in the NZBs is actually the same, nzbget is going to keep rejecting them, and eventually Sonarr realized the release had been retried a bunch of times and stopped trying to grab it.
At this point I'd recommend trying both of the releases (grabbing them manually) and seeing what nzbget says (if it complains, force them to download anyway). I don't know if the content in the two releases (from different days) is the same, but this is the best way to get this episode.
Two issues here. One is the altered name; we might be able to deal with it, but for now it's not something we're looking at. The other is treating releases from different days as the same release when there's a low chance they actually are; we do have plans to change that, but it's not something we can fix immediately.
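For what it's worth, the altered-name issue could in principle be handled by normalizing titles before comparing them. A minimal sketch, assuming a hypothetical helper (not part of Sonarr):

```python
import re

def normalize_title(title: str) -> str:
    """Treat periods, underscores, and runs of whitespace as equivalent separators."""
    return re.sub(r"[._\s]+", " ", title).strip().lower()

# The two Usenet-crawler variants differ only in their separators,
# so they normalize to the same string:
a = normalize_title("Show.Name.S01E05.720p.HDTV")
b = normalize_title("Show Name S01E05 720p HDTV")
print(a == b)  # True
```

With a comparison like this, the period-stripped copies would match the normal ones by name instead of only via the date-proximity fallback.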
I'm not certain of this, but it seems to me that nzbget may have considered all of those the same, and that's an extra wrinkle here. Do you think that's correct? If so, wouldn't that disagreement between Sonarr and nzbget contribute to this?
All the files with the same age and almost the same name (ignoring the space vs. period issue) in your manual search are the same release, so I'd expect one failure to trigger the dupe check for them, but not for the releases from another day.
There's not really a disagreement that we can see here; it's a naming issue. nzbget looks at the files within the NZB and Sonarr doesn't. If nzbget is treating the content of an NZB from 625 days ago and one from 623 days ago as the same, then it sounds like its detection is wrong (I'm not sure it is; you'd have to test by manually grabbing them).
I feel like I'm missing something here. Is there a way to avoid this? Usually failed download handling kicks in and just handles it all, but here I had to keep checking and kicking off another automatic search (and the same thing happened with other episodes of the same show). If there isn't a way, then why does this happen only rarely?
Not really sure what to add here; I've explained why it's having issues. Disabling Usenet-crawler will help with the differently named releases, but until we make changes to the blacklist detection, you're going to have issues with same-named releases that come out within two days of each other.