Downloading over and over and over and over and over and over again



Mono Version

OS: unraid docker

**Debug logs**

Description of issue: I don't know why it does this: it downloads the same failing releases (they're less than 1 MB and 2 MB) over and over and over again… It's not the first time this has happened, it's just the first time I could log it. It happens fairly often, and I don't understand why NZBGet doesn't see them as duplicates.

It's about these releases:

0.02 MB [Warehouse 13 S03E09 Die Schatten des Schreckens GERMAN DL WS 1080p HDTV x264-MiSFiTS zip]

0.02 MB [Warehouse 13 S03E11 Die Muenze des Janus GERMAN DL WS 1080p HDTV x264-MiSFiTS zip]


Is it blacklisted in Sonarr?
Is NZB Hydra returning a different ID for the same NZB each time? If it is, it'll bypass Sonarr's duplicate detection; I've seen it do that in the past.


I don't know. I tried to blacklist some of the releases it downloads over and over again. I don't know if I blacklisted that one; how can I check?

I guess Hydra is returning different IDs, because otherwise NZBGet's dupe check would work, I guess?

But I know I never had this problem once with the "old" version of Sonarr. I didn't change anything, only Sonarr. NZB Hydra and Radarr are as I installed them years ago, with updates of course.

It's set to redirect to the indexer; should I set it to "get it from Hydra" instead? Would that be better?


Check the Blacklist in Sonarr's Activity.

The ID changing will affect Sonarr; it shouldn't have any effect on NZB Hydra.

This logic hasn’t changed from v2 to v3.

I can’t see that making a difference, but I don’t know.


Can you tell me what I need to look for to know if the IDs change?

If I understand correctly, you don't see a problem in Sonarr?

Because, for some reason I don't understand, I can't find the example I posted here in Hydra's history… (???)

And yes, Hydra is my only indexer.


Does Sonarr’s blacklist have multiple entries for the exact same NZB?

The only way to know for sure is to look at the XML response that Sonarr gets for a search a couple of times and compare the GUIDs of the releases.
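A minimal sketch of that comparison, assuming hypothetical saved responses: the XML snippets below are stand-ins for two runs of the same Newznab search against NZBHydra (the titles and GUID values are invented for illustration). In practice you'd save the real responses and diff the `<guid>` values per release title.

```python
# Sketch: compare release GUIDs between two Newznab search responses.
import xml.etree.ElementTree as ET

def guids_by_title(xml_text):
    """Map release title -> guid for every <item> in a Newznab RSS response."""
    root = ET.fromstring(xml_text)
    return {
        item.findtext("title"): item.findtext("guid")
        for item in root.iter("item")
    }

# Hypothetical first and second responses for the same search.
search_1 = """<rss><channel>
<item><title>Show.S03E09.1080p</title><guid>abc-123</guid></item>
</channel></rss>"""

search_2 = """<rss><channel>
<item><title>Show.S03E09.1080p</title><guid>def-456</guid></item>
</channel></rss>"""

first, second = guids_by_title(search_1), guids_by_title(search_2)
for title in first.keys() & second.keys():
    if first[title] != second[title]:
        # A GUID that changes for the same release defeats duplicate detection.
        print(f"GUID changed for {title!r}: {first[title]} -> {second[title]}")
```

If the same release title shows up with a different GUID on each search, Sonarr has no way to match it against its blacklist.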


Thanks for your time.

You're probably right. This time I caught it doing it again.

It's LuluNZB returning endless 0.9 MB files for most "german" results, all with different IDs. I've set LuluNZB to a lower priority now; let's see. At least Sonarr and NZBGet are doing their jobs correctly.

Maybe you can take a look at How to correct setup profile and language (small tweaks?), if you didn't see it.

This can be closed; it's no Sonarr problem, just a bad indexer.


For the record: NZBHydra attempts to recognize duplicates across different indexers (i.e., results which are actually the same upload) and filter them out. After that, every found result gets an ID which is globally unique.


Similar to the logic Sonarr uses to avoid dupes: if it's the same indexer and a different GUID, then it's treated as a different release, even if it has a similar post time to a blacklisted item.


Could you set the minimum file size on the Quality Definitions page to 30 MB (I think that's the smallest)? Most episodes are larger than that, so it should filter out most of the tiny junk files.

closed #12

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.