Sonarr multiple downloads of same show same quality

Sonarr version (exact version): latest nightly

I have tried opening about half a dozen requests for help, and I keep getting kicked to the curb because the log files don't provide what is needed to help me. I hope there is someone patient enough to guide me through capturing what is going on so that this problem can get fixed. I am about to scrap my whole setup because I can't seem to get the help I need to GUIDE ME in providing whatever is needed to fix this problem.

The symptoms are:

  • My Hydra2 indexer gets disabled.
  • I get every possible version (different releases) of a TV show episode downloaded, so I may end up with a dozen copies of the same show/episode.
    * The strangest thing is that the episode downloaded is ONE I ALREADY HAVE TO BEGIN WITH.

I am so lost, I don't know where to begin. I try to read the log files, but they just point to the problem being external. Yet when I look at those programs, they seem to be doing what they are ASKED to do.

To try to fix this problem of multiple duplicates, I turned on a feature in Hydra that aborts any series or episode that is a duplicate request. This was a band-aid at best, and it solved nothing.

The #1 problem is that Hydra2 keeps getting disabled by Sonarr v3.

So I will start with a small snippet which should help the discussion. I will be happy to pay someone cash via email account to hold my hand.

NZBHydra2 is dropping this because it is set to NOT allow duplicate downloads, so it fails the download; Sonarr then marks Hydra as bad and stops using it.

20-9-14 18:37:05.1|Debug|Sabnzbd|Downloaded nzb for episode 'Bar.Rescue.S04E02.720p.HDTV.x264-2HD' finished (329541 bytes from http://localhost:5076/getnzb/api/4767315701809300865?apikey=(removed)
20-9-14 18:37:05.1|Info|Sabnzbd|Adding report [Bar.Rescue.S04E02.720p.HDTV.x264-2HD] to the queue.
20-9-14 18:37:05.1|Debug|SabnzbdProxy|Url: http://192.168.1.162:18080/api?mode=addfile&cat=tv&priority=-100&apikey=(removed)&output=json
20-9-14 18:37:05.6|Warn|ProcessDownloadDecisions|Couldn't add report to download queue. Bar.Rescue.S04E02.720p.HDTV.x264-2HD

[v3.0.3.924] NzbDrone.Core.Exceptions.DownloadClientRejectedReleaseException: SABnzbd rejected the NZB for an unknown reason
   at NzbDrone.Core.Download.Clients.Sabnzbd.Sabnzbd.AddFromNzbFile(RemoteEpisode remoteEpisode, String filename, Byte[] fileContent) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\Clients\Sabnzbd\Sabnzbd.cs:line 46
   at NzbDrone.Core.Download.UsenetClientBase`1.Download(RemoteEpisode remoteEpisode) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\UsenetClientBase.cs:line 79
   at NzbDrone.Core.Download.DownloadService.DownloadReport(RemoteEpisode remoteEpisode) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\DownloadService.cs:line 95
   at NzbDrone.Core.Download.ProcessDownloadDecisions.ProcessDecisions(List`1 decisions) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\ProcessDownloadDecisions.cs:line 77

20-9-14 18:37:05.7|Debug|IndexerFactory|Temporarily ignoring indexer !NzbHydra2 till 9/14/2020 6:42:05 PM due to recent failures.
20-9-14 18:37:05.7|Debug|IndexerFactory|Temporarily ignoring indexer !NzbHydra2 till 9/14/2020 6:42:05 PM due to recent failures.

Here is a link to further history on this issue. I look forward to getting this resolved.

Please? :wave:

Just read what I have here, anything??

Maybe complete the help template you wiped out? A lot of people skip over topics if all the info isn't there.
“Latest nightly” can refer to multiple versions and branches of Sonarr, and quickly becomes meaningless depending on which branch you’re on.

The previous topic linked has more info, but died when marcus asked a couple of questions that were never answered and the topic auto-closed…

Template:

Sonarr version (exact version):
Mono version (if Sonarr is not running on Windows):
OS:
Debug logs:
Description of issue:


Additional Information:

  • Information on the log files is here: https://github.com/Sonarr/Sonarr/wiki/Log-Files
  • Make sure debug logging is enabled in settings
  • Post the log file, not a line or two, or just the error from the Logs table
  • Post the full log to hastebin/pastebin/dropbox/google drive or something similar and link it here
  • Do not post them directly here. Post in .txt not .doc, .rtf or some other formatted document

Windows 10, no Mono.
Sonarr updates every day, hence the "latest nightly" response. It is currently on 3.0.3.928.

Check the post I linked; it has everything you need. In fact, the last three threads on this topic all have it. I've spent hours documenting and I've posted half a dozen logs. It hasn't helped. I'll be happy to upload again, but how about looking at the posted material that is on point?

Thanks for replying.
Not that it matters, but I've been running a 100-degree fever for a week with the flu. I'm disabled, so it takes extra effort. I'll upload the debug logs like before when I can stand vertically again.

NOTE: I have still had a fever over the last 18 days, so forgive my lack of energy and slow posting. I appreciate your patience.

I have no affiliation with the following site; I just tested fifteen different sites to find a log viewer site that could handle the massive 1000 KB log files. The owner seems legit in the community. Might I suggest you change your FAQ/wiki to suggest using https://paste.drhack.net/? The site encrypts the log file, you can set the expiration from never to 5 minutes, and there seems to be no set size limit; I tested a log file of 8 MB and it saved fine.

Alternatively, pastebin and the 14 other sites I tested won't allow you to paste more than 500 KB, versus the log file size hard-coded at 1000 KB. Could you either make the size configurable or rotate at less than 500 KB? This is literally why I'm reluctant to bother pasting the logs: it's double the work per file.
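To show the kind of splitting I end up doing by hand, here is a rough Python sketch (the log path and the 500 KB limit are just examples from my setup, adjust as needed):

# split_log.py - rough sketch: split a large Sonarr log into paste-site-sized pieces.
# The path and the 500 KB limit are examples only; sizes are approximate (characters, not bytes).
from pathlib import Path

CHUNK_LIMIT = 500 * 1000                                    # most paste sites cap out around 500 KB
log = Path(r"C:\ProgramData\Sonarr\logs\sonarr.debug.txt")  # example path; yours may differ

part, size, num = [], 0, 1
for line in log.read_text(encoding="utf-8", errors="replace").splitlines(keepends=True):
    if size + len(line) > CHUNK_LIMIT and part:
        log.with_suffix(f".part{num}.txt").write_text("".join(part), encoding="utf-8")
        part, size, num = [], 0, num + 1
    part.append(line)
    size += len(line)
if part:
    log.with_suffix(f".part{num}.txt").write_text("".join(part), encoding="utf-8")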

I saved a copy of all the log files when this thing went south last week. For your viewing pleasure, here is the log file (an exact copy of which is in a post above) with ALL references to Bar.Rescue.S04E02.720p.HDTV.x264-2HD during the timestamp when the failure happened.

The last thread did not die because I failed to answer any questions; if that is true, what question didn't I answer fully? The thread died because the log file didn't provide any information that helped diagnose the problem, hence my repeated questions about WHAT else I should look for. I still have a copy of the log files from that time, but due to the log size limit addressed above, it's not clear what else to send, since the show in question did not show up in the other logs. Please, if you can help me understand WHAT else to look for in THOSE log files, I will be happy to upload them, especially now that I can upload multiple files without risking deleting something by accident.

Thanks for your reply and help; like I said, I will be happy to donate coin for assistance.

Frankly, changing the log file size or adding the ability to set it in the app is something I could code myself, if someone is willing to chat or talk on the phone about getting my system set up for development and understanding the process. I realize there is some documentation, but an overview first, to be sure I am not insane, would be helpful.

cheers!

So I went to tag more shows for download, and I noticed none of the shows I tagged last time seem to have been added.
So, like every time, I did a manual import, and here is what I see:

Sigh. As always, I have to manually do everything that used to work really well overall. Since adding Usenet as a source, Sonarr just stopped working properly. I have had two Usenet/Sonarr experts look at my setup, config, and supporting apps (Hydra2 etc.), and they verified my settings are the same as theirs. These are folks on a Discord channel that specializes in home- or dedicated-hosted Usenet setups. They are all stumped. I've spent hundreds of hours troubleshooting, trying different settings, and reviewing logs.

The incoming folder from my news client is doing what it is told, and downloaded the following:


I made a copy of the logs, again. The logs only ever show one day of use; in cases like this, it would be helpful to have the ability to go back further, or to have a GDG (mainframe speak) or a set number of days of history to troubleshoot with.

What should I do from here?
What do you want to see?
Thanks

So to recap:

Issues

  1. Not importing after a Usenet download completes.
  2. Imports rarely map correctly (due to obfuscation?).
  3. The same show gets downloaded multiple times, 2 to 30 times! (Note: I believe it kicks off the download and for some reason thinks it needs to retry with an alternate. Perhaps there is an option to turn off the retry on failure?) When this happens, I have witnessed it searching for the same show over and over again.
  4. More to come?

I'm waiting for a reply so I know what to do, what to provide, anything?

Going to play devil's advocate here: have you tried not using Hydra and just adding the indexers individually? Or even better, start with one indexer and see if that works. A crawl-before-you-walk, walk-before-you-run kind of approach. Get the very basics working and then grow from there.

Thanks for your thoughts!

It works fine 6 out of 7 days, so a setup problem isn't likely. Along those lines, I initially set up one indexer directly for two weeks without issues.

But again, this setup worked fine for several weeks initially. I did exactly that - walked before I ran.

Sonarr is sending the download request to SABnzbd, which downloads what it is told.

I haven't looked into Hydra2, which is all you are suggesting I change. How would it cause these issues? What should I look at?

What do the logs above say? Same as before, nothing?

Any other ideas to try? Would Reddit Sonarr support be seen by different Sonarr admins?

This not being your first thread, on this matter no less, you absolutely need to include all the requested information. We can't help you if you can't help yourself. That, coupled with the scattered thoughts, information, and everything else, makes this impossible to troubleshoot.

To answer some of your questions: we're not going to change the size of the log files, allow people to customize the size, or walk you through how to do that; it's unnecessary. We make several suggestions on where logs can be uploaded. For a single log file, it's usually easiest to use something like pastebin; for multiple log files, use something like Google Drive, Dropbox, or one of the dozens of other options.

Taking one example from your logs:

20-9-14 18:36:56.9|Debug|DownloadDecisionMaker|Processing release 'Bar.Rescue.S04E02.720p.HDTV.x264-2HD' from '!NzbHydra2'
20-9-14 18:36:56.9|Debug|Parser|Parsing string 'Bar.Rescue.S04E02.720p.HDTV.x264-2HD'
20-9-14 18:36:56.9|Debug|Parser|Episode Parsed. Bar Rescue - S04E02 
20-9-14 18:36:56.9|Debug|Parser|Language parsed: English
20-9-14 18:36:56.9|Debug|QualityParser|Trying to parse quality for Bar.Rescue.S04E02.720p.HDTV.x264-2HD
20-9-14 18:36:56.9|Debug|Parser|Quality parsed: HDTV-720p v1
20-9-14 18:36:56.9|Debug|Parser|Release Group parsed: 2HD
20-9-14 18:36:56.9|Debug|ParsingService|Using Scene to TVDB Mapping for: Bar Rescue - Scene: 4x02 - TVDB: 3x22
20-9-14 18:36:56.9|Debug|AcceptableSizeSpecification|Beginning size check for: Bar.Rescue.S04E02.720p.HDTV.x264-2HD
20-9-14 18:36:56.9|Debug|AcceptableSizeSpecification|Max size is unlimited, skipping size check
20-9-14 18:36:56.9|Debug|AcceptableSizeSpecification|Item: Bar.Rescue.S04E02.720p.HDTV.x264-2HD, meets size constraints
20-9-14 18:36:56.9|Debug|AlreadyImportedSpecification|Performing already imported check on report
20-9-14 18:36:56.9|Debug|AlreadyImportedSpecification|Skipping already imported check for episode without file
20-9-14 18:36:56.9|Debug|LanguageSpecification|Checking if report meets language requirements. English
20-9-14 18:36:56.9|Debug|MaximumSizeSpecification|Maximum size is not set.
20-9-14 18:36:56.9|Debug|MinimumAgeSpecification|Checking if report meets minimum age requirements. 3639816.6
20-9-14 18:36:56.9|Debug|MinimumAgeSpecification|Release is 3639816.6 minutes old, greater than minimum age of 60 minutes
20-9-14 18:36:57.0|Debug|QualityAllowedByProfileSpecification|Checking if report meets quality requirements. HDTV-720p v1
20-9-14 18:36:57.0|Debug|ReleaseRestrictionsSpecification|Checking if release meets restrictions: Bar.Rescue.S04E02.720p.HDTV.x264-2HD
20-9-14 18:36:57.0|Debug|ReleaseRestrictionsSpecification|[Bar.Rescue.S04E02.720p.HDTV.x264-2HD] No restrictions apply, allowing
20-9-14 18:36:57.0|Debug|RetentionSpecification|Checking if report meets retention requirements. 2527
20-9-14 18:36:57.0|Debug|SeriesSpecification|Checking if series matches searched series
20-9-14 18:36:57.0|Debug|DelaySpecification|Ignoring delay for user invoked search
20-9-14 18:36:57.0|Debug|HistorySpecification|Skipping history check during search
20-9-14 18:36:57.0|Debug|MonitoredEpisodeSpecification|Skipping monitored check during search
20-9-14 18:36:57.0|Debug|DeletedEpisodeFileSpecification|Skipping deleted episodefile check during search
20-9-14 18:36:57.0|Debug|DownloadDecisionMaker|Release accepted

This release is accepted because there isn't an existing file, nothing is in the queue, and all other checks passed. If Sonarr is not able to import files (your logs also show import failures), then it's possible Sonarr will try to grab the same release again at another time.

20-9-14 18:37:05.6|Warn|ProcessDownloadDecisions|Couldn't add report to download queue. Bar.Rescue.S04E02.720p.HDTV.x264-2HD

[v3.0.3.924] NzbDrone.Core.Exceptions.DownloadClientRejectedReleaseException: SABnzbd rejected the NZB for an unknown reason
   at NzbDrone.Core.Download.Clients.Sabnzbd.Sabnzbd.AddFromNzbFile(RemoteEpisode remoteEpisode, String filename, Byte[] fileContent) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\Clients\Sabnzbd\Sabnzbd.cs:line 46
   at NzbDrone.Core.Download.UsenetClientBase`1.Download(RemoteEpisode remoteEpisode) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\UsenetClientBase.cs:line 79
   at NzbDrone.Core.Download.DownloadService.DownloadReport(RemoteEpisode remoteEpisode) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\DownloadService.cs:line 95
   at NzbDrone.Core.Download.ProcessDownloadDecisions.ProcessDecisions(List`1 decisions) in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Download\ProcessDownloadDecisions.cs:line 77

20-9-14 18:37:05.7|Debug|IndexerFactory|Temporarily ignoring indexer !NzbHydra2 till 9/14/2020 6:42:05 PM due to recent failures.

Sonarr should not be disabling an indexer because the download client rejected the release; this is a bug we can fix.

  1. Your logs show:
     20-9-14 18:37:32.6|Error|DownloadedEpisodesImportService|Import failed, path does not exist or is not accessible by Sonarr: Z:\2Finished\Bar.Rescue.S04E01.HDTV.x264-SYS.2. Unable to find a volume mounted for the path. If you're using a mapped network drive see the FAQ for more info
     Is Z:\ a network drive? Have you read the FAQ entry on that? If not, why can't Sonarr see those folders? (There is a quick check sketched after this list.)
  2. Obfuscation shouldn't be a problem; we'll need to see logs where a file fails even though Sonarr can access its folder.
  3. Possibly related to the lack of importing; once the import issues are solved we can dig into that further.
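If you want a quick way to test the mapped-drive angle outside of Sonarr, a rough sketch like the one below (both paths are examples, substitute your own share) will show whether the drive letter is even visible. Run it under the same account the Sonarr service logs on with, because mapped drive letters are created per user session and a service often cannot see them even when the share and NTFS permissions are fine.

# quick_path_check.py - rough sketch: compare a mapped drive letter with a UNC path
# to the same share. Both paths are examples; the UNC host name is hypothetical.
import os

paths = [
    r"Z:\2Finished",              # the mapped drive letter Sonarr is being given
    r"\\PHYSICALHOST\2Finished",  # hypothetical UNC path to the same share
]

for p in paths:
    print(f"{p}: exists={os.path.isdir(p)}")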

At this point, disable all your indexers and work on the import issue; that'll cut down on the logs that are created and the noise in them. Copy all your log files somewhere and tell Sonarr to clear the logs; that way you'll still have them to refer back to, but won't look at old logs by mistake.

Also enable trace logging; we may not need it, but it's better to have it and not need it.

As for Reddit, you’ll definitely get a different group of people there (some are on both), but you’re going to need all the same information as you need here.

A single log file, especially trace logs, is too big to put into Pastebin or any other site mentioned, so every log file must be cut into two or three pieces manually. I provided four options to fix this; it sounds like you chose none of them. The last suggestion, the site I linked, would solve it for you (you're welcome); you just have to suggest it. The third option was that I offered to join you as a developer, donate my time, and possibly fix that code myself; that way, instead of "my ramblings", I could read the source code and understand how to troubleshoot my problems.

I think I'm part of only a small user base that wants to watch Discovery shows that aired from 2000 to 2016; it looks like most of the user base watches current TV shows, so RSS feeds handle things wonderfully.

The bug you highlighted is what my last thread was about; I described watching it ignore the source after a failed request. The request failed BECAUSE IT WAS A DUPLICATE. The Usenet client has the ability to refuse the file if it is detected to be a duplicate. This affected one indexer, and I could get banned. The first thread I posted a while ago detailed this.

I’ll turn down the noise, but I’m losing functionality/ability to get shows.

I will look into why it can't access the file. I have read the FAQ; I made suggestions to the Sonarr admins years ago to modify the docs to help users troubleshoot further. I have extensive experience troubleshooting/designing/automating.

To answer your other questions: Z: is a mapped drive to the physical host. Sonarr is running in a VM on that physical host, which is where Plex lives. It starts as a service, and the service is set to a user account with change access through the share and through the NTFS permissions. BTW, I can manually import these files, which means it is not a permissions/account issue; otherwise, nothing would import. I'll investigate whether it is a file lock issue and which process owner is locking it.

I’ll add logs for the obfuscation fail. Thanks for the reminder.

I am truly sorry for rambling; it was my honest attempt to provide information that might help troubleshoot, or help my understanding of how Sonarr works and is written so I can troubleshoot these things myself. My previous posts had context for the problem, or were troubleshooting steps completed. My hope was this would help you narrow the scope, and your reply had some great leads and suggestions. And yes, I'm sorry I didn't include the template; I'm usually good at that. Lesson learned, ok?

Ok, I am finally starting to feel better.
Everything is a mess, so I am just starting slow.

My download folder had hundreds of entries, some with empty folders, some with 'xtras' still in the folder. I created a script to clean up the extra files (subs, sfv, nfo, idx, sample files, and srr files) and a script to find and remove empty directories; a rough sketch of the kind of thing I mean is below. This left several dozen that failed to import because the filename was obfuscated; 100% of them. If you can fix this bug, then I think this will fix my problem.
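For what it's worth, here is a rough sketch of the kind of cleanup I mean (the folder path and extension list are just my examples, and since it deletes things, leave DRY_RUN on until the printed output looks right):

# cleanup_extras.py - rough sketch: remove leftover "extra" files and then empty folders
# from the download directory. Path and extension list are examples only.
from pathlib import Path

DOWNLOAD_DIR = Path(r"Z:\2Finished")   # example path
EXTRA_EXTS = {".srt", ".sub", ".sfv", ".nfo", ".idx", ".srr"}
DRY_RUN = True                         # set to False only after reviewing the output

# Remove extras and anything that looks like a sample file.
for f in DOWNLOAD_DIR.rglob("*"):
    if f.is_file() and (f.suffix.lower() in EXTRA_EXTS or "sample" in f.stem.lower()):
        print(f"remove {f}")
        if not DRY_RUN:
            f.unlink()

# Remove empty directories, deepest first so parents empty out as we go.
for d in sorted((p for p in DOWNLOAD_DIR.rglob("*") if p.is_dir()),
                key=lambda p: len(p.parts), reverse=True):
    if not any(d.iterdir()):
        print(f"rmdir {d}")
        if not DRY_RUN:
            d.rmdir()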

For anyone else who runs into obfuscated files or extra files from SABnzbd, here's a simple fix:
I found that in SABnzbd, the document I followed to configure it failed to mention that the categories all needed to be set to +Delete instead of the default. Further, in this app you can only make one change at a time and then hit save; on that page, if you change everything and click the save button next to one of them, it will only save the line you were on (strange to me, but I digress).

I am not going to turn trace logging on, as I think we are most of the way there, because it slows processing to a crawl and makes it unusable even on my high-end system.

I will report back… stay tuned.

edited: spelling.

You posted this earlier: “This release is accepted because there isn't an existing file, nothing is in the queue, and all other checks passed. If Sonarr is not able to import files (your logs also show import failures), then it's possible Sonarr will try to grab the same release again at another time.”

There is a logic problem here. There was a failed import that eventually caused it to find another download due to the failure. In my case, fixing the client-rejection failure should keep it from happening, but shouldn't a retry not proceed until an import failure is handled by the user? For a true failure to download, I agree this makes sense.

Can we also fix importing in Sonarr so that when an obfuscated file is found and the containing folder has the season/episode information, it is allowed to import automatically? Also, in Sonarr, when I click on a download that is ready to import, it gives an erroneous error:

There IS a video file; it is just named incorrectly or unexpectedly. I can't tell you how many thousands of files I have manually renamed to make this work. If the folder it is in HAS the season/episode information, you should be able to import automatically; if not, can't you show a list of all the video files and let me select one? That would be a HUGE improvement, as I could finally fix everything IN Sonarr instead of having to go to the logs, pull up a file manager, and try to trace things. Imagine how many fewer posts users would have here trying to figure this out. Ok, maybe just me, since I seem to be the only one who ever ran into any of these issues (slight sarcasm).
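To illustrate the folder-name idea, here is a rough sketch of what I have been doing by hand (not how Sonarr works internally; the path, extensions, and pattern are just examples, and it renames files, so dry-run it first):

# rename_from_folder.py - rough sketch: when the folder name carries SxxEyy info but the
# video inside is obfuscated, rename the video after its folder so it can be matched.
import re
from pathlib import Path

DOWNLOAD_DIR = Path(r"Z:\2Finished")             # example path
VIDEO_EXTS = {".mkv", ".mp4", ".avi"}
EPISODE_RE = re.compile(r"[sS]\d{2}[eE]\d{2}")   # e.g. S04E02
DRY_RUN = True

for folder in (d for d in DOWNLOAD_DIR.iterdir() if d.is_dir()):
    if not EPISODE_RE.search(folder.name):
        continue                                  # folder name has no usable episode info
    videos = [f for f in folder.iterdir() if f.suffix.lower() in VIDEO_EXTS]
    if len(videos) != 1:
        continue                                  # ambiguous, leave it for manual handling
    target = videos[0].with_name(folder.name + videos[0].suffix)
    if videos[0].name != target.name:
        print(f"{videos[0]} -> {target}")
        if not DRY_RUN:
            videos[0].rename(target)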

By the way, why would this filename fail to import: galactica.80.s01e01.1080p.bluray.x264-psychd?
Is it the lowercase letters?

You can upload raw files to pretty much any cloud provider and link people to them; both options I mentioned allow that, plus OneDrive and Mega.

A failed import will not lead to it finding another release, unless the download is no longer in SAB’s history (Sonarr only checks a limited number of them).

No, that means Sonarr doesn't see the file at all; it doesn't care how it's named. At worst, if it's named in a way it can't process, you'll get an error to that effect. In this case Sonarr cannot see the file.

It already does this. Your problem is that Sonarr can't find the file.

Probably the random 80 in the file name, but check the logs.
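To illustrate the kind of mismatch that can cause it (a toy example only, this is not Sonarr's actual parser): the title part of that release name reads as "galactica 80", and if that doesn't map to a series title in your library (for example "Galactica 1980"), the result would be an unknown series; case doesn't matter.

# toy_parse.py - toy illustration only (NOT Sonarr's real parser): split a release name
# into title + SxxEyy and do a naive library lookup to show why the match can fail.
import re

def toy_parse(release):
    m = re.match(r"(?P<title>.+?)[. ][sS](?P<season>\d{2})[eE](?P<episode>\d{2})", release)
    if not m:
        return None
    title = m.group("title").replace(".", " ").strip()
    return title, int(m.group("season")), int(m.group("episode"))

library = ["Galactica 1980", "Bar Rescue"]                        # example library titles
parsed = toy_parse("galactica.80.s01e01.1080p.bluray.x264-psychd")
print(parsed)                                                      # ('galactica 80', 1, 1)
matches = [s for s in library if s.lower() == parsed[0].lower()]
print(matches or "unknown series")                                 # no exact match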

The logs just say unknown series.

So far things are better since I changed SAB to +Delete, but I will know more in a week.

I apologize; I need to learn how to quote properly on this forum. In this post I tried clicking REPLY on your quotes to me, but it doesn't populate anything in the entry box like it did above. Further, clicking quote does nothing when there isn't anything to quote… I have so much to learn still, so thanks for your patience.

One question you can comment on is above; here are a few more:

  1. Is this what I would expect to see when importing a season at once? (I feel like I need to uncheck the episodes not contained in the file on the left, but the input file on the left gives me no idea which episode it should be, since it says all seasons.)

  2. I have a ton of errors with access. I verified again that the service account is set with a user ID that has full permissions on the folder, the files, and the share. I get no errors about this in the Sonarr GUI; shouldn't I?

  3. I edited the root mappings; there is now one for each directory used (input/output).
     Screenshot_20200930-234705_AnyDesk

More to come if no reply soon :stuck_out_tongue: I’ve spent twelve hours on this so far today

  1. Season packs cannot have obfuscated files; Sonarr has no way to decipher which file is which in that case. Your indexer should not be creating full NZBs for those with no way to reverse the obfuscation.

  2. Errors where?

  3. Root folders are where Sonarr moves files to (well, the series folders inside them); don't create root folders for the location where your download client puts things.