Sonarr sending corrupt NZB file to NZBGet

Recently moved from Sickrage to Sonarr. Transition was reasonably quick and painless. Kudos to the Sonarr team. The user interface is exceptional. No issues whatsoever for the first week while exclusively testing Usenet indexing and downloads. Blown away by the thoughtful workflow implemented in this software.

This second week I’ve tried torrent indexing via Jackett for back-fills. With torrent indexing enabled, Sonarr is sending garbage NZB files to NZBGet. One should have nothing to do with the other, but here we are. The garbage files consistently start with bencoded data followed by other binary data. Purely a guess, but it looks like a buffer caching issue somewhere in the mono / Sonarr stack.

I’m assuming Sonarr passes a corrupt NZB file to NZBGet, because it seems unlikely that NZBGet would spontaneously create these junk NZB files with bencoded data only after I enabled Jackett indexing in Sonarr. I may be pointing fingers in the wrong place.
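One quick way to see what NZBGet is actually receiving is to inspect the file header: a valid NZB is XML, while bencoded torrent data begins with a bencode dictionary marker (`d`, typically `d8:announce`). The following is just a sketch for illustration; `classify_download` and its heuristics are my own, not part of Sonarr or NZBGet:

```python
def classify_download(data: bytes) -> str:
    """Guess whether raw bytes look like an NZB (XML) or bencoded torrent data."""
    head = data.lstrip()[:16]
    # NZB files are XML, usually starting with an XML declaration or <nzb> root.
    if head.startswith(b"<?xml") or head.startswith(b"<nzb"):
        return "nzb"
    # Bencode dictionaries begin with 'd' followed by length-prefixed keys,
    # e.g. a .torrent file starts with "d8:announce".
    if head.startswith(b"d") and b":" in head:
        return "bencode"
    return "unknown"
```

Running this over the files Sonarr hands to NZBGet would show immediately whether the "corrupt NZBs" are actually torrent payloads being mislabeled.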

Removing the Jackett indexer in Sonarr appears to correct the behavior and Sonarr / NZBGet are working as expected once again. Anyone have experience with this? I’d really like to use torrents for daily back-fill. I may consider enhancing Jackett so that it peers within the torrent for a file match instead of just searching the torrent title.

Environment
Linux Host
4.0.5-gentoo #1 SMP Wed Jun 17 22:13:42 EDT 2015 x86_64 Intel® Core™2 Quad CPU @ 2.40GHz GenuineIntel GNU/Linux

Sonarr
Version: 2.0.0.3357
Mono Version: 4.0.3 (Stable 4.0.3.20/d6946b4 Sat Oct 17 21:13:25 EDT 2015)
AppData directory: /sonarr/.config/NzbDrone
Startup directory: /usr/share/NzbDrone

NZBGet
Currently installed: 16.0

Did the corrupted NZBs all come from the same indexer?

Adding a torrent indexer via Jackett wouldn’t cause NZBs to become corrupted; they are completely separate things. It sounds like a coincidence that disabling Jackett solved the NZB issues.

Actually, I think he added Jackett as a Newznab indexer instead of a Torznab indexer.


Thanks for taking the time to help me.

The corrupt NZB files were marked as originating from dog (mostly) and some from usenet-crawler. None were tagged as jackett. There were typically two failures followed by a successful NZB transfer into the NZBGet queue. Sonarr tracks each attempt as an activity.

NZBGet would log two failures followed by a success. The NZBGet folder saves each file with a .queued extension. This may be a benign detail, though I expected failed NZBs to have an .error extension. Maybe that only applies after an NZB is successfully queued and then fails.

Not sure about NZBGet’s handling of that; it’s definitely odd that you saw those errors, but Jackett shouldn’t have any effect.

Yup. I think you are right … very likely finger problems are the root cause. I’ll leave the current usenet-only configuration alone for a bit and then try configuring Torznab again (or, more likely, for the first time :wink:) once the environment has hardened more.

I’m thrilled that Sonarr has the framework and configuration options that should allow me to use torrents for back-fill.
