Delay profile usage confusion

Since I’m missing a couple of old episodes for different TV series in my collection and Usenet doesn’t really have them anymore, I decided to add torrents into the mix. Delay profiles are new to me and I’m only basing my understanding on the articles I’ve read about them.

So I set the default profile to prefer usenet, with a 120-minute delay for usenet and 180 minutes for torrent, as in Example 2 in your article. Then I hit search all missing in the wanted section. After doing this, Sonarr started sending a lot of downloads to Deluge and some to NZBGet. So what happened here, why didn’t the delays work? I thought that every time a release is found (regardless of whether it’s an upgrade or anything else), the countdown of the timers starts before they get downloaded? Sorry if I’m missing anything here.

Delays are for new content; anything older than 180 minutes would be fair game for either client. It’s the age of the release, not the age since Sonarr first saw it. In addition, any search triggered directly by the user (like in your case) bypasses the delay.

Preferred protocol only takes effect if the qualities are the same; a 1080p torrent would be grabbed over a 720p usenet release (assuming both are wanted).
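To make that concrete, here is a minimal sketch of how those rules play out with the Example 2 settings (120 minutes for usenet, 180 for torrents). This is not Sonarr’s actual code; the names and structure are illustrative, and the exact timing when the two delays differ is simplified.

```python
# Illustrative sketch only -- not Sonarr's actual implementation.
# Assumed settings: "Prefer Usenet", usenet delay 120 min, torrent delay 180 min.
from dataclasses import dataclass
from datetime import datetime, timedelta

DELAYS = {"usenet": timedelta(minutes=120), "torrent": timedelta(minutes=180)}
PREFERRED_PROTOCOL = "usenet"

@dataclass
class Release:
    title: str
    protocol: str        # "usenet" or "torrent"
    quality: int         # higher is better, e.g. 720 < 1080
    published: datetime  # when the release was posted, not when Sonarr saw it

def pick(releases, now, user_search=False):
    # Quality is compared first; the preferred protocol only breaks ties.
    ranked = sorted(releases,
                    key=lambda r: (r.quality, r.protocol == PREFERRED_PROTOCOL),
                    reverse=True)
    for release in ranked:
        age = now - release.published
        # A user-triggered search bypasses the delay; otherwise the release must
        # be older than its protocol's delay before it can be grabbed.
        if user_search or age >= DELAYS[release.protocol]:
            return release
    return None  # everything is still inside its delay window
```

The key points it encodes are the ones from the replies above: quality is compared first, the preferred protocol only breaks ties, delays are measured against the release’s own age, and a manual search skips the delay entirely.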

I see. Does that mean that if a release is older than 180 minutes and Sonarr picks up the release late, then the delay profile will not have an effect?

So in my current settings, what will happen if a 1080p torrent and a 720p usenet release are released at the same time? Will it wait for 180 minutes to grab the torrent?

And would example 2 in this article have the same “prefer usenet” effect as example 3 with the additional feature of not getting lower quality releases when multiple qualities are released?

IIRC it’ll grab the torrent after 120 minutes if there isn’t a 1080p usenet release, since the usenet delay is up at that point, but I could be wrong.

Yes, that all works the same.

Is there any documentation of this behavior other than the article I linked to?

What exactly do you mean by “delays are for new content; anything older than 180 minutes would be fair game for either client. It’s the age of the release, not the age since Sonarr first saw it.”?

No, that’s the documentation.

Not sure how to expand on that. If you search for something by pressing search in Sonarr you’re telling Sonarr to find something and the delay is ignored (even if it’s 5 minutes old); the preferred protocol still applies (as it always does).

If Sonarr was unable to connect to your indexer for 6 hours and then suddenly could and performed an RSS sync, it’d grab the best quality release available from all indexers, because the delay timer starts from the oldest posted time, not the time Sonarr first saw it.
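As a made-up timeline of that outage scenario (the times and values below are illustrative only):

```python
# Made-up numbers to illustrate "age of release, not age since Sonarr saw it".
from datetime import datetime, timedelta

posted = datetime(2019, 7, 4, 10, 0)     # release hit the indexer at 10:00
rss_sync = posted + timedelta(hours=6)   # Sonarr was offline and only sees it at 16:00
usenet_delay = timedelta(minutes=120)

age_of_release = rss_sync - posted       # 6 hours, counted from the posted time
print(age_of_release >= usenet_delay)    # True -> grabbed on that very sync, no extra wait
```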

Ok, so technically the delays only take effect for new releases through RSS feeds, correct?

I have another problem that just happened now when Stranger Things S03 was released. I’m not sure what happened with E02 and E03 as shown:

debug logs: https://www.dropbox.com/s/tbnialflpakkw4n/debug%20logs.zip?dl=0

So at 4:39 PM a lower quality torrent was grabbed (because there was no NZB yet) and then I don’t understand what happened after that. I’m guessing my post-process script was converting the 720p torrent release while the 1080p NZB finished and started post-processing as well. But the end result was the 720p mp4 from the torrent. So it looks like the 720p torrent mkv was not deleted successfully and was not upgraded to the 1080p release. Why is that?

While trying to solve this issue, I decided to delete the two episodes and initiate another search. The highest quality releases were grabbed and sent to my NZB Client as expected. My NZB client is downloading them now but I don’t see anything in Sonarr’s queue. In the history tab of the Activity section, I do see the grab events.

Another set of debug logs:

If it’s seeding Sonarr won’t touch the torrent files.

Then Sonarr hasn’t gotten updated information from the download client; this will happen if a long-running import or synchronous Custom Script is keeping the Check For Completed Downloads task from finishing. That’d also prevent Sonarr from processing the 1080p releases that it grabbed.
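One way around that (a hedged sketch, not an official recommendation; the environment variable names and the ffmpeg call are assumptions about this particular setup) is to have the custom script hand the slow mkv-to-mp4 conversion off to a detached process so it returns to Sonarr immediately:

```python
#!/usr/bin/env python3
# Sketch of a custom script that returns to Sonarr right away and runs the slow
# conversion in a detached process, so the synchronous script no longer blocks
# the Check For Completed Downloads task. Env var names are assumptions.
import os
import subprocess
import sys

def main():
    if os.environ.get("sonarr_eventtype") != "Download":
        return 0                                   # ignore test/other events
    source = os.environ.get("sonarr_episodefile_path")
    if not source:
        return 0
    target = os.path.splitext(source)[0] + ".mp4"
    # Detach the conversion so this script exits immediately; the ffmpeg flags
    # are placeholders for whatever the real post-process step does.
    subprocess.Popen(
        ["ffmpeg", "-i", source, "-codec", "copy", target],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
        start_new_session=True,
    )
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Whether detaching the conversion actually fits the rest of the workflow is a separate question; the point is only that a script that returns quickly no longer holds up the Check For Completed Downloads task.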

The 720p mkv delete event there points to the mkv location of my TV shows, not my torrent completed folder. The files that are seeding are in the torrent completed folder. Is this because I have hardlinking enabled (which I should) for torrents? Regardless of that setting though, Sonarr should be able to delete the hardlinked file, shouldn’t it? Does Sonarr show those delete events regardless of whether they’re successful or not?

As for the activity queue not being updated and 1080p releases not being processed, will they eventually get processed when that custom script is done? Does Sonarr put everything in a queue so that they get processed sequentially instead of being processed in parallel?

No it shouldn’t, unless it’s locked, but then it’d fail to delete.

No, only if it is deleted.

As long as they still exist in the download client.

Not a queue, but they are processed sequentially.

So do you know what really happened in my case here? Sonarr shows it deleted the torrent file but never got updated with the 1080p release.

What do you mean by “as long as they’re in the download client”? What if the custom script is not yet done processing one release while the download client finishes downloading other releases? What happens then?

What reason does Sonarr say the file was deleted? For an upgrade, because the file was missing from disk or something else?

What else could it mean? It’s in the download client’s queue/history, save for some edge cases where lots of history in SAB will cause Sonarr not to see it, but that’d take 30 completed/failed downloads in Sonarr’s category.

It should work just fine, keeping in mind the issue with history + SAB I mentioned above.

[quote=“markus101, post:13, topic:22683, full:true”]
What reason does Sonarr say the file was deleted? For an upgrade, because the file was missing from disk or something else?
[/quote]

I checked sonarr.debug.17.txt from the first debug_logs.zip I sent you above and saw these relevant logs for the 4:39PM torrent release of S03E02:

19-7-4 17:26:29.4|Debug|EpisodeFileMovingService|Hardlinking episode file: /downloads_complete/sonarr/Stranger.Things.S03E02.720p.WEBRip.X264-METCON[eztv].mkv to /tv/Stranger Things/Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv
19-7-4 17:26:29.4|Debug|DiskTransferService|HardLinkOrCopy [/downloads_complete/sonarr/Stranger.Things.S03E02.720p.WEBRip.X264-METCON[eztv].mkv] > [/tv/Stranger Things/Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv]
19-7-4 17:26:29.4|Debug|DiskProvider|Hardlink '/downloads_complete/sonarr/Stranger.Things.S03E02.720p.WEBRip.X264-METCON[eztv].mkv' to '/tv/Stranger Things/Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv' failed.

[v2.0.0.5322] Mono.Unix.UnixIOException: Invalid cross-device link [EXDEV].
  at Mono.Unix.UnixMarshal.ThrowExceptionForLastError () [0x00005] in <82102001eb844bc689cce46fc9c0d1f5>:0 
  at Mono.Unix.UnixMarshal.ThrowExceptionForLastErrorIf (System.Int32 retval) [0x00004] in <82102001eb844bc689cce46fc9c0d1f5>:0 
  at Mono.Unix.UnixFileSystemInfo.CreateLink (System.String path) [0x0000c] in <82102001eb844bc689cce46fc9c0d1f5>:0 
  at NzbDrone.Mono.Disk.DiskProvider.TryCreateHardLink (System.String source, System.String destination) [0x00013] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Mono\Disk\DiskProvider.cs:182 

19-7-4 17:26:40.0|Debug|DiskProvider|Setting permissions: 0664 on /tv/Stranger Things/Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv
19-7-4 17:26:40.0|Debug|EpisodeService|Linking [Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv] > [[3196]The Mallrats]
19-7-4 17:29:48.9|Debug|MediaFileTableCleanupService|File [/tv/Stranger Things/Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv] no longer exists on disk, removing from db
19-7-4 17:29:48.9|Debug|EpisodeService|Detaching episode 3196 from file.
19-7-4 17:29:49.0|Debug|SubtitleFileService|Deleting Extra from database for episode file: [882] Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv
19-7-4 17:29:49.0|Debug|OtherExtraFileService|Deleting Extra from database for episode file: [882] Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv
19-7-4 17:29:49.0|Debug|MetadataFileService|Deleting Extra from database for episode file: [882] Season 03/Stranger Things - S03E02 - The Mallrats [WEBDL-720p - METCON].mkv

So it looks like the 4:39PM torrent release got downloaded, post-processed by my custom script (converted from mkv to mp4), and the mkv got deleted properly. But why is there an error there regarding hardlinking?

I also checked, in several log files, that the 5:15 PM 1080p NZB release got sent properly to NZBGet (I’m not using SAB, FYI), but it looks like the logs show it kept looping? I see lots of these:

19-7-4 18:42:16.0|Debug|DownloadDecisionMaker|Processing release 'Stranger.Things.S03E03.Chapter.Three.The.Case.of.the.Missing.Lifeguard.1080p.NF.WEB-DL.DDP5.1.x264-NTG-Scrambled' from 'NZBgeek'

I don’t think it ever finished and I don’t understand why. Would Sonarr need the original mkv file for it to be able to update the release?

So as long as the releases are in the download client’s history, Sonarr will process them sequentially? Is it better to let Sonarr handle all of the download client’s history to avoid any conflicts?

Because you can’t hardlink across different devices (volumes), it looks like.

Huh? If it ever grabbed it, processed anything else, or completed the RSS Sync then it finished; that’s just telling you it’s parsing/figuring out what to do with that release.

It’ll fail to delete a file that doesn’t exist, but that shouldn’t be a long-term issue; the logs would show that it failed to import because it was missing.

As long as it’s in the limited history Sonarr knows about; for SAB/NZBGet that’s the last 30 items in the category. If you’re grabbing more than 30 releases between Sonarr processing them, you’ll want to remove completed downloads (option in Sonarr) so things that are processed are removed and Sonarr doesn’t lose track of things.

Not sure what you mean, Sonarr should only process releases in its category.
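On the 30-item history window mentioned a couple of replies up, here is a rough way to check how many items are currently sitting in the Sonarr category of NZBGet’s history over its JSON-RPC API. This is a sketch only; the URL, credentials and the “Category” field name are assumptions to adjust for your install and NZBGet version.

```python
# Hedged sketch: count finished items in the "sonarr" category of NZBGet's history.
import base64
import json
import urllib.request

NZBGET_URL = "http://localhost:6789/jsonrpc"  # adjust host/port for your install
USERNAME, PASSWORD = "nzbget", "tegbzn6789"   # NZBGet's default control credentials
CATEGORY = "sonarr"

payload = json.dumps({"method": "history", "params": [False], "id": 1}).encode()
request = urllib.request.Request(
    NZBGET_URL,
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Basic "
        + base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode(),
    },
)
with urllib.request.urlopen(request) as response:
    entries = json.load(response)["result"]

in_category = [e for e in entries if e.get("Category", "").lower() == CATEGORY]
print(f"{len(in_category)} history items in the '{CATEGORY}' category "
      "(Sonarr only looks at a limited window of these)")
```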

Does that mean that if it fails to hardlink it copies the file instead, so that it’ll push through with the rest of the post-processing?

I know, but what I’m saying is that that log line is the last event I see for that specific release. I didn’t see anything saying whether it post-processed that file successfully or not (unlike the other releases). Remember, I ended up with a 720p torrent mp4 file after all these releases, even though the History says it downloaded the 1080p release. I’m just trying to understand what really happened in the background that could’ve caused the issue.

Right. So this isn’t really what caused the whole issue.

Ohh ok, I didn’t know that Sonarr only tracked the last 30 items of the download client’s history. Maybe you could put a note in the GUI about that or something? Just a thought.

What I mean is: what if SAB/NZBGet removes the item from its history right after download completion? Sonarr would then not know about it. So I just thought it’s better to let Sonarr do all the completed download handling instead of adding the download client’s own processes into the mix. Does that make sense?

Yes.

Right, if you did that Sonarr would lose track of it and never import it.
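Putting those two answers together (hardlinks can’t cross volumes, and Sonarr falls back to a copy when the link fails), the EXDEV error from the log boils down to roughly this. A minimal illustrative sketch, not Sonarr’s actual DiskTransferService:

```python
# Illustrative only: hardlink-or-copy behaviour for paths on different filesystems.
import errno
import os
import shutil

def hardlink_or_copy(source: str, destination: str) -> str:
    try:
        os.link(source, destination)       # hardlinks only work within one filesystem/volume
        return "hardlinked"
    except OSError as exc:
        if exc.errno != errno.EXDEV:       # EXDEV = "Invalid cross-device link" from the log
            raise
        shutil.copy2(source, destination)  # fall back to a plain copy across devices
        return "copied"
```

In this setup /downloads_complete and /tv are evidently on different filesystems, which is why the hardlink attempt fails and the copy path is taken instead.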

So I was trying to download all of season 1 of The Boys just now and, as I said earlier, I have my delay profile set to this:

[image: delay profile settings]

For the first two episodes, the NZBs were downloaded but then they were not healthy. Sonarr’s next tries were torrents, though, even though there were still NZB releases available for it to try. Why is this? Why won’t Sonarr try all NZB releases first for the highest quality available before resorting to torrents?

How do you know there were NZBs available? Did you look at the debug logs to see why Sonarr grabbed that release?

I’m not going to guess at why, look at the debug logs, they will tell you everything you need to know.

Either they were rejected because they had failed (NZBs on a different indexer don’t mean Sonarr will try each one, as they are often the same release) or they didn’t exist; debug logs would tell you.

If you have specific debug logs you have looked at and can’t tell why Sonarr chose one release over another I can take a look, but I’m not going to guess at what a problem may be without looking at logs and I’m not going to pore over multiple files of logs deciphering them. Every release processed by Sonarr will indicate whether it was accepted or rejected (and why it was rejected) in the debug logs.

TLDR: debug logs will tell you why.

I just did a manual check on that specific episode and saw a lot of nzb releases that were not downloaded.

I understand that the debug logs will show everything, but the point of my post (which I should have worded better) was to understand how Sonarr “prefers” Usenet over torrents when Prefer Usenet is chosen. I didn’t mean to start an investigation (even though I asked “why”, I know) into what caused this, so no debug logs needed.

I only have 1 NZB indexer and multiple torrent indexers. In my case, for each NZB release shown in the manual search tab of an episode, those are all different releases, correct? There were multiple 1080p (the highest quality in the profile assigned to that show) NZB releases at that time. If the 1080p torrent release has a higher quality than the other 1080p nzb releases, will Sonarr prefer the torrent?

Short answer: it depends.
Long answer: it depends on size, time and release name. Releases with similar sizes posted at approximately the same time on different indexers are treated as being the same release. If a release failed and was blacklisted, the search results would indicate that results from other indexers were rejected because they were blacklisted.
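As an illustration of that “treated as the same release” heuristic (the thresholds below are invented for the example, not Sonarr’s actual values):

```python
# Illustrative only -- a rough version of the "same release" heuristic described
# above: similar size, posted around the same time, possibly on different indexers.
from datetime import timedelta

SIZE_TOLERANCE = 0.02                 # within ~2% of each other (assumed threshold)
TIME_TOLERANCE = timedelta(hours=2)   # posted within a couple of hours (assumed)

def looks_like_same_release(a, b):
    """a and b are dicts with 'size' (bytes), 'published' (datetime), 'title'."""
    close_in_size = abs(a["size"] - b["size"]) <= SIZE_TOLERANCE * max(a["size"], b["size"])
    close_in_time = abs(a["published"] - b["published"]) <= TIME_TOLERANCE
    same_name = a["title"].lower() == b["title"].lower()
    return same_name or (close_in_size and close_in_time)
```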

Higher quality because it’s from a different source (WEB* vs HDTV)? If so, quality wins; it always wins.