Check For Finished Download task stuck when file is post-processing

Sonarr version (exact version): 2.0.0.5322
Mono version (if Sonarr is not running on Windows): 5.20.1.19
OS: Linux Docker: https://github.com/mdhiggins/sonarr-sma
Debug logs: https://www.dropbox.com/s/3k64m9cxmffrtij/debug_logs.zip?dl=0
Description of issue:

I use the post-processing script (SMA) with Sonarr. When a media file takes a long time to post-process (1 hour+ because of the ffmpeg conversion), the Check For Finished Download task appears to get stuck and stops updating the Activity Queue, although the Activity History tab still updates. Here’s an example:

[screenshot: Activity Queue not updating]

As you can see in this picture, the Queue is not being updated even though everything is done in NZBGet, as shown here:

[screenshot: NZBGet showing the downloads completed]

After the conversion, the Activity Queue immediately clears out and starts looking for alternate NZBs for failed downloads. Is this a bug or something?

That’s because your script is blocking Check for Finished Downloads from executing. That task is responsible for updating items in the queue; history is not tied to it.

Your custom script should fork to a new process and return early if you don’t want Sonarr to wait for it to finish.
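
In rough terms (a minimal sketch; the path and log location below are placeholders, not your actual setup), the wrapper Sonarr calls could be as simple as:

#!/bin/sh
# Launch the real work as a separate background process and exit
# immediately, so Sonarr gets control back right away.
/path/to/your-real-post-processing-script "$@" &
exit 0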

But if the script forks to a new process, would things still be done sequentially (one after the other)?

Also, there’s a chance I might run into this issue again since, with my setup, the conversion happens inside Sonarr’s post-processing. Has Sonarr’s behavior changed since I posted that issue in order to address this?

You’d need to manage the order at that point as Sonarr would import then trigger the script.

That issue can happen regardless; importing completed downloads is not linked to refreshing the series and rescanning the series folder. The recommendation in that thread to avoid it is still applicable.

What do you mean, I’d need to manage the order if the script forks to a new process? I don’t understand what you’re trying to point out there. If the script forks to a new process, will Sonarr process multiple jobs sequentially or in parallel? By sequentially, I mean one after the other, not in any particular order.

Yeah, my point is that because the conversion happens in Sonarr’s post-processing script, I might run into this issue again. But so far, I have not. Sonarr imports the completed download as an mkv, then the post-processing script converts it to mp4, yet the episode is still monitored. Out of curiosity though, why isn’t it happening to me anymore?

As soon as the script completes (which it’d do immediately if you spawned a new process and exited the script Sonarr called), Sonarr would proceed to the next item, so it’d be parallel.

If you’re changing the extension (which would cause Sonarr to lose track of the file), I’d expect it to think the episode is missing, as it should detect that the original file was deleted before importing the next file (episode history should show a delete because the file was removed from disk, not a delete for an upgrade).

Beyond that, it comes down to luck that a refresh doesn’t run while the conversion is taking place.

Ok, that’s what I would expect too. I’m not sure how the script will behave in parallel though, so I’d have to ask the developer.

As for the extension changing, what happens if Sonarr refreshes after the conversion takes place? It would see a delete event for the original mkv file, yes, but will it also re-import the new mp4 file? If it does, it will automatically unmonitor that specific episode as long as I have that option enabled, correct? Here’s an example:

So for that delete event, I see this reason:

Sonarr was unable to find the file on disk so it was removed

That tells me that Sonarr already did a refresh and did not find that mkv file (because it was already converted to mp4), yet I still see the episode monitored. Why?

I’d need to see the debug logs that cover that series being refreshed to tell. I’d expect it to be unmonitored given it was “deleted” (which is why I suggested converting before Sonarr imports the file).

If I enable debug logging, will it show past events like this one? I guess not.

No, the log level doesn’t apply retroactively (otherwise you could just go look on disk).

Here are the debug logs:

The releases you’re looking for are:

Your script is telling Sonarr to re-monitor the episode after the refresh.

Your script does the following (see the sketch after the list):

  1. Script converts the file
  2. Deletes the source file
  3. Tells Sonarr to rescan
  4. Waits for the rescan
  5. Tells Sonarr to re-monitor the episode
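
For reference, steps 3–5 roughly map onto Sonarr’s API. A sketch with curl and jq, where the host, API key, series id and episode id are placeholders and the actual SMA script may well do this differently:

#!/bin/sh
# Placeholder values; adjust to your own setup.
HOST="http://localhost:8989"
APIKEY="YOUR_API_KEY"

# Step 3: ask Sonarr to rescan the series (seriesId 123 is made up).
CMD_ID=$(curl -s -X POST "$HOST/api/command" \
    -H "X-Api-Key: $APIKEY" -H "Content-Type: application/json" \
    -d '{"name": "RescanSeries", "seriesId": 123}' | jq -r '.id')

# Step 4: wait until that command reports completed.
until curl -s "$HOST/api/command/$CMD_ID" -H "X-Api-Key: $APIKEY" \
    | jq -e '.state == "completed"' > /dev/null; do
    sleep 5
done

# Step 5: fetch the episode (id 456 is made up), set monitored back to true, and PUT it back.
curl -s "$HOST/api/episode/456" -H "X-Api-Key: $APIKEY" \
    | jq '.monitored = true' \
    | curl -s -X PUT "$HOST/api/episode" -H "X-Api-Key: $APIKEY" \
        -H "Content-Type: application/json" -d @-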

Nice. That’s smart then. Thanks for confirming.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.

Sonarr version (exact version): 2.0.0.5322
Mono version (if Sonarr is not running on Windows): 5.20.1.19
OS: Linux Docker: https://github.com/mdhiggins/sonarr-sma
Description of issue:

I would like to continue this archived issue. The sonarr-sma developer (mdhiggins) and I are trying to make the script run in the background to solve this stuck-task issue. Our discussion about this starts from this post, and basically we tried wrapping the post-processing script in something like this:

#!/bin/sh
/usr/local/bin/sma/env/bin/python3 /usr/local/bin/sma/sickbeard_mp4_automator/postSonarr.py "$@" &

But it really didn’t work. Do you think any of the solutions here would work, and do they make sense? I still have yet to try them, but I figured you may have a suggestion or two.
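
One guess on our side, purely an assumption we have not verified: even with the trailing &, the backgrounded python process still inherits the stdout/stderr pipes Sonarr opened for the wrapper, so Sonarr may keep waiting for those pipes to close rather than for the wrapper itself to exit. A variant we still have to try detaches the streams as well, roughly:

#!/bin/sh
# Untested guess: redirect every stream inherited from Sonarr so nothing
# keeps Sonarr's pipes open after this wrapper exits. The log path is only an example.
nohup /usr/local/bin/sma/env/bin/python3 \
    /usr/local/bin/sma/sickbeard_mp4_automator/postSonarr.py "$@" \
    >> /config/sma-postSonarr.log 2>&1 < /dev/null &
exit 0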

I don’t have any suggestions on how to achieve that behaviour.

Ok, thanks.

This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.