Is Sonarr OK with a longer delay between completion of torrent and completed files existing (or being partially sync'd) in the configured remote file path?

Sonarr version (exact version):
Mono version (if Sonarr is not running on Windows): 5.16.0
OS: Unraid
Debug logs: n/a
Description of issue:

I’m setting up Sonarr on a local server, queuing downloads on a seedbox.

Originally I was trying to use SSHFS to mount my seedbox locally and use the Remote File Paths feature in Sonarr. This worked, but it was super slow because SSHFS is crazy slow for me. I’ve spent the last 2 days trying to speed it up and am now giving up, even though I liked the simplicity.

Instead, I was looking at a different approach and wondering how Sonarr works with Remote File Paths, and whether what I want to try is possible.

I was thinking of doing this:

  1. Sonarr on local server, still queuing downloads on seedbox
  2. After torrents are complete, seedbox hard links files to a sync folder
  3. Local server then mirrors that seedbox sync folder locally (e.g. using LFTP, rclone, etc)
  4. Sonarr then looks locally in that local sync folder for importing.
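The hard-link step in 2 works because a hard link is just a second directory entry for the same data: deleting one name doesn’t touch the data as long as another name remains. A quick illustration with made-up folder names:

```shell
#!/bin/sh
# Hypothetical folder names, just for illustration.
mkdir -p downloads sync

echo "payload" > downloads/episode.mkv
ln downloads/episode.mkv sync/episode.mkv   # hard link: same data, second name

# Deleting one name later does not touch the data as long as
# another name remains, so seeding from downloads/ is unaffected.
rm sync/episode.mkv
cat downloads/episode.mkv   # prints "payload"
```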

So essentially I’d be using LFTP to sync folders locally instead of mounting the seedbox directly. Since I’d be syncing, a completed torrent wouldn’t be immediately available in the local sync folder (I’d need to wait for LFTP/rclone to finish).

Will the Sonarr Remote File Paths feature work for me in this case? Will the delay in availability be a problem with the completed file handling feature with remote file paths?

IOW: Do I have any issue with partially completed files? Meaning, if it’s a 10GB file, it may take say 10 mins for rclone to sync it locally. During that time, if Sonarr looks in the remotely configured path, it will see a partial file.

A longer delay is fine, but you don’t want partial files in the folder Sonarr is checking: it could lead to a partial file being imported, or to additional files being missed because Sonarr imported the first available file. Instead, transfer to another location on disk and move the files once everything has been transferred.
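The "transfer elsewhere, then move" advice can be sketched like this (folder names are placeholders; the actual lftp/rclone transfer is simulated with `touch`). The key point is that `mv` within the same filesystem is atomic, so Sonarr sees either no file or a complete file, never a partial one:

```shell
#!/bin/sh
# Hypothetical folder names, just for illustration.
STAGING=./staging          # the transfer tool (lftp/rclone) writes here
WATCHED=./syncFolder1      # the folder Sonarr actually checks
mkdir -p "$STAGING" "$WATCHED"

# Simulate a transfer that has fully completed:
touch "$STAGING/Show.S01E01.mkv"

# Only once the transfer tool has exited successfully, move the
# finished files over. mv within one filesystem is a rename, so the
# file appears in the watched folder all at once.
for f in "$STAGING"/*; do
  [ -e "$f" ] && mv "$f" "$WATCHED"/
done
```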

This is a pain to deal with, because now I not only have to sync files from the seedbox to the local server, but also monitor that job and then make the files available in a secondary local folder. This isn’t horrible and I can do it.

But this raises another problem: I don’t think there is a way to have Sonarr move files out of the Remote File Path. The only option is to delete/remove them from the torrent client, which isn’t what I want, since I want the torrent to keep seeding on my seedbox.


It seems really common for people to want a local instance of Sonarr, remote downloads on a seedbox, and a way to get files back to the local server to be included in a media library. I’m surprised this scenario isn’t well supported, not necessarily by Sonarr itself, but by a solution someone has built from third-party components in a more streamlined way.

Ideally you would let your download client handle the move as well, and then report the download as completed to Sonarr.

E.g. some clients allow you to execute a script when a download finishes, and wait for that script to complete before marking the download as complete.
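For rTorrent, this kind of hook would hang off the `event.download.finished` event. An untested sketch (verify the exact syntax against your rTorrent version; the destination path is a placeholder):

```
# .rtorrent.rc sketch -- on finish, hard-link the payload into a
# separate completed folder so it can be synced and later removed
# without disturbing the seeding copy.
method.set_key = event.download.finished, link_complete, "execute=cp,-lR,(d.base_path),/home/user/downloads-completed/"
```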

My download client is rTorrent. Since I have Sonarr integrated directly with it for download completion, and I was mounting my downloads folder from my seedbox as a local mount, I had stopped moving/hard-linking completed downloads to another folder on my seedbox since it wasn’t necessary.

But now it looks like I need to do that again.

Even if my torrent client could run a script after download completion, I don’t think that helps, because I want my local server to be the one to query my seedbox, not the other way around.

I think the process needs to be this now:

  1. Sonarr local and queues downloads to seedbox rTorrent which downloads into /home/user/downloads.
  2. Sonarr is configured with Remote File Paths to look at a local folder (call it syncFolder1).
  3. rTorrent finishes download and hard links completed downloads to /home/user/downloads-completed
  4. My local box periodically runs a script that uses LFTP to sync the completed-downloads folder to a local sync folder (call it syncFolder2). After it’s done, it removes the downloads from the completed-downloads folder (/home/user/downloads-completed) on the seedbox (which won’t affect the seeding torrents, because they were hard links).
  5. To avoid any problems with partial downloads, have another script check syncFolder2 to ensure the LFTP sync completed, and move the files to syncFolder1.
  6. Sonarr is monitoring syncFolder1 because of my remote file paths config and then imports.
  7. I then have to have another script/job that cleans up syncFolder1, since Sonarr has no option to move (or copy and delete) files from the remote file path it’s watching without also deleting the torrent from the torrent client.
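Step 5 above could be sketched as follows (folder names are placeholders; it assumes the step-4 sync script writes a `NAME.complete` marker file only after LFTP exits successfully for that item, which is an assumption, not something LFTP does on its own):

```shell
#!/bin/sh
# Hypothetical layout; assumes the sync script writes a NAME.complete
# marker only after the transfer of NAME has finished successfully.
SYNC2=./syncFolder2   # LFTP's local target
SYNC1=./syncFolder1   # the folder Sonarr watches via Remote File Paths
mkdir -p "$SYNC2" "$SYNC1"

# Simulate one fully synced item and one still mid-transfer:
touch "$SYNC2/Done.S01E01.mkv" "$SYNC2/Done.S01E01.mkv.complete"
touch "$SYNC2/Partial.S01E02.mkv"

# Move only items with a completion marker; partials stay put.
for marker in "$SYNC2"/*.complete; do
  [ -e "$marker" ] || continue
  item=${marker%.complete}
  mv "$item" "$SYNC1"/   # atomic on the same filesystem
  rm "$marker"
done
```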

This sounds complex (and it is), but I don’t know if it’s avoidable at all. Again, I’m shocked that this isn’t a more commonly solved and supported scenario, since I read so many mentions of this setup but with little detail.

Note: for step 5 above, some people say it’s not necessary, but I think they’ve just been lucky given the way Sonarr is designed right now. That is, unless something has changed in Sonarr to make some of my steps unnecessary.

Would love suggestions if I’m wrong with any of above.

Tbh that sounds very complex. You might as well forget about configuring rTorrent as a download client in Sonarr, use the torrentblackhole “client”, and script everything related to moving files.

Now you’re querying rTorrent for status, but there is no relation between “done” and “the files are available where rTorrent says they are”, depending on when your scripts run, etc.

Just my 2 cents :wink:

Agreed what I proposed is complex, but I don’t see any way around it at all. If there are better suggestions, I’m open to it.

One simplification I can see is to disable Completed Download Handling in Sonarr completely, and instead change steps 5-7 to have the LFTP sync script call into the Sonarr API to trigger an import. Although I’m not sure if that’s possible, so I need to do some sleuthing.
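For what it’s worth, Sonarr’s command API does expose a `DownloadedEpisodesScan` command that scans a given folder and imports what it finds. A hedged sketch that only assembles the request (host, API key, and folder are placeholders; depending on your Sonarr version the endpoint is `/api/command` or `/api/v3/command`):

```shell
#!/bin/sh
# Placeholders -- substitute your own host, API key, and folder.
SONARR_URL=http://localhost:8989
API_KEY=your-api-key
IMPORT_PATH=/data/syncFolder1

# DownloadedEpisodesScan with a "path" tells Sonarr to scan that
# folder and import any completed episodes it finds there.
PAYLOAD=$(printf '{"name":"DownloadedEpisodesScan","path":"%s"}' "$IMPORT_PATH")
echo "$PAYLOAD"

# The actual request would look like:
#   curl -s -X POST "$SONARR_URL/api/command" \
#        -H "X-Api-Key: $API_KEY" \
#        -H "Content-Type: application/json" \
#        -d "$PAYLOAD"
```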

Again, I’m not married to this idea, I’m open and want better suggestions.

Definitely possible, and that’s a valid alternative, but I’m not sure it’s really any different from having the script just move the files after the sync.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.