My goal is to have sonarr on my local machine use a remote downloader, and keep the files on the remote server for Plex which is also remote.
Here is my setup:

Local Machine
- OS: Windows
- Running Sonarr
- Mounted remote drive via SSHFS

Remote Machine
- OS: Linux
- Running SABnzbd
- Running Plex Media Server
Right now, when I set a file to download, after it downloads Sonarr picks it up, renames it, and puts it back on the server for Plex to read. The problem with that is, to my knowledge, the file must be downloaded to my local machine, renamed, and then uploaded again to the server. This takes a very, very, very long time and uses unnecessary bandwidth. I'm looking for a better way to handle this.
I tried just having SAB rename the files and move them to their destination, but Sonarr doesn't recognize that the file has been processed. I've thought about using the program Filebot, but haven't seen anyone post anything here about that. If anyone has any advice I'd greatly appreciate it. Also, I'm not able to install Sonarr on the remote machine (trust me, I wish I could) because it's a VPS and the host won't allow it.
If the source and destination are both on the same share it shouldn't need to copy it locally and re-upload it, but it will across shares (same thing as copying to a different hard drive vs the same hard drive).
If SAB moves them, it can take up to 12 hours before Sonarr sees that the files are there (Sonarr rescans the disk every 12 hours). You could also have a script tell Sonarr to rescan that series using Sonarr's API.
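For what it's worth, here's a minimal sketch of that kind of rescan script, assuming Sonarr is reachable at localhost:8989; the API key and series ID are placeholders you'd need to fill in from your own install:

```python
import requests

# Assumed values -- replace with your own Sonarr URL, API key, and series ID.
SONARR_URL = "http://localhost:8989"
API_KEY = "your-api-key"
SERIES_ID = 1  # hypothetical; look up the real ID via Sonarr's series list

# Ask Sonarr to rescan the series folder on disk so it notices
# files that SAB moved into place itself.
resp = requests.post(
    f"{SONARR_URL}/api/command",
    headers={"X-Api-Key": API_KEY},
    json={"name": "RescanSeries", "seriesId": SERIES_ID},
)
resp.raise_for_status()
print(resp.json())
```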
Iāll have to look further into this, as that could definitely be the case using a VPS.
That is the setup I'm using right now, but all the downloads are stuck in the Activity queue. Even after a manual scan, with Sonarr picking up on the files, they are still in Activity. Any idea on how to clear those?
The other thing I just thought of: I use SAB to rename and move the files, and that happens almost instantly. If I understand correctly, if they were on separate shares wouldn't it take just as long? By long I mean CDH takes about 2 hours per episode.
If you used separate shares with SAB then it would need to "copy" (move between them), but on the same system with a fast network connection it's not going to be much worse than copying to another local drive.
CDH takes much longer because it makes Sonarr the middle man: it downloads the file and then uploads it back, so it's limited by your network connection (and uses a TON of bandwidth).
You actually confirmed that the move takes time? It quite possibly doesn't.
Are you using a single sshfs mount or multiple?
For example, CIFS shares simply translate the local move into a server-side move, and thus don't consume bandwidth. It's quite possible sshfs does the same. So check it.
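As a rough way to check, here's a sketch that just times a move between two directories on the same SSHFS mount (the drive letter and paths are hypothetical). A near-instant result suggests a server-side rename; several minutes suggests the file is being pulled down and re-uploaded:

```python
import shutil
import time

# Hypothetical paths on the same SSHFS mount -- adjust to your setup.
src = r"Z:\dir\download\episode.mkv"
dst = r"Z:\dir\plex\episode.mkv"

start = time.time()
# shutil.move tries a rename first and falls back to copy+delete,
# so the elapsed time tells you which one actually happened.
shutil.move(src, dst)
print(f"Move took {time.time() - start:.1f} seconds")
```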
The way I confirmed the move is I simply watched the folders. For example:
SAB moves download to /dir/download
Sonarr CDH renames and moves to /dir/plex
I simply watched to see when the renamed files showed up in /dir/plex. Everything did work without error, it just took hours.
Iām currently using a single sshfs mount, which is a linux mount on a windows machine. Iām not familiar with CIFS, so Iāll have to look into it. Thanks for the tip!
I have Sonarr/NZBGet/Plex/nzbToMedia on my local machine and a seedbox that I use ruTorrent on.
I donāt use SSHFS since it seems to have issues moving files and a timely fashion and the tools tend to time out if it takes 5-10 minutes to move shows. Instead, I use btsync between the machines and that drops files when completed. I run nzbToMedia which works great with Sonarr and Plex to update shows in the GUI.
Iāve been running this Sonarr setup for a few months now. Prior, I was running a SickRage/SickBeard setup for 8-10 months since btsync came out.
Well this is complicated haha. I know little to no python, so this will take me some time to figure out how to incorporate it. Thanks for the tip though!
Itās really not that complicated. I am not a programmer by any means and I just found nzbToMedia to have a little more robustness in the terms of transcoding media if you need that and handling btsync by default. The developer is very responsive like the Sonarr team is here. I was a bit intimidated at first, but after digging around asking a few questions, I couldnāt be happier with my setup these days.
To give a little more detail, I use Sonarr/CouchPotato as my "grabbers" for my content.
nzbToMedia runs via cron every 5 minutes to check my Sonarr directory and my CouchPotato directories.
I use btsync to keep my TV/Movie "done" folder in sync, and also my autowatch ruTorrent folder, so things work automagically.
So once a torrent finishes, ruTorrent moves it to the "Done" folder and btsync syncs the files over. nzbToMedia will wait until the sync is completed, based on knowing whether the in-progress sync file extensions are still present. Once completed, it will use the API for Sonarr/CouchPotato and process the request, so all your history etc. is done via the API and works like a champ.
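A minimal sketch of that last step, just to show the idea: it assumes btsync marks partially-synced files with a .!sync extension (check what your version actually uses), and the folder path, Sonarr URL, and API key are placeholders:

```python
import glob
import time

import requests

# Assumed values -- replace with your own paths, URL, and API key.
DONE_DIR = "/mnt/tv/done"
SONARR_URL = "http://localhost:8989"
API_KEY = "your-api-key"

# Wait until btsync has no partially-synced files left in the folder.
# The .!sync extension is an assumption; adjust to whatever your btsync uses.
while glob.glob(f"{DONE_DIR}/**/*.!sync", recursive=True):
    time.sleep(60)

# Tell Sonarr to import whatever finished syncing into DONE_DIR,
# so the rename and history end up going through the API.
resp = requests.post(
    f"{SONARR_URL}/api/command",
    headers={"X-Api-Key": API_KEY},
    json={"name": "DownloadedEpisodesScan", "path": DONE_DIR},
)
resp.raise_for_status()
```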