Import without Drone Factory [Google Drive & Local server]

**Sonarr version**: 2.0.0.4855
**Mono version** (if Sonarr is not running on Windows):
**OS**: Windows 10 & Ubuntu

Hey guys, I'm just looking for help with my workflow now that Drone Factory is gone.

My setup is:
Sonarr running on a Hetzner server (Ubuntu) AND also locally on Windows 10.
Sonarr on the Hetzner points directly to Google Drive and I have zero issues. I grab 1080p HDTV or WEB-DL.
Deluge on the Hetzner moves the torrent to a completed directory, and Sonarr does its stuff via Completed Download Handling (CDH) and sends it to GDrive, using an rclone mount for writes and PlexDrive for reads.

Unfortunately that is really just a secondary Plex server. My main server is at home, and I have always had an FTP sync program that monitors the completed torrent directory and syncs the files home (it uses temp files until the transfer is complete, so there are no issues with processing partials). Drone Factory on my home setup (an exact duplicate of the one on the Hetzner) would simply rename and import the files into my home media directories and, if it was a quality upgrade, replace the lower-quality file.
Basically, one snatch of a torrent kept GDrive AND my local server up to date.

EDIT… At this point you may wanna skip to my second post, because I'm hoping I found my own solution. :astonished:

Right now I'm lost without Drone Factory.
I would have no issue with changing the setup on the Hetzner to write locally and then do an rclone copy to GDrive.
I also know about pointing my local install at Deluge on the Hetzner and using remote path mapping, but I don't see how it will help in this case. If I'm not mistaken, the renaming only happens when the files arrive back on my local server, so there are no renamed files to send to GDrive.
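For reference, remote path mapping is just the table under Settings → Download Client that tells Sonarr where the path Deluge reports is visible on my local filesystem; something like this, with the host and paths as placeholders:

```
Remote Path Mappings (placeholder values):
  Host:        my-hetzner-box                # the Deluge host as entered in Sonarr
  Remote Path: /home/user/deluge/completed/  # the path Deluge reports
  Local Path:  D:\FTPSync\completed\         # where the FTP sync drops those files
```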

So right now, all I can think of is having two separate torrent clients (either both remote, or remote & local) snatching duplicate torrents.
OR totally abandoning Sonarr on my local machine, setting up SickRage locally, and having it do what Drone Factory used to do so well.
OR continually having to manually import stuff when the FTP sync program sends the files to my local server.
All pretty lame options.

I know I can't be the only Plex user with a mirrored local and cloud PMS.

EDIT: I have just been reading this page.

To be honest, it may as well be Chinese for someone who has no idea about scripts.
But maybe there's hope for me there somewhere?
Also, the sync program I use offers to run something on sync completion. I wonder if there's anything I could fill in there to trigger post-processing?
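If that "run on sync completion" option takes an arbitrary command, something like the following might do it. This is only a sketch, assuming Sonarr v2 on its default port; YOUR_API_KEY and the folder path are placeholders for my real values, and DownloadedEpisodesScan is the API command that performs the old Drone Factory style scan:

```bash
# Sketch: ask Sonarr to scan and import a completed-downloads folder.
# YOUR_API_KEY and the path below are placeholders.
curl -X POST "http://localhost:8989/api/command" \
  -H "X-Api-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "DownloadedEpisodesScan", "path": "D:\\FTPSync\\completed"}'
```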

Thanks for any help you can offer, guys.

I have been thinking a little more about this and wonder if this scenario would work, as I think I just read Markus saying in another thread that remote path mapping doesn't physically look at the remote filesystem?
So… the local Sonarr install points at the remote Deluge, does the episode grabs, and assigns the tv-sonarr label. Remote path mapping waits for the files to arrive locally via the FTP sync program, and CDH kicks in.

Meanwhile, the remote Sonarr install doesn't actually search for anything, because Deluge IS set up but no trackers are enabled. It just picks up the tv-sonarr label sent to Deluge by the local install, CDH kicks in on the remote Sonarr install, and it sends to GDrive? This is all bearing in mind that they are basically duplicate servers grabbing the same content.
Seems pretty obvious now, so I'm sure someone will tell me I missed something and it won't work. :smiley:

I’ve some questions about the workflow.

But first a recap, correct me if anything is wrong:

Hetzner ubuntu server:

  • Sonarr
  • Deluge download client
  • Series folders are on 'mounted' storage using PlexDrive + rclone with a GDrive backend

Home Win10 setup:

  • Sonarr (without Download Client/Indexers?)
  • FTP sync of all the output from the Hetzner Deluge client to the local system
  • Series folders are on local storage
  • Plex Server (main Plex instance, not using a cloud drive)

Questions:

  1. Is there a difference between the local & remote Sonarr instances in terms of series and configuration?
  2. Is the home & GDrive storage indeed exactly the same? Or do you delete content locally after it has been watched?
  3. Why the GDrive setup if you have enough local storage? My bet would be on bandwidth, so you're using PlexCloud to serve outside of your home network.

Assuming 1 = no, 2 = yes/no, 3 = bandwidth:

Why double Sonarrs? Why don't you keep a home copy of the GDrive? That would give you a Hetzner server doing all the hard labour, with a single Sonarr instance to manage and simply a sync of the series folders to the home Plex server's storage.

Hi Taloth, thanks for the reply.
Questions 1, 2 & 3… you totally nailed the answers.
I never delete anything off either server; the servers are completely duplicated and will stay that way going forward.
Plex in the cloud is to serve some friends, some of whom are allowed to access my home server; others aren't, because bandwidth is indeed an issue. Plus, of course, for those with access to my server, the cloud server is a backup if my server goes down while I'm at work. (The trakt.tv plugin in Plex keeps them up to date.)
The remote install is indeed mounted: an rclone mount for write access (Sonarr, Radarr etc.) and PlexDrive for read access (Plex), using this tutorial. I should add that I have never had a single Google ban since using it.
https://techperplexed.blogspot.co.uk/search/label/infinite%20plex
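Roughly, the split looks like this; the remote name and mount points are placeholders, not the tutorial's exact values:

```bash
# rclone mount for writes: Sonarr/Radarr import into this path.
rclone mount gdrive: /mnt/gdrive-write --allow-other &

# PlexDrive mount for reads: the Plex libraries point here.
# (The exact invocation varies by PlexDrive version; this is the v5 style.)
plexdrive mount -o allow_other /mnt/gdrive-read &
```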

The double Sonarrs are purely to ensure that all my TV shows are up to date on both servers. I'm OCD :grinning:
Plus, syncing to the home directories will not take care of quality upgrades; it will leave two copies on my local server until I manually delete the lower quality… I know, 'cos that's how I used to do it.

So, going back to my second post, should that work in theory? As long as I have Deluge enabled (but with no trackers enabled) in Sonarr on the Hetzner, the remote Sonarr install will kick off CDH just based on the fact that it sees the tv-sonarr label my home setup is sending to that remote Deluge? It seems kind of like manually adding a torrent to Deluge with the tv-sonarr label, which I know will work unless it fails to parse the file.

Why wouldn't syncing to home take care of upgrades? A proper mirror removes deleted files too. It might just be a matter of picking the right command-line options.
Having two Sonarrs is more prone to errors and timing issues, so from an 'OCD' perspective it's not a good option.

As for your second post, tbh I think you'll have quite a few timing issues. Will the source file be deleted when Sonarr A imports it, or Sonarr B? It's like having two stockbrokers: you send the same purchase order to both and hope they make the same choice at the same time… it's wishful thinking.
And imho it's a pain to keep both Sonarrs in sync. So no, I don't think it's a good option.

I'd really advise you to look into mirroring options. It has the added advantage of being self-correcting: if something prevents a sync, it can be corrected automatically the next time it runs. But the key here is to use a mirroring (full sync) option instead of copy. For example: https://rclone.org/commands/rclone_sync/
If you use that then you have:

Hetzner ubuntu server:

  • Sonarr
  • Deluge download client
  • Series folders are on 'mounted' storage using PlexDrive + rclone with a GDrive backend

Home Win10 setup:

  • Plex Server (main Plex instance, not using a cloud drive)
  • Series folders are on local storage
  • rclone sync: a full sync from GDrive (or Hetzner FTP) to local storage (see the sketch below)
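A minimal sketch of that last step, assuming a remote named gdrive with the library under gdrive:TV and a local library at D:\TV (all placeholders); on Win10 it could run from Task Scheduler:

```bash
# Preview the changes first, then mirror for real.
# rclone sync also deletes destination files that no longer exist in the source.
rclone sync gdrive:TV "D:\TV" --dry-run
rclone sync gdrive:TV "D:\TV"
```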

OK Taloth, thanks.
I will take a look into those options, which, yeah, I understand now.
I'm guessing that for the sync to work it would need me to ditch having the quality in the file name.
So Twin Peaks - S03E01 - The Return, Part 1 - WEBDL-1080p.mkv
would stay behind when it upgrades to
Twin Peaks - S03E01 - The Return, Part 1 - BluRay-1080p.mkv
I could live with Twin Peaks - S03E01 - The Return, Part 1.mkv
Not a big deal I guess, if everything else fails.
All that said, I decided to test my other theory, and with careful selection of where to use the hardlink/copy option it has successfully sent the first five episodes to both my local system and Google.
I will monitor it over the next few days.

I appreciate your time. :smiley:

No, you don't. I think you have a misunderstanding of how mirror syncs work: they delete files in the destination that no longer exist in the source, so after an upgrade the original file would not be left behind.
rclone sync handles additions, modifications AND removals; see the rclone sync link I pasted earlier:

> Sync the source to the destination, changing the destination only. Doesn't transfer unchanged files, testing by size and modification time or MD5SUM. Destination is updated to match source, including deleting files if necessary.

Ultimately it's your decision which solution you use, so do what you wish. I just think you're making it unnecessarily complicated and, as a result, error-prone.

OK, the more I thought about this, the more it seemed like the saner approach. I just did some testing with rclone sync --dry-run and then some test files.
Sadly, however, I now see what you meant when you asked whether the two servers are "really" identical. The media is the same, but I cannot be 100% sure that all the file names are consistent, have dashes in the same places, etc. on each server, and with 40TB of media on each server that's probably not gonna happen.
On top of that, Sonarr copes fine writing to GDrive with the odd episode snatched each night, but sadly it seems to crash when it's sending up a 30GB BluRay season pack, so it's probably an rclone mount issue.
I may abandon the mount idea for writing and just use the Hetzner storage for renaming, then rclone copy or move for uploading to Google.

I guess I can write a script to automate it. It won't cater to my OCD, not knowing for certain there are no missing episodes on GDrive, but what the hell, life's too short and it is only a secondary server.
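Something like this is what I have in mind for the upload script; purely a sketch, with the library path, remote name, and log location as placeholders:

```bash
#!/bin/bash
# Move renamed files from the Hetzner's local library up to Google Drive.
# --min-age skips anything touched in the last 15 minutes, so files Sonarr
# is still importing get left alone. Run from cron, e.g. hourly.
rclone move /home/user/media/tv gdrive:TV \
  --min-age 15m \
  --delete-empty-src-dirs \
  --log-file /var/log/rclone-upload.log
```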

Thanks again for trying to help.
