Drone Factory is Deprecated [unrar]

Another user who relies on the Drone Factory here, but I won’t complain; I’m just here to figure out how to work without it. Is there an example of how to call the API to trigger the DownloadedEpisodesScan?

In my setup, files are scp’d to a folder on my Windows computer running Sonarr. I don’t have a download client running that can trigger anything, but I can use EventGhost to watch the directory and then do something… Does Sonarr happen to have an EventGhost plugin?

Here’s someone who had the same but then with BTSync: https://github.com/Sonarr/Sonarr/issues/2021

But I’d like to know a bit more about your setup: if there’s no download client, then how is Sonarr triggering downloads?

I hadn’t heard about EventGhost before today, so I don’t know exactly how it works. I hope it doesn’t have the same disk-scanning pitfalls that the Drone Factory has; otherwise it’s not an improvement.

This is why I use the Drone Factory. I’d like to hear if there is another solution (other than the obvious one of letting Sonarr deal with subtitles).

On my NAS I have two Sonarr instances running (one in Docker). The first downloads everything and stores it in a generic directory. From there I use an autosub program that finds subtitles; sometimes that takes the same day, sometimes longer.
Next there is a Perl program that moves the episodes that have a subtitle to… the Drone Factory folder that the second instance of Sonarr scans periodically.
The second Sonarr therefore only takes care of the correct placement.
As this process handles creating the correct folders and setting up the JPGs, it would be a heck of a task to program this myself in Perl.

As I use iFlicks to convert my video files, I was using the Drone Factory folder as the final destination for converted files. I can completely understand why the feature shouldn’t be used, but in my scenario I need it. I’ve got the DownloadedEpisodesScan API call working, so iFlicks is now triggering this. Works great.

My only issue now is that because I have Completed Download Handling off and no Drone Factory folder set, I’m getting the following health check warning: “Enable Completed Download Handling or configure Drone factory”.

Is there a way to switch this off??

@DekkersBloemen Use the API call to signal Sonarr to import it. Alternatively, you could look into Custom Scripts, which can be triggered after Sonarr has imported it. You’ll have to find a way to deal with those ‘deferred’ ones, of course.
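For the Custom Scripts route, the general shape would be something like this (a minimal sketch only; sonarr_eventtype is set by Sonarr, but treat the file-path variable and the queue file as placeholders and check the wiki for your version):

#!/bin/sh
# Runs as a Sonarr Custom Script after an import; queue the file for your own
# deferred handling (e.g. subtitle fetching). Names below are illustrative.
[ "$sonarr_eventtype" = "Download" ] || exit 0
echo "$sonarr_episodefile_path" >> /tmp/sonarr-post-import-queue.txt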

@dhcrees Which download client? Because if you use SABnzbd/NZBGet, you can simply do the transcoding as a pp-script. nzbToMedia, for example, does that and is completely compatible with Completed Download Handling; from Sonarr’s perspective the download simply hasn’t finished yet.

@Taloth Can you recommend a SAB script to do the transcoding? I basically want all downloads converted into M4V. I looked at this before and didn’t have much luck with it; that’s why I went down the iFlicks route. Quickly looking at nzbToMedia, it doesn’t appear to do transcoding, just renaming etc. (but I might have missed something). Many thanks for your help.

@dhcrees nzbToMedia does it using ffprobe and ffmpeg: https://github.com/clinton-hall/nzbToMedia/wiki#transcoding-and-video-corruption-checks, although what you likely need is a remux of the video and an optional transcode of the audio. I’m assuming you want M4V because of Apple devices.
There are other pp-scripts that do the same, afaik.
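The core of that remux-plus-audio-transcode idea, independent of which pp-script drives it, is roughly the following (a sketch; the filenames are placeholders and your players may need different audio settings):

# Copy the video stream untouched, re-encode the audio to AAC, write an MP4 container.
ffmpeg -i input.mkv -map 0:v -map 0:a -c:v copy -c:a aac -f mp4 output.m4v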

An alternative is Plex, which does remuxing/transcoding on the fly.

Thanks for the reply. I’ve had a better look at nzbToMedia and can see the transcoding options now. I’ve been using Plex for years, but transcoding everything to M4V first means I don’t have to worry about performance issues with Plex transcoding on the fly, since it’s running on a Synology NAS. Files in M4V format direct-play on most Plex players, whereas it needs to transcode MKVs, for example, which is what the majority of TV shows currently come as.

Again thanks for your help, and I’ll have a look at the other suggestions you made.

Thanks for the answer. It does, however, seem to be a lot of extra work, considering that I now have to set up folders with a rather strict naming convention, which defeats the whole idea of using a second Sonarr instance. (I fear by now I have to get rid of that altogether.)
The alternative seems to be that I trigger the manual import process via some API call. Admittedly it is no longer manual by then, but it has the advantage over the Drone Factory that the command is issued by a program that could/should cater for other processes that may be active. In my situation that would be nearing perfection.

Using the ManualImport call now gives an Unauthorized error, however. Is this intentional, or am I misunderstanding something?

I meant the DownloadedEpisodesScan command, not ManualImport:

HTTP POST with a json body: https://github.com/Sonarr/Sonarr/wiki/Command#downloadedepisodesscan

Something like

curl http://localhost:8989/api/command -X POST -d '{"name": "downloadedepisodesscan","path": "/serverA/var/temp/Download.S01E01","importMode":"Move","downloadClientId":"{infohash}"}' --header "X-Api-Key:MyKey"

It works per individual job/download, so you call it for every finished+transferred download in the download client.

Can you elaborate?

Since I have to wait for subtitles I can’t do this from the download client. Manual import works fine when done manually. I use this from the command line for testing; once it works I’ll execute it in the Perl script that stores files in the /media/downloads/klaar folder.
http://huisnas:32777/api/manualimport?folder=/media/downloads/klaar/?apikey=80ab6ffbdd414fac97d8a1bf6e1dada0

You don’t have to do it from the download client; call the DownloadedEpisodesScan command from the Perl script. It’s intended for automated workflows, ManualImport is not.
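A minimal sketch of that call (shell here, but the same POST works from Perl); the host/port match the URL you posted, while the API key and folder layout are placeholders:

# Ask Sonarr to import each delivered download in the klaar folder.
for dir in /media/downloads/klaar/*/; do
  curl http://huisnas:32777/api/command -X POST \
    --header "X-Api-Key: MyKey" \
    --header "Content-Type: application/json" \
    -d "{\"name\": \"DownloadedEpisodesScan\", \"path\": \"$dir\", \"importMode\": \"Move\"}"
done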

I’ll give it a try.
Thanks

I typed up two paragraphs just now about how much this is going to hurt me… but I know you guys don’t care.

I know no one cares… nothing matters… not your problem, blah blah blah… but all I’m saying is it puts me one step closer to that edge.

Good luck, Drone Factory folder users, no one cares…

Reading the reddit post, there’s this comment: “Tbh, I’ve been considering adding a time delay to blackhole imports, preventing them from importing until x minutes passed since the download appeared/was last modified in the watch folder.” YOU GUYS LITERALLY ALREADY HAVE A FIX but instead are choosing to scrap something people need in order to use your software…

No one wants to manually import stuff; the entire purpose behind Sonarr is automation…

Last comment on the reddit post…

“I’ve never used drone factory for any Sonarr related, but I’ve found it to be the absolute simplest way to organize downloads from Playon as well as my OTA HdHomerun recordings. I know this wasn’t the intention, but it’s dead simple to set up and lets Sonarr handle the moving, organizing and renaming automagically for series that aren’t available online. I used DropIt to automatically move the completed files from their recording/download folders after a few minutes to DF and Sonarr would automagically take care of everything else.”

I feel like you’re winding yourself up, getting more fed up with every post you read…

In regard to those comments: the Blackhole mechanism is distinctly different from the Drone Factory, in the sense that Blackhole is an extension of CDH and the Drone Factory is not. Blackhole is considered a download client and has some protections that the Drone Factory does not, but it’s also stricter in what it can consume and in which format.
Also, the user from the reddit post you quoted has long since moved to CDH and is, afaik, quite happy.

Also, you’re not expected to ‘manually import’ stuff constantly, since that would completely defeat the purpose of automation. Wanted -> Manual Import is for rare, incidental custom imports, something some users have used the Drone Factory for in the past.
I’d like to know from which post or wiki page you got the impression otherwise, so I can try to make it clearer.

But please explain how your current workflow is set up, and we’ll be happy to propose the best approach for your situation.
I’ve yet to run into a situation where the Drone Factory was necessary.

PS: The fact that you found it necessary to edit your post to put bold emphasis on that one line is pretty much an insult. I know you just wanted to vent, but if we didn’t care you wouldn’t have Sonarr in the first place.

@Taloth In my setup I use Sonarr to trigger downloads from usenet, but not for torrents. For torrents I use irssi-autodl on a server, and when a show downloads that matches one that I watch, it gets scp’d to a folder on my computer running Sonarr.

As for EventGhost, it’s probably similar to the Drone Factory. It watches a folder for events like DirectoryCreated, FileCreated, FileModified, etc., and you can set up event listeners to trigger actions when certain events happen. I used to use it at one point to trigger UpdateXBMC: start a timer on FileCreated, reset the timer to 10 seconds every time FileModified fires, and trigger UpdateXBMC when the timer expires (for a file that’s slowly transferring, to be sure I don’t update before the transfer finishes). I could use the same type of thing to trigger the DownloadedEpisodesScan.
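Roughly, the idea is to wait until the file stops growing and then fire the command; outside EventGhost the same debounce can be approximated with something like this (a rough sketch; the paths, API key and 10-second interval are placeholders):

# Wait until the transferred file's size stops changing, then trigger the scan.
# (GNU stat shown; adjust the size call for other platforms.)
FILE="$1"
PREV=-1
SIZE=$(stat -c %s "$FILE")
while [ "$SIZE" != "$PREV" ]; do
  PREV=$SIZE
  sleep 10
  SIZE=$(stat -c %s "$FILE")
done
curl http://localhost:8989/api/command -X POST \
  --header "X-Api-Key: MyKey" \
  --header "Content-Type: application/json" \
  -d "{\"name\": \"DownloadedEpisodesScan\", \"path\": \"$(dirname "$FILE")\", \"importMode\": \"Move\"}"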

Hi,

I’m also a Drone Factory user. Currently, for Sonarr, my seedbox hardlinks downloaded files to a sync folder; that way I can keep all my seeding rules in ruTorrent while dispatch/delivery of the downloaded files is taken care of by a dedicated delivery pipeline.

This way it’s up to the software that requested the file to wait for its delivery on the corresponding local server. That’s the way it works for me :confused:
Due to the time it can take to gather the downloaded files and dispatch them to the right server (whether they’re for work or not), and the various rules in my scheduler and crontab, ruTorrent is totally asynchronous from the delivery of the requested files.

So I can’t simply execute a script at the end of a download; I should do it once the file is delivered. So yes, I could create a container dedicated to this task, because I don’t specifically want to modify the crontab on my main server just for that, but keeping an alternative is always good, just in case.

I understand that letting Sonarr manage the whole end-to-end process is elegant, but a scheduled local folder scan is always useful.

As stated by others above, I truly enjoy and rely upon the functionality of Sonarr to make my life easier. I’ve spent numerous hours setting up, testing, editing, testing, re-editing, googling, posting, thanking, and re-editing program settings to make everything work between usenet and torrent clients, to ensure that everything is handled correctly and all programs behave as intended. The one and only feature that allowed this to happen in my setup was the Drone Factory extracting rar files. I’ve yet to see a guide that doesn’t require me to learn how to create/program a script (something I’m completely unfamiliar with) to replace the functionality of one simple option in your program that you’re all steadfast on deleting.

I know, I get it, it’s your baby and you’ll do with it as you please. But why? Why would you not listen to the people who are pleading with you to please add .rar extraction PRIOR to making this decision, so that we aren’t left in the cold looking for alternatives?

While it may be simple for folks who CAN program in .NET to whip up a script like it’s nothing, it’s not for people who rely on the goodwill and time of internet forum posters who reply to questions, or who take their time to write guides and how-tos so that we have an inkling of a chance.

I read comments above by Sonarr-badged posters that seem somewhat defensive, perhaps even at times condescending, towards the people who are asking for help. I apologize in advance if those two words put anyone on the defensive, as they’re not meant as an affront. However, I believe you folks should understand that your responses can sometimes be taken as such, whether intended to or not. Not every person here asking for help is as technically versed as you. If we were, I’m sure we’d have written our own programs/apps/scripts/solutions and wouldn’t now be bothering you fine folks when our beloved program gives us warnings about a vital option that we don’t have a readily available solution for, while the developers state that a replacement is coming, but not before we lose functionality we rely upon.

I certainly hope I’ve imparted some of the concerns that I have, and that I’m not looking at spending the next few weekends searching for alternatives to the current automated setup I have (especially with the summer TV season about to start!). I also hope that none of my words come across as hostile, as I did not intend them to. I merely hope to highlight the concern I have, and what I believe others have as well, based on the prior comments.

Have a great weekend!

As the dude who started this thread, I’d personally like to give a HUGE hand to @Taloth and @markus101.

Let’s not forget this is OSS, and free. Let’s not be entitled. First world problems, right?

The simple bottom line is that there is NO alternative to automatic un-rarring for Windows users unless you utilize the Drone Factory. I am looking into other options, but it’s not looking good right now.

Also - @Taloth - simple question: WHY can’t Sonarr unrar? It’s based on Python, right? Not to bring up our ugly stepchildren, but CP is also Python-based and did a pretty good job at it (even if it was the only thing it did well).

I’m a sysadmin by trade, so I will never try to armchair QB those in the trenches writing this code.

Also, I’m going to re-start this discussion over on the thread I started a while back. Please post your suggestions as far as getting this done there:

Your setup sounds quite sophisticated, so it’s tough to recommend a complete solution, but a Remote Path Mapping may be viable to achieve the desired result. The remote path mapping would remap the path that rTorrent reports to something locally accessible by Sonarr. Your scripts would still need to transfer the files, but they would need to keep the filenames and the same folder structure in place (if they don’t already), and to ensure partial imports don’t occur you’d want to transfer to a separate folder and then move into the remote path mapping location.
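The ‘separate folder, then move’ part would look roughly like this (a sketch only; the host and paths are placeholders for your own layout, with /data/torrents standing in for the remote-path-mapped location):

# Transfer into a staging folder first, then move into the mapped location
# once complete, so Sonarr never sees a half-transferred download.
scp -r "seedbox:/downloads/complete/$NAME" /data/staging/
mv "/data/staging/$NAME" "/data/torrents/$NAME"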

Sonarr is written in C#; no Python here.

The simple answer is that when Sonarr was started, the download clients did it; both SABnzbd and NZBGet handle it seamlessly. Torrent clients, on the other hand, often don’t, which from my POV is a big downside. Not to mention that the ones that have plugins to do it have no post-processing stage, so in that respect they are inferior as well, since Sonarr treats complete = ready to import.

We still have plans to support unRARing, but being effective when unRARing (and checking media info in the process) makes it non-trivial. There is tons to do, and unRARing isn’t at the top of the list currently.
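In the meantime, a stopgap for torrent users is to unrar in their own post-download step and then trigger the scan, roughly like this (a sketch; the path and API key are placeholders, and for multi-part .partN.rar sets you’d point unrar at the first part only):

# Extract any archives in the finished download, then ask Sonarr to import it.
DIR="/downloads/complete/Show.S01E01"
find "$DIR" -name '*.rar' -execdir unrar x -o- {} \;
curl http://localhost:8989/api/command -X POST \
  --header "X-Api-Key: MyKey" \
  --header "Content-Type: application/json" \
  -d "{\"name\": \"DownloadedEpisodesScan\", \"path\": \"$DIR\", \"importMode\": \"Move\"}"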