Assistance with script to replicate Drone Factory


So I’m keen to migrate from SickChill to Sonarr.

I do, however, require the ability to manually download episodes or seasons from time to time and have Sonarr automatically move and rename them from a folder into the relevant season folder.

I realise this was handled by Drone Factory in the past, and that it can still be achieved with a script that runs periodically or after a download completes.

Does anyone currently use a script for this, and if so, could I get a copy of it rather than writing it myself?

Whilst I could probably figure out how to write it myself using guides (albeit over a long period of time!), I figured others would already have a need for this same feature, and perhaps I could get a copy and simply re-map the folders/drives within it.

This is a bit of a sticking point for me in becoming a full-blown Sonarr user and walking away from SickChill.

Thanks heaps.


Reckon I’ve found another way. Drone Factory still works; I just need to turn it on once I’ve downloaded a whole season I sourced myself.

It should then scan the folder and move / rename the episode.

The only question I have is: how do I get the download status to update consistently in Sonarr in this situation?

I tested it and it worked for a single episode of one show, but for another episode it did not.

Weird that it worked for one but not the other.



Drone Factory is gone in v3; it still exists in v2, but it’s recommended not to use it.

For Sonarr to handle the import automatically, it needs to detect the download in the client: the series must be added in Sonarr, the download must be in the correct category in the download client, and the release name must be parseable. Otherwise you can use Manual Import from Wanted: Missing.

It’s impossible to say why it detected one but not the other without debug logs.


Thanks for the info.

I have the shows added; what do you mean by “needs to be in the correct category in the download client”?

This whole scenario would be rare for me, as I’d seldom need to re-download older episodes or whole seasons, unless I’m getting into a show that is more than one season old that someone recommended to me.

I’ve also noticed that, unlike SickChill, you cannot set an episode or season to monitored and have Sonarr automatically start searching for it, the way SickChill does when you set an episode or season to wanted (it calls this a backlog search).

Sonarr requires a manual search for each episode, which isn’t really automation.

Unless I am misunderstanding how it works? (I’m still learning about it! :grin:)


In addition to avoiding conflicts it’ll allow Sonarr to track items it didn’t grab itself.

Sonarr doesn’t have automatic backlog searching, but you can tell Sonarr to search for episodes when you add a series, or from Wanted: Missing. You shouldn’t need a backlog search unless your setup changed quite significantly or Sonarr was offline for a long period of time, as Sonarr watches for missing episodes/upgrades as part of its normal RSS Sync.
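If you ever do want to trigger that search on a schedule, the same search Wanted: Missing runs can be kicked off through the command API. A minimal sketch, assuming the command name `missingEpisodeSearch` and a placeholder URL/key for your own install:

```python
# Sketch: trigger the same search Wanted: Missing does, via Sonarr's
# command endpoint. URL and API key below are placeholders.
import json
from urllib import request

def missing_search_request(base_url, api_key):
    """Build (but don't send) the POST that queues a missing-episode search."""
    return request.Request(
        f"{base_url}/api/command",
        data=json.dumps({"name": "missingEpisodeSearch"}).encode(),
        headers={"X-Api-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = missing_search_request("http://localhost:8989", "your-api-key")
# request.urlopen(req)  # uncomment once URL/key are filled in
```

Schedule it with cron if you really want hands-off behaviour, but as above, RSS Sync should make that unnecessary in normal operation.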

Make sure you’ve read the FAQ.


Thanks for the info.
I’ll suss out the category settings a little later today.

In v3, are you saying that there will be no way to dump an episode or group of episodes from a past series (one that I track in Sonarr, with the episodes sourced manually) into a folder and have Sonarr sort it out for me?

I realise the manual import option could facilitate this for me; however, I am keen to know if what I’ve done in the past with SickChill can be done in Sonarr.

In v3, with the absence of Drone Factory, will scripts for Drone Factory still work??
I.e. is it gone altogether, or just missing from the UI??



There will still be manual import available, but there won’t be a scheduled task that checks a folder on an interval and imports things that Sonarr didn’t find in your download client.
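If you want to approximate the old behaviour yourself, you can schedule a call to the DownloadedEpisodesScan command against the API instead. A minimal sketch, where the URL, API key and watch folder are placeholders for your setup:

```python
# Sketch: replicate Drone Factory's periodic scan by asking Sonarr to
# import a folder via its command API. Run it from cron/Task Scheduler.
import json
from urllib import request

SONARR_URL = "http://localhost:8989"   # placeholder
API_KEY = "your-api-key"               # placeholder
WATCH_DIR = "/downloads/manual"        # placeholder

def build_scan_command(path):
    """Payload for the DownloadedEpisodesScan command."""
    return {"name": "DownloadedEpisodesScan", "path": path}

def trigger_scan(path):
    """POST the command to Sonarr; returns the queued command as a dict."""
    req = request.Request(
        f"{SONARR_URL}/api/command",
        data=json.dumps(build_scan_command(path)).encode(),
        headers={"X-Api-Key": API_KEY, "Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# trigger_scan(WATCH_DIR)  # uncomment once URL/key/folder are filled in
```

The same caveats apply as with Drone Factory: importing through the download client category is still the recommended route.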

This should cover pretty much everything:


That is actually acceptable to a degree for me.

I’ll check that out. I actually have v3 installed on my dad’s NAS as his is higher end, I’ll check out the options available there.

Thanks for the help!


FYI, this made zero difference. Where does it create the “TV” directory??

Incomplete, complete or the conf/torrents folder??

I cannot see it anywhere.



I don’t follow, which “TV” directory?


you create the category in SAB and then set sonarr to use that same category.

it should end up underneath the complete folder i think


Im using Transmission as my download client.


in that case transmission will just download it into the \complete\category_name_you_set_in_sonarr folder (which it will create)

if what im reading from markus is correct, then in transmission you would just set the download folder when manually adding a torrent to match what you set in sonarr and it should see it downloading, pick it up, and import it when complete
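for reference, this is roughly the rpc body transmission receives when a torrent is added with the folder pinned (just a sketch - the magnet link and folder below are placeholders):

```python
# Sketch of Transmission's "torrent-add" RPC body with the download
# folder set to match the category folder sonarr expects.
import json

def torrent_add_payload(magnet, download_dir):
    """Transmission RPC 'torrent-add' body with the download dir pinned."""
    return {
        "method": "torrent-add",
        "arguments": {"filename": magnet, "download-dir": download_dir},
    }

payload = torrent_add_payload(
    "magnet:?xt=urn:btih:...",                  # placeholder magnet
    "/downloads/transmission/complete/sonarr",  # must match sonarr's category path
)
body = json.dumps(payload)
```

note a real request also needs the X-Transmission-Session-Id header (transmission answers the first attempt with a 409 carrying the id to use).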

i did try that and sonarr does see it, unfortunately it just deletes it from transmission without actually importing it (theres no job in sonarr, just manually added to transmission)

i’ve never used it this way, as i didnt know it was possible but i wouldnt mind using it if it worked for when i have to manually download something sonarr wont allow to be downloaded in the search

small subset of trace logs included. i tried it twice on something that had already downloaded and it failed. so i removed the file, refreshed sonarr, then downloaded it again, same thing, it just removed the job from transmission and deleted the file (which may be part of the job removal, not sonarr specifically deleting it), didnt even try to import it

19-1-11 23:15:27.6|Trace|CommandQueueManager|Publishing CheckForFinishedDownload
19-1-11 23:15:27.6|Trace|CommandQueueManager|Checking if command is queued or started: CheckForFinishedDownload
19-1-11 23:15:27.6|Trace|CommandQueueManager|Inserting new command: CheckForFinishedDownload
19-1-11 23:15:27.7|Trace|CommandExecutor|CheckForFinishedDownloadCommand -> DownloadMonitoringService
19-1-11 23:15:27.7|Trace|CommandQueueManager|Marking command as started: CheckForFinishedDownload
19-1-11 23:15:27.8|Trace|HttpClient|Req: [POST] torrent-get(...)
19-1-11 23:15:27.8|Trace|ConfigService|Using default config value for 'proxyenabled' defaultValue:'False'
19-1-11 23:15:27.8|Trace|HttpClient|Res: [POST] 200.OK (9 ms)
19-1-11 23:15:27.8|Trace|HttpClient|Response content (385 bytes): {"arguments":{"torrents":[{"downloadDir":"/downloads/transmission/complete/sonarr","downloadedEver":207347317,"errorString":"","eta":-1,"hashString":"f95542f0975d837f4efc77a724b253b42ad3c3ed","id":39,"isFinished":true,"leftUntilDone":0,"name":"Gotham.S05E02.720p.WEB.x265-MiNX[eztv].mkv","seedRatioLimit":0,"status":0,"totalSize":193279135,"uploadedEver":448622}]},"result":"success"}

19-1-11 23:15:27.8|Debug|TrackedDownloadService|Tracking 'transmission:Gotham.S05E02.720p.WEB.x265-MiNX[eztv].mkv': ClientState=Completed SonarrStage=Imported Episode='Gotham - S05E02 WEBDL-720p v1' OutputPath=/downloads/transmission/complete/sonarr/Gotham.S05E02.720p.WEB.x265-MiNX[eztv].mkv.
19-1-11 23:15:27.8|Trace|EventAggregator|Publishing DownloadCompletedEvent
19-1-11 23:15:27.8|Trace|EventAggregator|DownloadCompletedEvent -> DownloadEventHub
19-1-11 23:15:27.9|Debug|DownloadEventHub|[Gotham.S05E02.720p.WEB.x265-MiNX[eztv].mkv] Removing download from transmission history
19-1-11 23:15:27.9|Trace|HttpClient|Req: [POST] torrent-remove(...)
19-1-11 23:15:27.9|Trace|ConfigService|Using default config value for 'proxyenabled' defaultValue:'False'
19-1-11 23:15:28.0|Trace|HttpClient|Res: [POST] 200.OK (124 ms)
19-1-11 23:15:28.0|Trace|HttpClient|Response content (36 bytes): {"arguments":{},"result":"success"}

Mono Version
OS Synology DSM 6.2 Docker linuxserver/sonarr


If Sonarr already imported a completed download (by Torrent Hash in torrent clients) then it won’t attempt to import it again (and when Remove Completed Downloads is enabled it’ll ask the client to remove it).

That looks like the case here.


ok, i tested with a new torrent (one i hadnt downloaded previously) and it works properly, so thats very handy

ok, what was the rationale behind that one? and is that newish as i thought i had re-downloaded stuff before without issues

if i lose some files and go to download them again its not going to import them, thats not really helpful.

i just tried that via the sonarr web gui as well (deleted the file, refresh the series, shows as empty, manual search, download the same torrent) and it will send the job to the download client but will not have one in sonarr so the file downloads but sonarr has no intention of importing it - why send it in the first place in that case?

is this optionable or hardcoded?


Not new. It’s so Sonarr doesn’t continually try to import something it’s already imported. It uses the download client’s ID to track that, which for torrents is the hash; in usenet clients a unique ID is given to each download, so we use that and don’t have this issue.
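In rough pseudo-code, the tracking boils down to a seen-set keyed by that ID (an illustrative sketch, not Sonarr’s actual implementation):

```python
# Illustrative sketch (not Sonarr's real code) of the tracking described
# above: the download client's ID is remembered once imported, and for
# torrents that ID is the content hash, so a re-download looks identical.
imported_ids = set()

def should_import(download_id):
    """True the first time an ID is seen, False for any repeat."""
    if download_id in imported_ids:
        return False
    imported_ids.add(download_id)
    return True

same_torrent = "f95542f0975d837f4efc77a724b253b42ad3c3ed"  # hash from the logs above
assert should_import(same_torrent) is True    # first grab: imported
assert should_import(same_torrent) is False   # re-download, same hash: skipped
assert should_import("nzb-unique-1") is True  # usenet: every grab gets a fresh ID
assert should_import("nzb-unique-2") is True  # so repeats never collide
```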

It’s still the best release, Sonarr isn’t going to grab another lesser quality one to circumvent that (and checking that isn’t part of the download grabbing logic).

Definitely not an option, otherwise it’d import it every minute. We have plans to improve the behaviour at some point.


ok, im posting to here and the github one, which one do you prefer to continue this in?

so this feature is to handle seeding (the job staying in the download client after its been imported), you dont want to continually try to import a seeding torrent so you flag the id as having been imported thus you do it once, flag it, and dont worry about it if you see it again? makes sense.

downside is youre keeping that status forever? so the issue ends up at a later date, and the download is no longer active, and you need to download the file again for sonarr to import

you could clear the imported flag for that id when the user clicks on the download button (manual or auto) in sonarrs web gui so that it will get imported again (i presume youre checking here for the id and not adding an activity to sonarr because it exists?) - that wont fix duplicate items manually added to the download client but it makes sonarr work how it should.

you could purge the import flag for the id after a set amount of time? 24 hours? user configurable per indexer? (probably better so you can match it to each indexers re-seed times). this would cover the majority of scenarios.

i just noticed theres already a re-seed time option for an indexer, could that be utilised for an id import timeout/purge?
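something like this is what im picturing for the timeout idea (just a sketch of the proposal, not how sonarr works today):

```python
# Hypothetical sketch of the "purge the import flag after a timeout"
# proposal above - not Sonarr's current behaviour.
TTL_SECONDS = 24 * 3600  # e.g. matched to an indexer's re-seed window

imported_at = {}  # download id -> time it was imported

def should_import(download_id, now):
    """Import unless this id was already imported within the TTL window."""
    last = imported_at.get(download_id)
    if last is not None and now - last < TTL_SECONDS:
        return False
    imported_at[download_id] = now
    return True

assert should_import("hash-a", now=0) is True       # first import
assert should_import("hash-a", now=3600) is False   # within 24h: skipped
assert should_import("hash-a", now=90000) is True   # flag expired: re-import
```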


I’ve responded on github, but I think this all has already been covered.

closed #19
