Sonarr multiple downloads of same show same quality

I upgraded to the beta version of SAB; it now includes an option to deobfuscate the files. I checked that, and hope it fixes this moving forward.
2) The access issues you pointed out earlier (unable to access Z:…) I don’t understand. Something changed earlier this year with root folders and remote path mappings, and the difference between the two confused me for the longest time; it was a lot of trial and error. What I ended up with didn’t throw errors in the self-check or (as far as I could tell) in the log file, and it appeared to be working, with exceptions. When I went back, it seemed odd: I hadn’t changed it in well over a year, yet now it had errors in the log. Those errors surprised me, because I comb through the logs dozens of times a year to eliminate them. This year was different in that the setup had worked for years, probably since I switched to v3 when it was first released.

My feedback, after googling for the proper settings for Windows: there are lots of threads asking for help on this and still few clear examples. The FAQ could use more detail, with screenshot examples, especially for setups using external storage, with descriptions for each.

I wrote a lot earlier; could you respond to those questions/suggestions? If my suggestions are too much or not helpful, let me know and I’ll stop. Like the screenshot above: how should it look for a season import if the names were correct?

For the love of God.
OK, since I’m too dumb to figure out something so simple, please can you give me the exact settings I need for the following…
NOTE: before answering this, continue reading, I think I found the source of my problem and how you can fix it for others…

Before the detail, if sonarr can import the files manually, then it has access. Right?
So if I get an access error (which still isn’t showing in the app, only the logs), it has to be an intermittent problem, like antivirus locking the file (disabled it; still happens) or network issues (verified working). It’s strange that it updates dates on files and can manually import, but importing downloaded content fails. What else can I check or try changing? I am using hard links, so it shouldn’t matter anyway…

A VM runs all the software; the host machine runs Plex with a shared drive. On the VM, Sonarr is set up to run as a Windows service, under an account with full access to the share, and the folder has inherited permissions. Some files seemed to import, but I can’t find a log file to prove it.

Ok, I see what happened! At some point (maybe v3?) the remote path mappings section stopped being a free-text entry. It still had an entry, the server name, but that server name was NOT in the drop-down list that replaced the previous text field. Once I selected the host’s IP address from the drop-down, I no longer see the error messages in the log. So when did this change, and why didn’t it migrate properly? Here is an opportunity to improve the user experience!
The Host field is a drop-down list now. I didn’t touch it, as the original string listed there was correct, but it was not an option I COULD select manually… strange!

image

btw - Why are you doing a netsh dump ? I would never want to post the results online…

Finally, here is an updated log file. I see a lot of repeated attempts to import a file, which is the situation where a file is downloaded but won’t import, so, as you described before, it tries to download another version of the file, depending on the history kept in SAB, among other reasons. https://paste.drhack.net/?037dd0bc29c0c570#6Buv4yCBkbtmjfcSPRrsmK5UKfJfnamjtgBjyh8dRt48

There is no migration, because Sonarr doesn’t know what the host is supposed to be if it’s entered wrong.

It was not correct. It needs to match your download client’s Host field; we now show the options from the download clients’ Host fields instead of relying on users to enter it properly. Of course, if the mapping used to work and now doesn’t, it’s because you changed the value of the download client’s Host field and didn’t update the mapping. You still need to fix that manually.

Huh? It’s just what you’ve entered in the download client’s host field.

Are things properly importing now?
The obfuscated names being parsed (even though they shouldn’t be) is something we have an open issue for.

Attempting to parse episode info using directory and file names. The.Pursuit.Of.Happyness.2006.1080p.BluRay.x264-SUNSPOT

Sonarr shouldn’t be attempting to process that. Use categories to separate TV shows from everything else; that will also let Sonarr only grab the TV shows it needs to import (meaning something else downloading won’t bump it from the history).

Please take a moment to think, not react. I’m frustrated to hell because I’m trying to help you see a truth, but I’m getting pushback. So forgive my frustrations below, but sometimes it is necessary to call it out.

I’m making slow progress… But why is it that, even with photos, documentation, and descriptions, the responses I get amount to “the problem isn’t the software, it is you and your inability to follow directions that are clearly posted everywhere”? In other words, it’s nearly always the user’s fault, instead of being open to facts that might mean you need to double-check the logic and logs versus a true bug. I’m telling you, sir, in clear terms:

This was set up and working with no errors in the logs. I have old logs I can send you to show you.

The only thing that was different: I had THE NAME OF THE SERVER (HOST), AND NOW YOUR DROP-DOWN USES IP ADDRESSES.

Why do you skip my questions??? For the love of God, if this is “set wrong by the user,” how the hell has manual import worked just fine for the last 6 months? I never copied anything over manually. You’re implying that I set this wrong.

I get you don’t want to believe someone else actually cares enough about your baby to help you see, but that’s truth.

Please double-check the code, or paste it here so I can review it myself, if you insist it’s my fault. Include the part that changed with the drop-down. You added it for a reason; you said users kept setting it wrong… Maybe it wasn’t the users setting it wrong… Ahem.

Thank you, god bless your hard work. I mean it

? I’m confused. I asked what date you changed Sonarr to a drop-down list. Here are screenshots of what I could see…

As you can see, the host name is there. Why would you not flag it when what is entered does not match the drop-down? That would have saved you and me a ton of troubleshooting, posts, etc. Do you disagree this would have helped at least you and me? All I’m asking is for you to acknowledge that. If you say, “hey Coco, I honestly don’t think this would help anyone else, or we don’t have time,” then that’s cool! Only you would know.

To be clear, I had localhost, and your list was IP-based. That was the only difference. Here is the resolution to prove it was right… (note: sorry, I didn’t realize uploading files inserts them at the bottom. At least I now understand to highlight text to get the quote option.)

Here is a snip of the log file that I was concerned about. I am guessing this is how you populate the drop-down list discussed above…

Yes, it fixed the problem. That’s why I’m asking: can you add a flag in the UI so users like me know to update it?

I am using categories; here you go to prove it (btw, tell me what I need to check if not these things).



Ok, so I got it working in Sonarr, and I’m also running the new Radarr, and it has the SAME problem I had with Sonarr. Here are screenshots of what IT shows, which is exactly what I had to do with Sonarr.

Further, here are screenshots of how things were set, and what I had to do to FIX the drop-down BUG.

Here is the one that USED to work fine:

Do you see it was LOCALHOST? In order to change this to the IP address given this code change, I have to edit it and select the WRONG IP address, which is my only option besides itself:

10-3-host-BUG2

I have the share on the same server as PLEX, and your drop-down ONLY shows the VM’s IP address. I can’t even select the right server?!? Argh. After a ton of trial and error, Sonarr finally showed the IP address of the correct server, and I could finally select it.


k, I had to reboot my system to apply patches. When it came back up, I have the error about being unable to reach the Z: drive root. &#@&$#%% See https://paste.drhack.net/?625126f95b7b2e9f#H3ErHxbiGBYibWFDfBvyqzettVwhb7aY4r1ePd7dLLhE

I went to check the setting, and it changed on its own… In fact, it is completely hosed now: the drop-down options do not include the proper server or IP address! I’m dead in the water now. Your code is broken… It looks like the software is looking for where the download clients are running, but that’s pointless if I’m using a network drive, NAS, etc., which won’t ever get listed. I have no idea how I got it working before.


10-3-host-BUG2

it needs to point to liquiddozen, .171, as the host (the server where the physical files are stored)

now what?

btw, thanks for the abc xyz fix deployment.

root folder newe

above is from when I was able to get it working; the log before that is from when you asked if it was fixed… what gives?

thanks for getting this working.

additional edit:

here is what I get trying to create a new root translation

No, it doesn’t use anything specific; it displays the values you’ve entered in the Host field for your various download clients. Before, it was a simple text box that you could enter anything you wanted into. Go look at the Host fields for your download clients.

Because with manual import you’re selecting the folder to import from, it doesn’t care that your remote path mapping is wrong.

It didn’t suddenly stop working for you without a change on your end; it’s working for thousands of other people, including myself. Sonarr is open source (https://github.com/Sonarr/Sonarr), feel free to go review it.

You’re proving the reason the drop-down was added. How would Sonarr be setting it wrong? When the drop-down was added, nothing was converted; if the value there doesn’t match the Host field for your download client(s), then either the values never matched or those fields were changed at some point (or the mapping never worked).

I’m telling you what’s wrong; I don’t know why it’s wrong, it just is. Whether it worked once upon a time is irrelevant: it doesn’t work now and never would have with the current configuration. If you want to actually fix the problem, select the correct value; if you don’t, then leave it as is and know it won’t work.

When it was changed doesn’t matter; changing to the drop-down didn’t edit any settings, it just enforces the value going forward. You’re welcome to dig through the git history if you need the exact date for some reason.

The values for the drop-down are determined when you open the modal, but the help text right below it is the same as before: if it doesn’t match the Host field for the download client, it won’t work and never would have.

Again, it’s not IP-based; it just pulls the value from the Host field of the download clients.

Again, converting it to a drop-down changed nothing: `192.168.1.162` does not equal `localhost` (string comparison; no DNS or rDNS involved). SAB having an IP address set for the Host field and qBittorrent using `localhost` means only qBittorrent would have worked with the Remote Path Mapping.
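To illustrate the point above: the Host match is a literal string comparison, so spellings that happen to resolve to the same machine are still different values. A minimal sketch in plain Python (not Sonarr's actual code):

```python
# The Host match is a literal, case-sensitive string comparison.
# No name resolution happens, so values that point at the same
# machine on the network are still unequal strings.
sab_host = "192.168.1.162"  # Host field of the SAB download client

for candidate in ["localhost", "127.0.0.1", "192.168.1.162"]:
    # Only the exact same string matches; DNS is never consulted.
    print(candidate, "matches:", candidate == sab_host)
```

Only the last candidate prints `True`, even though all three can refer to the same box; that is why a mapping saved with `localhost` silently stops applying once the client's Host field holds an IP.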

You’re not understanding how remote path mappings work. A remote path mapping tells Sonarr which download client(s) it maps to by matching the Host fields (a case-sensitive string match). When the Host fields match, it does a string replace on the path the download client returned, using the remote path as the part that gets replaced with the local path.

In your case your download client should be saying the files are in Z:\2Finished\, but Sonarr needs to look at the \\liquiddozen\media\2Finished path to do the import.

If you go edit the SABnzbd download client in Sonarr, you’ll see that its Host field is set to 192.168.1.162, and your Remote Path Mapping needs to match that…

I’ve explained what the issue is and how to fix it several times, both in this reply and my previous ones, once the issue was understood (multiple downloads are a symptom of things not importing). Please make the appropriate changes and ensure the Host field for SAB matches the Host field for the RPM. From there, make sure the Remote Path matches the common part of the path Sonarr is receiving from SAB, and that the Local Path is how Sonarr can find those files.
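The mechanism described above (match the Host fields, then swap the remote-path prefix for the local path) can be sketched in a few lines. This is an illustrative Python sketch of the described behaviour, not Sonarr's actual code; the paths and hosts are the ones from this thread:

```python
def apply_mapping(mapping_host, remote_path, local_path, client_host, reported_path):
    """Sketch of a Remote Path Mapping: the mapping's Host field must exactly
    match the download client's Host field (case-sensitive string compare,
    no DNS); on a match, the remote-path prefix of the path the client
    reported is replaced with the local path."""
    if mapping_host != client_host:
        return None  # mapping does not apply to this client
    if not reported_path.startswith(remote_path):
        return None  # remote-path prefix does not match
    return local_path + reported_path[len(remote_path):]


# SAB's Host field in Sonarr is 192.168.1.162, and it reports completed
# jobs under Z:\2Finished, which the Sonarr service cannot reach directly.
mapped = apply_mapping(
    mapping_host="192.168.1.162",
    remote_path="Z:\\2Finished",
    local_path="\\\\liquiddozen\\media\\2Finished",
    client_host="192.168.1.162",
    reported_path="Z:\\2Finished\\Some.Show.S01E01",
)
print(mapped)  # \\liquiddozen\media\2Finished\Some.Show.S01E01

# A mapping whose Host is "localhost" never applies to that same client:
print(apply_mapping("localhost", "Z:\\2Finished",
                    "\\\\liquiddozen\\media\\2Finished",
                    "192.168.1.162", "Z:\\2Finished\\Some.Show.S01E01"))  # None
```

The second call shows exactly the failure mode in this thread: a mapping whose Host is `localhost` simply never fires for a client whose Host field is an IP, so the import path is never translated.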

Thank you for explaining that the ‘hosts’ in the drop-down list for Remote Path Mappings are pulled from the list of download clients. I clearly see the problem, and I hope to help you see it too! I feel like a complete failure because I’ve failed to get you to see the logic error.

Any user with their INCOMING/READY TO IMPORT files stored on a NAS or alternate server will fail. Why would you force them to only be able to pull from machines running the download client? I am sure it seemed like a good idea at the time, but folks like myself with STORAGE LIMITATIONS versus speed store their incoming files remotely. I could also solve this problem by moving my incoming folder to the local VM, but that limits me to the 60 GB drive the VM runs on.

To prove I am correct: I installed BitTorrent on my host machine and created a link to it so that that hostname/IP becomes available. That makes it work!

Ok, for my clarification: the HOST listed in Remote Path Mappings needs to be the SERVER that physically stores the files (YES/NO).
This can be a drive on the machine where SONARR is running, OR another machine, OR a NAS/SAN/DAS drive, OR a cloud drive.

The whole point is knowing how to access these files, and since we need to use services on Windows, that requires a UNC path (\\servername\shareName) and NOT a mapped drive, since service accounts have no “logged-on user with mapped drives”. So we need a mapping. Great!!! Remote path mapping to save the day!!

*Stored files will ALWAYS be on the server listed in the Download Clients - WRONG*
Example: NAS storage for the incoming directory!!

A pet peeve of mine: “it works for me and everyone else” is not justification! It doesn’t mean “the only jerk who reported this is wrong.”

I bet you $100 that every single person with the following setup (most specifically, incoming directory on a NAS or remote server) has this issue.

Reach out, or better yet, TEST this scenario (I am giving you every step I can to show what I am doing…):

  1. Use a NAS box or a share on another computer on the network, meaning the download client (SAB) downloads TO the NAS.
  2. Run v3.0.3.938 or newer, ideally using hydra2 with USENET, with a WINDOWS SERVICE account starting the service.
  3. Have them clear their logs and restart the service.
  4. Have them download a show manually.
  5. Once the download completes, I guarantee, based on everything you have told me thus far, they HAVE ERRORS and the downloads are NOT importing automatically.
  6. Get a copy of the logs and review.

You asked why “everyone else is working?” It is because the setup above is not likely used by the majority of users, especially once you factor in what percent of users:

  1. run Windows
  2. use the v3 test builds
  3. have a service account set up with an NT user ID that MUST be created on BOTH the Sonarr machine AND the NAS/computer hosting the files (note: if there is an easier way that bypasses all these issues but still stores the INCOMING/FINISHED/Plex share on a NAS/another PC, I am all ears)
    we are already probably down to 0.5% of users
  4. have Trace logs ON (0.000000001%) or Debug logs ON (0.000001%) (regular logs do not say WARNING or ERROR)
  5. The UI does NOT FLAG this error in any way; they only find downloads that they have to import manually. 0.001%? This is the icing on the cake.

please please please test this yourself!

Here is my proof:

so I was getting these errors this morning (full log):
“20-9-18 19:56:24.0|Error|DownloadedEpisodesImportService|Import failed, path does not exist or is not accessible by Sonarr: Z:\2Finished\Galactica.80.S01E06.1080p.BluRay.x264-PSYCHD. Unable to find a volume mounted for the path. If you’re using a mapped network drive see the FAQ for more info”

During the time of this log, I added a FAKE download client that is disabled.
image
Now I can select 192.168.1.171. And you can see in the log that towards the end it works again.

I went ahead and stopped Sonarr, cleared all logs, restarted, and here is the result:

I can’t imagine how to explain it further. If there is something that I could do differently that will work with my requirements, I am all ears!

Huh? I don’t follow, we’re not “locking” anything.

What do you mean how does it work? Lots of people use a NAS and Sonarr not on the same machine as the download client.

No, I never said that. I said fix your remote path mapping. You don’t have to hack anything.

Show me a screenshot of Sonarr’s settings for SABnzbd, where it’s using a hostname, not the IP address like remote path mapping shows.


Both host fields need to be the same, doesn’t matter if you think they should be different, they need to be the exact same.

somehow deleted this post. arg!

Again, the Host field isn’t telling Sonarr which host the files are on (it never has), it’s telling Sonarr which download client(s) to apply that mapping to. That coupled with the remote path matching will replace that portion of the path with the local path.

You need to select 192.168.1.162 as the Host, so it matches the host in Sonarr’s SAB settings.

The help text for each of these fields (briefly) explain what they’re used for.

After you make that change, clear your log files, go to Activity: Queue and refresh the queue. After a few minutes, check the queue again. If the remote and local paths in the remote path mapping are correct, things should import; if not, post the logs of the failing imports, hover over the icon in the Activity Queue, and capture the error.

At this point you need to stop arguing that what I’m telling you to do is wrong and stop ignoring the solution that is being presented to you, if you can’t do that I can’t help you.

[quote=“markus101, post:30, topic:26556”]
Again, the Host field isn’t telling Sonarr which host the files are on (it never has), it’s telling Sonarr which download client(s) to apply that mapping to. That coupled with the remote path matching will replace that portion of the path with the local path.
[/quote]

I’m defeated. I promise you: I have done the things you have asked. When it didn’t go as expected, I tried to keep it moving forward, that’s all. I will only do exactly as you say and will document as I go to prove I am doing (and have done) what you asked. My mistake was trying to get us there quicker by finding a fix when what you asked me to do didn’t work as expected.

I removed all remote mappings, added the remote mapping as specified.
Here are logs.

image

Sonar.txt

Sonar.trace.3.txt

sonarr.trace.2

sonarr.trace.1

sonarr.trace.0

sonarr.trace

I had a glitch that stopped me from gathering the two other items you requested. So I will start over and do it all again. stand by for another attempt.

The glitch is from the app. It is actually deleting the finished directory. I recreated the steps and it deleted it again. This might be why I get the “unable to access” error when set to .162, which is the IP of the same machine running Sonarr, SAB, and qBittorrent. Can you see why I have a hard time accepting that (edit: the app would need to know localhost to work?)? Needing to know where Z: is mapped makes sense, hence the other IP.

I will upload a video of all the steps I took, as per your directions, that lines up with the timestamps in the logs above.

I can do it a third time, running a process monitor (which shows process access to files, like forensics) to prove Sonarr deleted it and not something else. Plus, I have found a way to force the network drive to move deletions to the server’s recycle bin instead of deleting permanently. Capturing the timestamp when that happens, along with the Sonarr logs, will also line up.

Even if you are certain I’m wrong, wouldn’t you want to be 100% certain there isn’t a hidden bug that could wipe out users’ downloads?

Let me know either way? I don’t mind heavy lifting

I’ve done what you asked.

which it will do if it’s told the files are ready and then finds nothing in there at all (i.e., it has access and can see in there, but there are no files it wants; possibly obfuscated filenames, as anything close enough tends to trip a manual import/help required)

I’m not sure it’s considered a bug, but there are circumstances where Sonarr will see the import folder as completely empty (no valid files in it) and will purge it and download whatever is next, which should not be the one it just downloaded (so you shouldn’t run into the loop issue)

If you know the file has the correct data, you can download it via Sonarr again, but then go into SAB and change its category away from tv so that Sonarr will not process it. You can then manually import the resulting folder, or try to, but you will at least know why it purged it if the filenames are all gibberish.

ps. When attaching logs, it’s also nice to have at least the name of the series/episodes you had issues with. It saves having to trawl through them line by line; we can instead search for that particular file and work out what went wrong with it (even if there are possibly other issues in there as well).

e.g., you’re still getting this, when you really shouldn’t be by now if you had the remote path set up correctly:
20-10-5 20:03:37.4|Error|DownloadedEpisodesImportService|Import failed, path does not exist or is not accessible by Sonarr: \\liquiddozen\media\2Finished\The.Simpsons.S32E01.1080p.WEB.H264-VIDEOHOLE\. Ensure the user running Sonarr has access to the network share

If you believe you do have it set up correctly, then please ensure the account Sonarr is running as has access to that location (especially if it’s a Windows service)

Could you post a screen snip of your SAB download client entry and the remote path mapping, please? Confirm the hostname in both is exactly the same.

It won’t delete the folder if it doesn’t import anything, or if, after import, there is still a valid video file in there.

DownloadedEpisodesImportService|Processing path: \\liquiddozen\media\2Finished

This is suspicious, because Sonarr shouldn’t be processing the root of the download folder. Since there is no longer a Drone Factory, either the download client is reporting that path or something is telling Sonarr to import that path via the API.

These are the same file.

If it imports it and there is no valid video file it will do that.

Looking at the paths that Sonarr is getting from SAB, it’s showing: Z:\\2INbound\\!nzb\\Deadliest.Catch.The.Bait.S03E03.More.Pain.Less.Ga

What is the UNC path for Z:\? Just \\liquiddozen\media\?

Definitely something here, the Host part seems to be resolved, but something with the remote or local path seems wrong.

This for sure, we need to see exactly what you’ve set up.

If you want a more real-time solution, your best bet is joining the Discord; someone there can help. I can’t help via DM or by remoting into your machine.

[quote=“markus101, post:34, topic:26556”]
If you want a more real time solution your best best is joining discord and someone there can help, I can’t help via DM or by remoting into your machine.
[/quote]

I just wanted you to know it is an option that I can make happen if it would help in any way.

I’ll upload the video, screen-grab the settings you asked about above, and answer questions in about 6 hours.

There were tons of files and folders in the directory: a folder with 10 years’ worth of .torrent files, downloads from a torrent client that I had downloaded manually, etc.
The folder wasn’t empty. The deletion happened automatically after the download completed for missing episodes that were somehow copied from another show. I hoped the log files would show it. My clients move completed files into this directory. I would have expected deletion to happen only after a successful move, and only for files expressly told to move/delete.

This is known behavior then: is it expected/designed, or a “need to identify and fix” situation? In either case, I gather it would help to know what triggered the deletion. There were only a couple of active shows, and they were not obfuscated. They were Airplane Repo; I was trying to re-download shows that were incorrectly put into the folders. More specifically, I had Bar Rescue episodes in the Airplane Repo folder on PLEX, among others. I had not seen this behavior before, but it happened in the last week. I deleted the shows from Plex, went to Sonarr, and manually downloaded the episodes I was missing. Then BANG, blood everywhere :wink:

I enabled a recycle folder hoping it might help determine the problem. I saw support files for the show (.txt, .nfo, etc.) there, but NOT the video file, which was downloaded and went into the bit bucket.

I haven’t seen the loop issue happen again in the last week. Now it is just getting the settings right and expected outcomes.

I am admittedly new to SAB/USENET for obtaining content. I never had anything get deleted until I changed the HOST to .162.

I HATE to beat what looks like a dead horse, but why would you need the host in the remote path mapping if it is just where the download client resides and NOT where the client PUTS its files?

The FAQ states: " Does Sonarr require a SABnzbd post-processing script to import downloaded episodes? No. Sonarr will talk to your download client to determine where the files have been downloaded and will import them automatically… If Sonarr and your download client are on different machines you will need to use Remote Path Mapping to link the remote path to a local one so Sonarr knows where to find the files."
My Sonarr and Download clients are on the same machine, so according to this, I don’t need remote Path Mapping.

In fact, in the list of questions at the top, UNC paths / remote mapping, etc are not even listed.

True. The series I had issues with was technically ALL of them, since the folder was deleted; in other words, I was using the app as normal. I just didn’t see anything import, investigated, and shat my pants. I normally try to include this. I included the normal Sonarr log file, to save going to the debug ones, hoping it would help narrow the focus.

I agree. I added the remote mappings EXACTLY as told. Host = 192.168.1.162
I haven’t been told exactly what to set for the remote path and local path (Z:\ and \\liquiddozen\media, OR Z:\2Finished and \\liquiddozen\media\2Finished). If you scroll back, I have screenshots with this over and over. In the past, I had BOTH added, but despite all the back and forth, the FAQ, documentation, and 60+ hours of searching on Reddit, forums, etc., I am still NOT CLEAR what the Remote Path should be, since either I get an error message, items don’t import at all, some items don’t import, or now the folder gets deleted.

So Sonarr, when running as a service, can’t use mapped drives, as expected, so it needs UNC paths. Sure! In my mind, that means I need to help Sonarr understand that the Z: drive referenced in my download client needs to be translated into a UNC path. That is why I was setting HOST to 192.168.1.172 (the server hosting the files, think NAS), Remote Path to Z:\2Finished (the setting in my client), and Local Path to \\liquiddozen\media\2Finished (the UNC translation). To me, that would be what the system MUST know to work. But I am told that isn’t the case; I need to indicate where the download client resides. Ok, so how the hell does Sonarr ever figure out WHERE to get the files? Z: drive is meaningless, and .162 doesn’t have the files. Does that make sense of why it is confusing? I still haven’t seen Sonarr WORK automatically, except when I changed the remote mapping to .172, and then only SOME things moved, not all. I have officially pulled ALL my hair out. :wink: I suppose that so long as the Local Path is correct, it CAN find the files. I just don’t get the point of HOST.

The app states HOST is “The same host you specified for the remote Download Client.” Huh? Specified in the download client, where? What setting? There is no such setting in the download client; no HOST is specified in the client. Oh, perhaps this means “the host name used for the download clients LISTED ABOVE; if you have multiple hosts listed, you will need multiple Remote Path Mappings, one for each host.” (Along these lines, in qBittorrent I had used LOCALHOST, and for SAB I had 127.0.0.1 because LOCALHOST didn’t work. Both are valid and it shouldn’t matter, but does that mean I do indeed need two remote path mappings to translate those hosts?) According to the FAQ I don’t need a Remote Path Mapping at all, which is definitely not true.

The app states Remote Path is “Root path to the directory that the Download Client accesses.” Ok, so my torrent client is set to “Z:\2Finished”, so I would assume that is what to put in SONARR, CORRECT?

The app states Local Path is “Path that Sonarr should use to access the remote path locally”, so that would be \\liquiddozen\media\2Finished.

Sorry for the long reply; I just want to put ANY part of this to bed! :slight_smile:

The setup may have failed because I used the root of the folders for the remote mappings (since I had two, I wanted to find out which one was correct; I must have chosen the wrong one). My logic was that you were obviously pulling what the client was pointing at, and it should be smart enough to understand the folder structure. I was thinking, “In Windows, you just need to point to the \\server\share; the rest is handled in the path. The path to 2Finished was indicated in the download client, so stop there.”

In an ideal world, it would be great to distill exactly what Sonarr needs and make sure the verbiage makes sense to both Windows and Linux users, and perhaps include a few examples, like: you have Sonarr on HST1 and the download client on HST2, with files going to \\server\share\folder1\folder2; the remote path mappings should be the following (screenshot here).

When it failed I had the following:
Host: 192.168.1.162
Remote Path: Z:\
Local Path: \\liquiddozen\media

I went to add a new remote path mapping, and the HOST drop-down is blank; there are no options at all.

thanks, my bad. Here is the FIXED sonarr.trace.0
Here is the requested screenshot of SAB

And finally, to ensure you understand my setup

Shared on .171 , Z drive mapped to .171\media

media -> 1 TV (plex / final resting place)
media -> 2InBound (staging area for torrents/SAB as download is underway)
media -> 2Finished (staging area for completed downloads)

Root folders are where Sonarr moves the imported files to; you never need a remote path mapping for that (as far as I’m aware), but it has to be somewhere Sonarr can access.

Remote path mappings are for the download clients’ completed job paths, for when Sonarr cannot access those.

on the proviso that 192.168.1.162 is also what you have in the host field for the download client this remote path mapping will apply to

It’s basically: what part of the download client’s completed job path do you want Sonarr to strip out (remote path) and replace with the new value (local path), so that the resulting path is accessible to Sonarr.

I would expect that if the mapped drive Z: is not accessible by a Windows service, then a UNC path may work (hopefully). So yes, a remote path of Z:\ and a local path of \\liquiddozen\media would work for that download client, as would your extended version, but you only have to put in enough for it to generate a valid path from Sonarr’s point of view.
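The "you only have to put in enough" point can be shown concretely: both the minimal mapping (Z:\ to the share root) and the extended one (the whole 2Finished folder) produce the same UNC path. An illustrative Python sketch of the prefix replacement described above (assumed behaviour, not Sonarr's code):

```python
def translate(reported_path, remote_path, local_path):
    # Prefix replacement: swap the leading remote_path for local_path.
    if not reported_path.startswith(remote_path):
        return None  # mapping's remote path doesn't apply to this path
    return local_path + reported_path[len(remote_path):]


# A completed job path as SAB would report it (drive letter, not UNC):
job = "Z:\\2Finished\\The.Simpsons.S32E01.1080p.WEB.H264-VIDEOHOLE"

# Minimal mapping: strip just the drive root...
short = translate(job, "Z:\\", "\\\\liquiddozen\\media\\")
# ...or the extended version mapping the whole folder:
extended = translate(job, "Z:\\2Finished", "\\\\liquiddozen\\media\\2Finished")

print(short)
print(short == extended)  # True: both yield the same UNC path
```

Either way the service ends up with `\\liquiddozen\media\2Finished\...`, which it can reach without a mapped drive; the shorter mapping just covers every folder under Z:\ at once.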

If Sonarr still spits out access errors, then you’ll need to work out how to get a Windows service to work with UNC paths.

It appears that you will need to create a remote path mapping for each download client that needs one, even if it’s the same as one used by another download client (they seem to be tied to an individual download client, not global).

that's at least 4 days old, is that correct? do you have anything more recent that contains the issues you are having?

the remote path issue (or it could be a UNC access issue) is in there, but we know about that one - are there other issues or is that it?

i missed this, and yes you're right, you shouldn't need any remote path mappings on the proviso that sonarr can access the completed job path presented to it by the download clients you have.

your issue is that the completed job data is not being stored locally, so there is no common local path to the data - only a remote path - that you could use across all of the apps

you seem to have sab set up to use Z:\2Finished, which we know will not work for a windows service - so it will need a remote path mapping (is sab running as a service?) - or you can change the completed path in sab to the UNC and then you won't need the mapping in sonarr

your torrent download client (is it running as a service as well?) i presume is using the \\liquidmedia UNC? not sure, but whatever it's set to will determine if a remote path mapping is required

note - there are a few potential solutions (applies to your windows vm) in here https://stackoverflow.com/questions/182750/map-a-network-drive-to-be-used-by-a-service for setting up a mapped drive at the system level that may (or may not) work for you. if it does work then you can set everything to use the Z: drive and not require any remote path mappings

Correct, Remote Path Mappings only apply to importing from the download client.

If Sonarr can’t access the series folder it can’t import, ever.

Not worth it. If @Cocokola can't/doesn't want to set up the remote path mapping correctly, then running Sonarr via the startup folder (after removing the Windows service) is the best solution. At this point, given the continued issues, that's all I'm going to recommend doing.

I showed you screenshots of exactly what you said to do, and included the log files to show the outcome above, like I had done several times before. I can upload a recording of the screen if you would like to see for yourself.

Why would you insist I'm not doing what you asked? What am I not doing? Please look at my screenshots. I'm livid. I can't make it work the way you keep saying will fix it.

Here, maybe this will help you see that it isn't me.

I noticed Radarr has been updated as well, and it now shows additional errors live as you enter the remote mapping.

When I set it the way you say, it accesses the files - it shows the directory and files - but it still reports an "unable to access" error. Keep in mind, I entered only \\server\share, and it shows the folders at this level; I used the display to select the right directory... but it still gets the error.

That error goes away when all I do differently is select the share that .162 is pointing to.

Could anyone else please connect to my machine to prove this? Since my word is shite.

I'm a twenty-year Windows server vet. I can assure you the permissions, shares, and UNC paths are correct. I have seen plenty of errors that indicate access issues when that was actually the cause.

I guess: would everything just be easier if I removed the service and added Sonarr to the startup folder with an auto-login user?

That is the log file that markus101 advised was a duplicate... The logs are tied to the issues I'm having. I've documented it multiple times; I can keep sending newer files, but to what end?

The remark about not needing remote mapping was somewhat tongue-in-cheek... I'm trying to show why this is confusing; some info could be better explained.

I guess I'm stuck. I added the remote mapping, although maybe not correctly (local = Z:\ or Z:\2Finished AND remote = \\liquiddozen\media or remote = \\liquiddozen\media\2Finished).
From radarr, same issue:


if radarr is having the same issue then it's pretty obvious that that path is not accessible to either of them.

i.e. \\liquidmedia as a UNC path does NOT appear to be valid for sonarr or radarr when run as a windows service. you need to find something that is valid for them so that you can set up the remote path mapping correctly.

note - if that path requires a username/password you won't get them to work, as you cannot pass that through, and windows services don't run with those credentials