Sonarr version (exact version): 2.0.0.3953
Mono version (if Sonarr is not running on Windows): 4.2.1 (Stable 4.2.1.102/6dd2d0d Thu Nov 12 09:52:44 UTC 2015)
OS: Ubuntu 14.04
Debug logs (posted to hastebin or similar):
It would be useful to have debug logs, not just the warning, or better yet trace logs, so we have all the information required. The error might be related to ciphers not supported by Mono and cURL not being available.
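If it helps to narrow that down, here's a rough Python sketch for checking which TLS version and cipher the NZBGet HTTPS endpoint negotiates; the host and port below are placeholders, not taken from this setup:

```python
# Minimal sketch: see which TLS version and cipher NZBGet's HTTPS
# listener negotiates, since older Mono builds only speak a limited set.
# HOST and PORT are placeholders -- substitute your own NZBGet address.
import socket
import ssl

HOST = "192.168.1.123"   # hypothetical NZBGet address
PORT = 6791              # hypothetical NZBGet HTTPS port

ctx = ssl.create_default_context()
ctx.check_hostname = False          # self-signed certs are common here
ctx.verify_mode = ssl.CERT_NONE     # diagnostic only; don't reuse in real clients

with socket.create_connection((HOST, PORT), timeout=5) as raw:
    with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
        print("TLS version:", tls.version())
        print("Cipher     :", tls.cipher())
```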
If NZBGet and Sonarr are on the same system you don’t need SSL in the middle; if anyone were able to sniff that traffic it’d be too late, as the machine would already be compromised. Even if it’s on the same network that you control, you’re screwed anyway if anyone can intercept it, since they already have access.
“If NZBGet and Sonarr are on the same system you don’t need SSL in the middle,”
This is only true if I’m doing the communication between the services over localhost rather than the external address, correct?
I am still relatively new to networking and whatnot. I have all my services up and working together, but I’m attempting to get it all secure. This is pretty much my first experience with SSL.
I have all my services available over external URLs, but I’m thinking the communication between them all should still be handled internally. Would that be the best practice in this situation?
I was unable to connect via localhost or 127.0.0.1, but using the internal 192.xxx IP I have Sonarr and NZBGet communicating again.
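For anyone hitting the same thing, a quick way to see which addresses the NZBGet port actually answers on is a loop like the sketch below; the port and LAN IP are placeholders. If only the LAN address works, NZBGet is probably bound to that interface only.

```python
# Quick check (sketch): which addresses accept connections on the NZBGet
# port? If localhost/127.0.0.1 fail but the LAN IP works, the listener
# is likely bound to that one interface. PORT and the IP are placeholders.
import socket

PORT = 6791  # hypothetical NZBGet port
for host in ("localhost", "127.0.0.1", "192.168.1.123"):
    try:
        with socket.create_connection((host, PORT), timeout=3):
            print(f"{host}:{PORT} -> reachable")
    except OSError as exc:
        print(f"{host}:{PORT} -> failed ({exc})")
```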
From my understanding, anything I directly interact with needs to be available at its external address over HTTPS, while communication between the services should be handled internally with no need for HTTPS?
It’s fine for localhost, 127.0.0.1, or the local private IP address of the computer; they all stay within the same system.
If you’re accessing services externally, SSL is definitely recommended. I don’t see an advantage to connecting to NZBGet through a reverse proxy, though; it’s extra effort and it’s not going to provide meaningful security.
When I set up all my services I wanted to give them all clean URLs. I may be technical enough to deal with ports and all that, but not everyone in the house is. Sticking NZBGet behind the proxy lets me give a clean URL to other people in the house, nzbget.ourdomain.org, rather than 192.168.1.123:1234 or something.
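As a quick sanity check that the clean hostname and the direct internal address both reach NZBGet, something like this Python sketch works; both URLs are just the placeholders from above, and an HTTP status such as 401 still means the service answered:

```python
# Sketch: confirm both the proxied hostname and the direct internal
# address respond, so household users can use the friendly name while
# inter-service traffic stays on the LAN address. URLs are placeholders.
import urllib.error
import urllib.request

for url in ("https://nzbget.ourdomain.org", "http://192.168.1.123:1234"):
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(url, "->", resp.status)
    except urllib.error.HTTPError as exc:
        # An HTTP error status (e.g. 401 when auth is required) still
        # means the service answered.
        print(url, "->", exc.code)
    except (urllib.error.URLError, OSError) as exc:
        print(url, "-> unreachable:", exc)
```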
All of my services are now over SSL when reached via their external URLs, and all the inter-service communication - CP > NZBGet, Sonarr > NZBGet, etc. - is done via the internal IP.
I know you didn’t actually resolve my original issue, but you pointed out that what I was doing was unnecessary to begin with and gave me a better solution, so I’d say this issue is resolved, or however you mark them here. Thanks!