Nov 24th release breaks indexers

Since yesterday's update, all my indexers are failing the test, including Wombles. I am now missing shows and have to find them manually.

Under Status > Health I have two warnings:
All indexers are unavailable due to failures
Enabled indexers do not have RSS sync enabled

That feature has been in Sonarr for some time; you're only noticing it now because your indexers have been failing or telling Sonarr to rate-limit requests.

Sonarr will stop using an indexer when it fails multiple times in a row. You can restart Sonarr to have it try again, or test the individual indexers.
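
For anyone curious, the back-off behaviour amounts to an escalating retry delay. Here is a minimal Python sketch of the general pattern, not Sonarr's actual code; the class name and interval values are made up for illustration:

    import time

    BACKOFF_MINUTES = [5, 15, 30, 60, 180, 360]  # illustrative values only

    class IndexerStatus:
        def __init__(self, name):
            self.name = name
            self.failures = 0          # consecutive failure count
            self.disabled_until = 0.0  # epoch seconds; 0 means available

        def record_failure(self):
            # Each consecutive failure pushes the next retry further out.
            self.failures += 1
            step = min(self.failures, len(BACKOFF_MINUTES)) - 1
            self.disabled_until = time.time() + BACKOFF_MINUTES[step] * 60

        def record_success(self):
            # One good response clears the back-off entirely.
            self.failures = 0
            self.disabled_until = 0.0

        def is_available(self):
            return time.time() >= self.disabled_until

A single success resets the counter, so one transient failure won't sideline an indexer for long.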

True, I never had a warning before now, but since the last update I have no working indexers, even after a restart; even Wombles fails the test. I have shows with episodes available on the indexers, but Sonarr is not searching for or finding them.

I have just cleared the logs and restarted; I will post them here later.

OK, here is the log. Search for 15-11-25 20:26:07.6 to see where I cleared the database before rebooting.

Sonarr can’t connect to anything:

System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond 129.129.129.129:80
   at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)
   at System.Net.ServicePoint.ConnectSocketInternal(Boolean connectFailure, Socket s4, Socket s6, Socket& socket, IPAddress& address, ConnectSocketState state, IAsyncResult asyncResult, Exception& exception)
   --- End of inner exception stack trace ---
   at System.Net.HttpWebRequest.GetResponse()
   at NzbDrone.Common.Http.Dispatchers.ManagedHttpDispatcher.GetResponse(HttpRequest request, CookieContainer cookies) in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Http\Dispatchers\ManagedHttpDispatcher.cs:line 54
   at NzbDrone.Common.Http.Dispatchers.FallbackHttpDispatcher.GetResponse(HttpRequest request, CookieContainer cookies) in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Http\Dispatchers\FallbackHttpDispatcher.cs:line 57
   at NzbDrone.Common.Http.HttpClient.Execute(HttpRequest request) in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Http\HttpClient.cs:line 70
   at NzbDrone.Common.Http.HttpClient.Get(HttpRequest request) in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Http\HttpClient.cs:line 188
   at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider.FetchCapabilities(NewznabSettings indexerSettings) in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:line 53
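
If you want to reproduce that failing request outside Sonarr, you can hit the same Newznab capabilities endpoint (t=caps) directly. A rough Python sketch, where the indexer URL and API key are placeholders you'd substitute:

    import urllib.request
    import urllib.error

    # Placeholders -- substitute your own indexer URL and API key.
    INDEXER_URL = "http://indexer.example.com/api"
    API_KEY = "your-api-key"

    url = INDEXER_URL + "?t=caps&apikey=" + API_KEY
    try:
        # A short timeout surfaces the same "did not properly respond" condition.
        with urllib.request.urlopen(url, timeout=10) as response:
            print(response.status, response.read(200))
    except urllib.error.URLError as err:
        # A socket-level failure here points at the network or DNS, not Sonarr.
        print("connection failed:", err.reason)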

Did you change all the IPs in the log to 129.129.129.129? If not, it appears that something else is altering them; a proxy, maybe?

OK, I think I have fixed this, but I'm not sure why it happened. My server is set to DHCP with a static reservation: the router handles the static addresses and gives it the same IP each time. When I ran nslookup on the server, I got a reply saying I had no DNS configured and could not look up any addresses. ipconfig /all showed that my first DNS server was an IPv6 address and my router's IPv4 address was second. Disabling IPv6 on the network connection and rebooting fixed it.
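
If anyone else wants to confirm the same failure mode, DNS resolution can be checked from a script as well as with nslookup. A quick Python sketch, with a placeholder hostname:

    import socket

    HOST = "indexer.example.com"  # placeholder -- use your indexer's hostname

    try:
        # getaddrinfo exercises the same resolver path the OS gives applications;
        # with a broken DNS configuration it raises socket.gaierror.
        for family, _, _, _, sockaddr in socket.getaddrinfo(HOST, 80):
            print(family.name, sockaddr[0])
    except socket.gaierror as err:
        print("DNS lookup failed:", err)

If this raises gaierror while nslookup also fails, the resolver configuration is the culprit rather than Sonarr.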

Going to scan for nasties now

No nasties, as I expected. This is a Windows 7 home-server VM behind a firewall. I don't use it for browsing or email; the browser was only used for software installs, mostly GitHub and the like, so I don't even have an antivirus installed. It's a Windows 7 bug, and a strange one.

I still have the problem and your solution didn't work for me. Has anyone else fixed it another way?

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.