Sonarr version (exact version): 4.0.10.2544
Mono version (if Sonarr is not running on Windows): 6.12
OS: unraidOS 6.12.4
Debug logs: Debug Logs, Trace Logs (attached)
Description of issue:
Loading the homepage for Sonarr shows a message: Sequence contains more than one matching element
Looks like something is wrong with the series API; from other posts it seems like possibly an issue with the TVDB API returning multiple entries for a series. I poked around the API and it looks like /api/v3/series returns this error. I can’t figure out which series is causing it or what I need to do to resolve it. Any ideas?
Also, I’m not sure how to find my Mono version; which mono doesn’t seem to return anything in the Sonarr docker container or on the unraid host. The specific docker image I have for Sonarr is from linuxserver, so if I trust the GitHub page it should be Mono 6.12 as of April 28, 2022.
The most recent thing I did was updating to v4, as well as deleting content from some series being tracked.
v4 no longer uses mono.
It looks like there is a series with duplicate seasons, which causes this to fail. The upgrade to v4 wouldn’t have caused that; it’s something wrong coming from TheTVDB.
Outside of looking at the DB, tracking down the problematic series is going to be near impossible, though we should be able to fix it so it doesn’t fail completely like you’re seeing.
Hey Markus, thanks for the response. I guess I’ll wait for a fix. Is there anything I can do to try and figure out which series is causing it? I tried querying the API directly by series IDs pulled from the sonarr.db file, looking to see if any returned multiple entries, but I didn’t spot any. I also noticed that a lot of them were 404’ing, though, so I paused there to make this forum post.
I’m handy enough with this kind of stuff that I can run some queries and API calls to try and see which series is returning multiple entries, but I don’t know enough about the structure of the database and how the API calls work. If you have any starting point I can keep hacking away at it. On the other hand, if you think the fix would be relatively fast, then maybe I’ll just wait.
Thanks again!
It’ll be a duplicate season in the Seasons column of the Series table in sonarr.db, but since it’s a JSON string in there you probably can’t easily query for it.
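For what it’s worth, the JSON hurdle can sometimes be worked around with SQLite’s built-in JSON functions (json_each/json_extract, available in recent sqlite3 builds). This is only a sketch: the shape of the Seasons column (an array of objects each carrying a "seasonNumber" field) is an assumption here, so check a real row first. It runs against a throwaway demo DB; point DB at a copy of your real sonarr.db (with Sonarr stopped) instead.

```shell
#!/bin/sh
# Sketch: hunt for a duplicate season inside the Seasons JSON column.
# ASSUMPTION: Seasons is a JSON array of objects with a "seasonNumber" key.
DB="$(mktemp)"   # throwaway demo DB; use a copy of sonarr.db in practice

# Demo fixture: series 1 repeats season 2 in its Seasons JSON.
sqlite3 "$DB" "CREATE TABLE Series (Id INTEGER PRIMARY KEY, Title TEXT, Seasons TEXT);"
sqlite3 "$DB" "INSERT INTO Series VALUES
  (1, 'Show A', '[{\"seasonNumber\":1},{\"seasonNumber\":2},{\"seasonNumber\":2}]'),
  (2, 'Show B', '[{\"seasonNumber\":1}]');"

# json_each() expands each Seasons array into rows, so duplicates group out.
# prints: 1|Show A|2|2
sqlite3 "$DB" "
  SELECT s.Id, s.Title,
         json_extract(j.value, '$.seasonNumber') AS SeasonNumber,
         COUNT(*) AS Copies
  FROM Series AS s, json_each(s.Seasons) AS j
  GROUP BY s.Id, SeasonNumber
  HAVING COUNT(*) > 1;"
```

Any row it prints is a series whose Seasons array lists the same season number more than once.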
OK, I went and dumped everything, used some grep + regex magic, and I don’t see a duplicate season in the Series table.
Here’s the formatted output in case I’m just blind and you can see a duplicate. Is there anything else to check for? Just before this I was having an odd issue when changing the monitoring status on The Daily Show, where the UI looked like it was bugging out and constantly flipping one of the seasons between monitored and unmonitored. It’s happened to me in the past without any consequence, so I wasn’t sure if that was related. I was also in the middle of removing a lot of local files via Sonarr.
I attached the entry for The Daily Show here in case there’s something interesting there.
I also ran the queries against ../appdata/sonarr/sonarr.db, so let me know if that’s not the right place either.
Also, I don’t know if this is interesting or not, but I don’t see any errors when I run this:
for S in $(sqlite3 /mnt/user/appdata/sonarr/sonarr.db "SELECT Id FROM Series;") ; do curl -s "http://localhost:8989/api/v3/series/$S?apikey=<key>"; done
I just get what appears to be the correct output.
Are there any errors when the series refresh happens? Kicking one off via the API (see the Sonarr API Docs; the command name is refreshSeries) for all series should refresh them all, even ended ones.
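A hedged sketch of sending that command with curl, for anyone following along: the /api/v3/command endpoint, the X-Api-Key header, and the payload shape are my reading of Sonarr’s API docs, so verify them there, and the seriesId field is an assumption for targeting a single series. By default this only prints the request (DRY_RUN=1) rather than hitting a live server.

```shell
#!/bin/sh
# Sketch: queue a RefreshSeries command via Sonarr's command endpoint.
# ASSUMPTIONS: endpoint path, header name, and payload keys -- check the
# Sonarr API docs before relying on this.
BASE="http://localhost:8989/api/v3"
API_KEY="${SONARR_API_KEY:-changeme}"   # placeholder, not a real key
PAYLOAD='{"name": "RefreshSeries"}'     # assumed: add "seriesId": N to refresh one series

if [ "${DRY_RUN:-1}" = "1" ]; then
  # Dry run: show what would be sent instead of sending it.
  echo "POST $BASE/command $PAYLOAD"
else
  curl -s -X POST "$BASE/command" \
       -H "X-Api-Key: $API_KEY" \
       -H "Content-Type: application/json" \
       -d "$PAYLOAD"
fi
```

Run with DRY_RUN=0 and a real API key to actually queue the refresh; the response should describe the queued command rather than being empty.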
Fetching a single series does the same lookup, so I’m not sure why that would work while getting all series is broken.
I just get an empty response back when I send the refreshSeries command. I tailed the debug log file when I ran the command and saw no errors come back.
Thanks for checking. That should update everything and clear out any duplicate seasons, so possibly it’s not that, though I don’t see how it could be anything else.
Would you be able to DM me a copy of your database with download clients, indexers and connections removed? That should strip out any passwords/API keys and let me see what might be going on.
Hey Markus, I DM’d you the copy of the database, using Mega to upload it since it was ~500 MB. Let me know if you’d prefer it somewhere else. Thanks for all the help here!
Despite showing no obvious errors or signs of corruption, your DB was heavily corrupted. The first affected series I found had the ID 337, so I used that one. When getting all series, that series has 30 seasons but returned 31: both season 4 and season 6 were duplicated, and season 7 just didn’t exist.
When I hit the API for just that series there were over 495 seasons, some of which were duplicated, but many appeared to match the IDs of episodes rather than season numbers.
You can run this particular query on the corrupt and recovered copy to see what I mean:
SELECT "Episodes"."SeriesId" AS SeriesId,
       "Episodes"."SeasonNumber",
       COUNT(*) AS TotalEpisodeCount,
       -- SUM(CASE WHEN "AirDateUtc" <= '2024-11-30 11:45:13PM' OR "EpisodeFileId" > 0 THEN 1 ELSE 0 END) AS AvailableEpisodeCount,
       -- SUM(CASE WHEN ("Monitored" = 1 AND "AirDateUtc" <= '2024-11-30 11:45:13PM') OR "EpisodeFileId" > 0 THEN 1 ELSE 0 END) AS EpisodeCount,
       -- SUM(CASE WHEN "EpisodeFileId" > 0 THEN 1 ELSE 0 END) AS EpisodeFileCount,
       -- MIN(CASE WHEN "AirDateUtc" < '2024-11-30 11:45:13PM' OR "Monitored" = 0 THEN NULL ELSE "AirDateUtc" END) AS NextAiringString,
       -- MAX(CASE WHEN "AirDateUtc" >= '2024-11-30 11:45:13PM' OR "Monitored" = 0 THEN NULL ELSE "AirDateUtc" END) AS PreviousAiringString,
       MAX("AirDate") AS LastAiredString
FROM "Episodes"
WHERE "Episodes"."SeriesId" = 337
GROUP BY "Episodes"."SeriesId", "Episodes"."SeasonNumber";
After recovering the DB by dumping and restoring, everything was working again.
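The dump-and-restore step above is the standard SQLite recovery technique: .dump serializes the whole database as SQL, and replaying that SQL into a fresh file rebuilds it cleanly. A minimal sketch with example paths (stop Sonarr before touching the live file; the demo fixture here just stands in for a real corrupt DB):

```shell
#!/bin/sh
# Sketch: rebuild a SQLite DB by dumping to SQL and restoring into a
# fresh file. Paths are examples; SRC would be your sonarr.db copy.
SRC="$(mktemp)"   # stand-in for the corrupt DB
DST="$(mktemp)"   # the recovered DB

# Demo fixture so this runs standalone.
sqlite3 "$SRC" "CREATE TABLE Series (Id INTEGER PRIMARY KEY, Title TEXT);
                INSERT INTO Series VALUES (337, 'Example');"

# .dump emits the whole DB as SQL; piping it into a fresh file rebuilds
# every table and row, shedding page-level damage in the old file.
sqlite3 "$SRC" ".dump" | sqlite3 "$DST"

sqlite3 "$DST" "PRAGMA integrity_check;"   # prints: ok
```

On the real file you would then swap the recovered DB into place of sonarr.db and restart Sonarr.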
PS, I should have been more specific about removing those items: removing the entire tables wasn’t what I meant. DELETE FROM X would have worked much better.
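The distinction matters because DELETE FROM empties a table but keeps its schema, while DROP TABLE removes the table entirely, which is what made the shared copy harder to inspect. A small demonstration against a throwaway DB (the table name here is just a stand-in):

```shell
#!/bin/sh
# Sketch: DELETE FROM vs DROP TABLE on a throwaway SQLite DB.
# "Indexers" is a stand-in name, not necessarily Sonarr's real schema.
DB="$(mktemp)"
sqlite3 "$DB" "CREATE TABLE Indexers (Id INTEGER PRIMARY KEY, Settings TEXT);
               INSERT INTO Indexers VALUES (1, 'secret');"

# DELETE FROM removes the rows (the sensitive data) but keeps the table.
sqlite3 "$DB" "DELETE FROM Indexers;"
ROWS_AFTER_DELETE="$(sqlite3 "$DB" "SELECT COUNT(*) FROM Indexers;")"
echo "rows after DELETE FROM: $ROWS_AFTER_DELETE"   # prints: rows after DELETE FROM: 0

# DROP TABLE removes the table itself, schema and all.
sqlite3 "$DB" "DROP TABLE Indexers;"
```

So stripping credentials from a DB copy with DELETE FROM leaves the schema intact for whoever is debugging it.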
Thanks for the help on this! Sorry about that; I just thought I’d drop the tables, and I hope it didn’t make the troubleshooting too much harder. After recovering, everything seems to be working great!