Turn off scene mapping option per series or per profile for web-dl

It’s pretty frustrating that there are correctly named, high quality releases (web-dl) which sonarr is able to detect and associate with the correct episode via site integration, but which it then rejects because they don’t match the incorrect naming convention of the lower quality (hdtv rip) scene releases, a convention that has to be remapped back to the correct names through extensive effort (thexem). Wouldn’t it make sense to be able to disable that, either globally, per series, or ideally per profile, so that we can have access to releases from high quality sites that use correct (non-scene) naming for their non-scene releases?

It would be especially awesome to be able to set up a profile that can pull hdtv releases (often released sooner), and rename those, and then upgrade them with web-dl, with the expectation that the web-dl will be named correctly as-is.

In the profile settings page, when you’re creating or editing a profile, you select which qualities to grab and put them in preference order. It would be great if you could toggle scene-mapping on and off per quality level in that view, perhaps hidden under an “advanced” toggle.
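Just to make the idea concrete, here’s a rough sketch of how such a per-quality toggle could behave. This is purely hypothetical: nothing like it exists in sonarr today, and every name and field below is invented for illustration.

```python
# Hypothetical sketch only: Sonarr has no such setting today. The profile
# structure and field names below are invented for illustration.
from dataclasses import dataclass

@dataclass
class QualityItem:
    quality: str                     # e.g. "HDTV-720p", "WEBDL-1080p"
    allowed: bool
    use_scene_mapping: bool = True   # the proposed per-quality toggle

profile = [
    QualityItem("HDTV-720p",   allowed=True, use_scene_mapping=True),
    QualityItem("WEBDL-1080p", allowed=True, use_scene_mapping=False),
]

def should_apply_scene_mapping(release_quality: str) -> bool:
    """Decide whether a release of this quality goes through the scene-numbering map."""
    for item in profile:
        if item.quality == release_quality and item.allowed:
            return item.use_scene_mapping
    return True  # default: keep current behaviour for anything not configured
```

Defaulting the toggle to on would keep today’s behaviour for anyone who never touches it.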

American Dad is a good example case: if you want things to be automatic, you have to have global file renaming activated AND you can’t grab web-dl or bluray/dvd releases, because their naming won’t match the scene naming, so sonarr will just refuse to grab them, even though it sees them when doing a manual search and therefore knows that the site has matched the specific release to the proper episode. So you’re stuck with low quality HDTV rips or manually downloading every episode.

A little extract from When will Sonarr allow mapping edits? - #7 by Taloth

There has been talk about the requirements for TheXEMv2 (including mappings that are currently impossible to do), but time is scarce and there aren’t actually a lot of ppl (read: none) volunteering to work on it… I don’t blame em, coz designing a complete backend and fluent webui for it isn’t something you do in a weekend… or even several weekends.
But it demonstrates a symptom I often notice: It’s easy to complain (or even rant) about lacking features or UI, but ppl rarely offer true ‘community’ solutions.

That was a discussion about TheXEM specifically, not Sonarr. But it’s basically about the same thing.

As for your question: we won’t add an option to Sonarr. The problem needs to be fixed by creating the next generation of TheXEM. That way “it just works” for everyone.
However, the new WEB+WEBRip scene rules will likely complicate it.

That’s unfortunate, since it means low quality scene-only downloads plus required global renaming for all series if any of them use scene naming that doesn’t match thetvdb, even though the functionality to handle this correctly already exists in sonarr.

I don’t see how thexem can resolve the issue, since sonarr already has an internal model for both naming conventions. Conceptually, sonarr either has to use this knowledge or not, and it’s opaque from the perspective of sonarr / thexem as to whether some particular release is using one naming convention or the other. There’s no way to tell externally aside from the site integration that sonarr already has.
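A minimal sketch, with made-up numbers, of what I mean by opaque: the same parsed season/episode token resolves to two different tvdb episodes depending on which convention the release group happened to follow, and nothing in the name itself says which one applies.

```python
# Minimal sketch with made-up numbers: the same parsed "S02E05" token can
# point at two different tvdb episodes depending on whether the release
# group followed tvdb numbering or its own scene numbering.
parsed = {"season": 2, "episode": 5}

# Hypothetical scene -> tvdb map for one series (the kind of data thexem holds)
scene_to_tvdb = {(2, 5): (2, 7)}   # scene S02E05 is really tvdb S02E07

tvdb_interpretation  = (parsed["season"], parsed["episode"])    # (2, 5)
scene_interpretation = scene_to_tvdb.get(tvdb_interpretation)   # (2, 7)

# Nothing in the file name says which interpretation is correct; that
# knowledge lives with the release group, not in the release itself.
print(tvdb_interpretation, scene_interpretation)
```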

That’s completely wrong, and indicates you’re only considering the edge cases you’ve run into.
Whether a specific release group follows tvdb or its own mapping is exactly the knowledge that’s missing, and it requires a next gen Thexem to store that knowledge.
Some examples:

  • American Dad, p2p follows tvdb.
  • Star Wars Rebels, yfn follows their own numbering.
  • Spartacus: Gods of the Arena, maps to s00.
  • Bar Rescue, an even bigger mess even within scene.

I could whip up a few more examples but I’m not at home. It doesn’t matter either way; it’s not about these specific cases.
The gist is that the only way to fix this is with a next generation thexem that supports multiple concurrent mappings.
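To put “multiple concurrent mappings” in concrete terms, the knowledge being described would look roughly like this as data. The series and group names come from the examples above, but the structure and every number in it are invented for illustration.

```python
# Rough sketch of the kind of data a "next gen thexem" would need to store:
# one numbering map per release group (or a "follows tvdb" flag), rather
# than a single scene map per series. Structure and values are invented.
mappings = {
    "american dad": {
        "p2p":   "tvdb",                 # p2p groups follow tvdb as-is
        "scene": {(12, 3): (11, 3)},     # example scene offset (made up)
    },
    "star wars rebels": {
        "yfn":     {(1, 5): (1, 4)},     # group-specific numbering (made up)
        "default": "tvdb",
    },
}

def resolve(series: str, group: str, season: int, episode: int):
    """Return the tvdb numbering for a release from a given group."""
    series_maps = mappings.get(series, {})
    rule = series_maps.get(group, series_maps.get("default", "tvdb"))
    if rule == "tvdb":
        return season, episode
    return rule.get((season, episode), (season, episode))
```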

I see what you’re saying, but it doesn’t seem to hold up. I checked each of the series you listed.

For all of these series, I tried the following:

  • Click a show in the series list. The episode list that appears follows the correct (tvdb) format. This is the representation sonarr chooses to have us search for episodes with, and it also happens to be the correct one.
  • Click the manual search button on an episode to search for that episode. Several releases appear, and all of them map to the correct episode, even though some are named incorrectly (scene).

Whatever just happened, that’s the key! The proper release is now listed under the correct episode number regardless of the name of the release. Note that we haven’t actually downloaded anything yet or done any name remapping. By my understanding, we should already be done at this point. Every release in the list we just acquired is valid for that episode and can be renamed appropriately using the information we used to get to this list in the first place.

Then sonarr chooses to reject certain releases in that list because they don’t match the scene naming convention dictated by thexem, a convention that would then be remapped back to the very episode number we already searched for and found. That logic is insane to me given what we must have known to be able to perform the search and get the release list in the first place. Help me understand what I’m missing here! :slight_smile:

I’m just spitballing here, but I’m guessing that the mechanism by which we were able to get the correct episode is via site integration. The release site has done the work for us by categorizing their releases such that they fall under a specific episode number that matches up with our internal model of episode numbers (and matches thetvdb). That’s why all of the work is already done.
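For what it’s worth, a Newznab-style episode search is presumably what’s happening under the hood here. A minimal sketch, with a placeholder URL and API key, and assuming the indexer actually honours the season/ep parameters:

```python
# Minimal sketch of an episode search against a Newznab-style indexer,
# assuming it honours the season/ep parameters (URL and API key are
# placeholders). When the indexer filters on these, every result it
# returns is already tied to the episode that was asked for.
import requests

def search_episode(tvdb_id: int, season: int, episode: int) -> str:
    params = {
        "t": "tvsearch",
        "tvdbid": tvdb_id,
        "season": season,
        "ep": episode,
        "apikey": "YOUR_API_KEY",
    }
    resp = requests.get("https://indexer.example/api", params=params, timeout=30)
    resp.raise_for_status()
    return resp.text   # RSS/XML of releases the indexer says belong to this episode
```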

Note that this applies to episode releases, but for season packs, where the critical mapping is happening inside the release (rather than being the release itself), this logic may not hold up. That distinction may actually help explain the point I’m trying to get across.

Maybe they did the mapping for us, but maybe they suck and returned a bunch of random results that Sonarr can’t assume match the episode correctly. A prime example is an indexer that only supports title based searching and isn’t filtering on the season and episode number sent with the request: Sonarr ends up getting back a bunch of results that are part of the same series but are for completely different episodes. Is that a bit extreme? Maybe, but what if there is a mapping on thexem.de and Sonarr gets results for both the xem mapped and the TVDB numbering in that same scenario? By trusting that the indexer has remapped things correctly, Sonarr could end up grabbing the wrong result because it matches one of the numbering schemes Sonarr is aware of.

One issue that exists with the current system is that releases that match the TVDB numbering would be interpreted incorrectly and replace another episode. It’s bound to happen if it hasn’t already; it’s a symptom of the same problem.
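A tiny sketch of that symptom, with invented numbers: a release that already uses tvdb numbering gets run through the scene map anyway and comes out filed as a different episode.

```python
# Sketch of the symptom above, with made-up numbers: a release that already
# uses tvdb numbering gets run through the scene map anyway and ends up
# filed as a different episode.
scene_to_tvdb = {(12, 3): (11, 3)}   # invented mapping for the series

release = (12, 3)                    # the group actually followed tvdb here

assumed_tvdb = scene_to_tvdb.get(release, release)
print(assumed_tvdb)                  # (11, 3): the wrong episode gets replaced
```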

As Taloth said, this isn’t something we’re going to add to Sonarr; not because it couldn’t work, but because it becomes a messy solution that people need to implement on their own (and something we would need to support as they inevitably do it wrong), which goes against the point of using thexem or community based solutions (like scene name mapping) at all, where something can be fixed once and work for everyone.

While it sounds like we’re washing our hands of it because it’s not something we’re going to implement, it’s really because Sonarr isn’t the right place to fix it. We’ve been involved in the thexem v2 discussions; it’s something we want to see happen and will contribute to as best we can.

Yes, this happens now. There are correctly labeled releases that are totally consistent across every level (sonarr, thetvdb, the indexer, the release itself) that don’t get picked up because they don’t match the hidden-under-the-hood scene naming conventions in thexem.

Correct me if I’m wrong, but if we can’t trust the indexer to produce the correct result, then we’re screwed anyway, because we depend on the indexer to get the results. Sure, there are edge cases where an incorrect indexer gets lucky and just so happens to provide both correct and incorrect results together. That could then be filtered if thexemv2 existed and provided a release-specific mapping for every release on every supported indexer. That’s a lot of conditions that need to be met. And ultimately, we already place such huge faith in the correctness of the indexer that building something to solve the case where the indexer is wrong seems like a solution looking for a problem. What if the indexer doesn’t produce the correct result at all? If the indexer is a public free-for-all, and is so disorganized that it can’t organize its own data, why should we trust the contents of a release? If we can’t trust the indexer to curate, what do we do about fake releases that copy scene naming conventions and contain malware?

To me, it makes way more sense to leave that in the hands of the indexers, which, by my measure, are doing a bang-up job and work universally already. The devil’s advocate situation of non-curated indexers is a rabbit hole that I don’t think can really be solved.

It seems to me like we’re prioritizing a complex system to solve a non-existent problem (granted, a complex problem) at the cost of brokenness today, all because we’re choosing not to trust the indexers that we completely depend on regardless. Why not just trust them and leverage those communities to manage their data? What makes thexem and its community more capable than the trackers and their communities? Even when thexem v2 exists (and in the meantime, however long that is, we’re pretty stuck), is it realistic to expect that the thexem community is going to be more correct or outperform the tracker communities? They have all the benefits of thexem for moderating their releases, plus the benefit of timeliness: the release group knows which release it is creating and can provide that information to the tracker in advance for immediate correctness. And many trackers have their own releases, so even the idea that you would be saving labor by consolidating mappings seems overly optimistic. I’m sorry for rambling, but I guess I just want to make sure you’re understanding me. I think there is a lightbulb-worthy point or two in here.

We don’t trust them, that’s why results are parsed and processed.

This happens; no results means there is nothing to process. If it was an ID search that failed, then it will fall back to a title search, otherwise nothing is processed.

The contents aren’t blindly trusted: if there isn’t a video file, or the file has an unexpected extension, Sonarr won’t process it and it can be removed from the queue/handled manually.

Minimum size helps; if a release meets the requirements but there’s a junk file in there, it still won’t get processed.
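Roughly speaking, the two checks mentioned in the last couple of replies amount to something like this; the extension list and size threshold are illustrative only, not Sonarr’s actual values.

```python
# Rough illustration of the two sanity checks mentioned above: the file
# must look like a video and must not be suspiciously small. Extension
# list and threshold are illustrative, not Sonarr's actual values.
import os

VIDEO_EXTENSIONS = {".mkv", ".mp4", ".avi", ".m4v"}
MIN_SIZE_BYTES = 200 * 1024 * 1024   # e.g. 200 MB for an HD episode

def looks_importable(path: str) -> bool:
    ext = os.path.splitext(path)[1].lower()
    if ext not in VIDEO_EXTENSIONS:
        return False                  # junk / unexpected file type
    if os.path.getsize(path) < MIN_SIZE_BYTES:
        return False                  # too small to be a real episode
    return True
```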

There are also curated indexers that don’t fix the numbering, though.

For argument’s sake, let’s say we trust the indexer completely: whatever it returns in search results is correct. How would the many entry points that process a release function? RSS sync, release pushing, drone factory or blackhole imports. Blackhole/drone factory would be the most obvious flaw: Sonarr grabbed a release trusting the indexer’s data, it gets shipped to the download client, and when it comes time to import it looks at the numbering. Does it match it to thexem numbering or thetvdb? At this point the source is a scene source, so thexem makes sense, but if it was grabbed via a mapping to the TVDB numbering we have no idea, because we don’t know that the release grabbed is the same one being imported, not without guessing (somewhat educated, but still guessing).
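A short sketch of that blackhole/drone factory guess, with an invented mapping and simplified parsing: at import time there is only a file name, no record of how (or whether) the release was grabbed, so choosing between tvdb and scene numbering is exactly the educated guess described above.

```python
# Sketch of the blackhole/drone-factory problem described above: at import
# time there is only a file name, no record of how the release was grabbed,
# so choosing tvdb vs. scene numbering is a guess. Mapping and parsing are
# simplified placeholders.
import re

scene_to_tvdb = {(12, 3): (11, 3)}    # invented mapping for the series

def parse(filename: str):
    m = re.search(r"S(\d{2})E(\d{2})", filename, re.IGNORECASE)
    return (int(m.group(1)), int(m.group(2))) if m else None

def import_numbering(filename: str, assume_scene_source: bool):
    parsed = parse(filename)
    if parsed is None:
        return None
    # Without knowing whether the group followed tvdb or scene numbering,
    # this flag is an educated guess, not a fact.
    return scene_to_tvdb.get(parsed, parsed) if assume_scene_source else parsed

print(import_numbering("Show.S12E03.720p.WEB-DL.mkv", assume_scene_source=True))
print(import_numbering("Show.S12E03.720p.WEB-DL.mkv", assume_scene_source=False))
```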

While the current system isn’t perfect, it’s consistent. Inconsistency isn’t better; it makes things harder to troubleshoot and, for the majority of users, harder to understand.

I don’t think there is anything else to really say, we’re not looking to make changes.

Maybe I’m thinking too highly of the indexers and am just using a rare “good one”. I hope thexemv2 can wrangle this complexity. Maybe for my case it would be better to just rely on RSS rules with my indexer rather than sonarr, which is trying to tackle a broader issue that is already handled by my specific indexer. I really like the replace-with-higher-quality and hard link copying features; it will be hard to give those up.
