Scott Alexander once wrote about the difference between "mistake theorists" who treat politics as an engineering discipline (a symmetrical collaboration in which everyone ultimately just wants the best ideas to win) and "conflict theorists" who treat politics as war (an asymmetrical conflict between sides with fundamentally different interests). Essentially, "[m]istake theorists naturally think conflict theorists are making a mistake"; "[c]onflict theorists naturally think mistake theorists are the enemy in their conflict."
More recently, Alexander considered the phenomenon of "bounded distrust": science and media authorities aren't completely honest, but are only willing to bend the truth so far, and can be trusted on the things they wouldn't lie about. Fox News wants to fuel xenophobia, but they wouldn't make up a terrorist attack out of whole cloth; liberal academics want to combat xenophobia, but they wouldn't outright fabricate crime statistics.
Alexander explains that savvy people who can figure out what kinds of dishonesty an authority will engage in end up mostly trusting the authority, whereas clueless people become more distrustful. Sufficiently savvy people end up inhabiting a mental universe in which the authority is trustworthy, as when Dan Quayle denied that characterizing tax increases as "revenue enhancements" constituted fooling the public, because "no one was fooled".
Alexander concludes with a characteristically mistake-theoretic plea for mutual understanding:
The savvy people need to realize that the clueless people aren't always paranoid, just less experienced than they are at dealing with a hostile environment that lies to them all the time.
And the clueless people need to realize that the savvy people aren't always gullible, just more optimistic about their ability to extract signal from same.
But "a hostile environment that lies to them all the time" is exactly the kind of situation where we would expect a conflict theory to be correct and mistake theories to be wrong! Or at least very incomplete. To speak as if the savvy merely have more skills to extract signal from a "naturally" occurring source of lies obscures the critical question of what all the lying is for.
In a paper on "the logic of indirect speech", Pinker, Nowak, and Lee give the example of a pulled-over motorist telling a police officer, "Gee, officer, is there some way we could take care of the ticket here?"
This is, of course, a bribery attempt. The reason the driver doesn't just say that ("Can I bribe you into not giving me a ticket?") is that the driver doesn't know whether this is a corrupt police officer who accepts bribes or an honest officer who will charge the driver with attempted bribery. The indirect language lets the driver communicate with the corrupt cop (in the possible world where this cop is corrupt) without being arrested by the honest cop, who doesn't think he can make an attempted-bribery charge stick in court on the evidence of such vague language (in the possible world where this cop is honest).
We need a conflict theory to understand this type of situation. Someone who assumed that all police officers had the same utility function would be fundamentally out of touch with reality: it's not that the corrupt cops are just "savvier", better able to "extract signal" from the driver's speech. The honest cops can probably do that, too. Rather, corrupt and honest cops are trying to do different things, and the driver's speech is optimized to help the corrupt cops in a way that honest cops can't interfere with (because the honest cops' objective requires working with a court system that is less savvy).
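The driver's choice can be sketched as a toy expected-utility calculation in the spirit of Pinker, Nowak, and Lee's model. (The payoffs and the prior below are illustrative assumptions for the sketch, not values from the paper.)

```python
# Toy model of the indirect-bribe logic. All numbers are
# illustrative assumptions, not values from Pinker, Nowak, and Lee.

P_CORRUPT = 0.3  # driver's prior that the officer is corrupt

# Payoffs to the driver, by (utterance, officer type):
#   a bribe accepted by a corrupt cop avoids the ticket       (+1)
#   a direct bribe to an honest cop means a bribery arrest   (-10)
#   an indirect bribe to an honest cop just gets the ticket   (-1)
#   saying nothing gets the ticket either way                 (-1)
PAYOFFS = {
    ("direct",   "corrupt"):   1,
    ("direct",   "honest"):  -10,
    ("indirect", "corrupt"):   1,
    ("indirect", "honest"):   -1,
    ("none",     "corrupt"):  -1,
    ("none",     "honest"):   -1,
}

def expected_payoff(utterance: str, p_corrupt: float = P_CORRUPT) -> float:
    """Driver's expected payoff for a given choice of utterance."""
    return (p_corrupt * PAYOFFS[(utterance, "corrupt")]
            + (1 - p_corrupt) * PAYOFFS[(utterance, "honest")])

for u in ("direct", "indirect", "none"):
    print(u, expected_payoff(u))
```

Under these (assumed) numbers, the indirect bribe beats both alternatives: it captures most of the upside of bribing the corrupt cop while capping the downside with the honest one, which is exactly why the vague phrasing is optimized for the asymmetric situation.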
This kind of analysis carries over to Alexander's discussion of government lies—maybe even isomorphically. When a government denies tax increases but announces "revenue enhancements", and supporters of the regime effortlessly know what they mean, while dissidents consider it a lie, it's not that regime supporters are just savvier. The dissidents can probably figure it out, too. Rather, regime supporters and dissidents are trying to do different things. Dissidents want to create common knowledge of the regime's shortcomings: in order to organize a revolt, it's not enough for everyone to hate the government; everyone has to know that everyone else hates the government in order to confidently act in unison, rather than fear being crushed as an individual. The regime's proclamations are optimized to communicate to its supporters in a way that doesn't give moral support to the dissident cause (because the dissidents' objective requires common knowledge, not just savvy individual knowledge, and common knowledge requires unobfuscated language).
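The role of common knowledge here can be illustrated with a toy coordination model. (Everything below—the population size, payoffs, and credences—is an illustrative assumption for the sketch, not anything from Alexander's post.)

```python
# Toy model of why dissidents need common knowledge, not just savvy
# individual knowledge. All parameters are illustrative assumptions.

N = 5  # dissidents; the revolt succeeds only if everyone joins

def will_revolt(p_others_join: float,
                gain: float = 10.0, punishment: float = 100.0) -> bool:
    """A lone dissident joins only if her expected value is positive.

    p_others_join: her credence that *all* the other dissidents join too;
    otherwise she is crushed as an individual and pays `punishment`."""
    return p_others_join * gain - (1 - p_others_join) * punishment > 0

# Savvy individual knowledge: each dissident has privately decoded
# "revenue enhancements", but can't be sure the others have and will act.
# Suppose she gives each of the other N-1 dissidents an 80% chance.
p_private = 0.8 ** (N - 1)
print(will_revolt(p_private))   # too risky to act alone

# Common knowledge: an unobfuscated public statement ("the regime raised
# taxes") means everyone knows that everyone knows, and so on.
p_common = 1.0
print(will_revolt(p_common))    # safe to act in unison
```

With these assumed numbers, private decoding isn't enough (0.8⁴ ≈ 0.41 confidence in unison leaves a negative expected value), while a public, unobfuscated statement flips the decision—which is the sense in which the regime's obfuscated language blocks the dissidents' objective without blocking anyone's individual comprehension.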
This kind of analysis is about behavior, information, and the incentives that shape them. Conscious subjectivity, or any awareness of the game dynamics, is irrelevant. In the minds of regime supporters, "no one was fooled", because if you were fooled, then you aren't anyone: failing to be complicit with the reigning Power's law would be as insane as trying to defy the law of gravity.
On the other side, if blindness to Power has the same input–output behavior as conscious service to Power, then opponents of the reigning Power have no reason to care about the distinction. In the same way, when a predator firefly sends the mating signal of its prey species, we consider it deception, even if the predator is acting on instinct and can't consciously "intend" to deceive.
Thus, supporters of the regime naturally think dissidents are making a mistake; dissidents naturally think regime supporters are the enemy in their conflict.
There's potentially an aspect of this dynamic that you're missing. Thinking an opponent is making a mistake is not the same thing as their not being your opponent (as you yourself quite rightly point out, people with the same terminal goals can still come into conflict over differences in beliefs about the best instrumental ways to attain them), and thinking someone is the enemy in a conflict is not the same thing as thinking that they aren't making mistakes.
To the extent that Mistake/Conflict Theory is pointing at a real and useful dichotomy, it's a difference in how deep the disagreement is believed to lie, rather than a binary between a world of purely good-faith allies who happen to be slightly confused and a world of pure evil monsters who do harm solely for harm's sake. And that means that in an interaction between dissidents and quislings, you probably will get the dynamic that Zack is pointing out.
Dissidents are likely to view the quislings as primarily motivated by a desire to gain personal benefits (or avoid personal costs) by siding with the regime, making the situation a matter of deliberate defection, aka Conflict Theory. Quislings are likely to view dissidents as misguided, or at least to claim to (the Regime is great! How could anyone oppose it unless they were terminally confused?), aka Mistake Theory. However, this Mistake Theory perspective is perfectly compatible with hating dissidents and inflicting all manner of violence on them. You might be interested in watching some interviews with pro-war Russians about the "Special Military Operation": a great many of them evince precisely this perspective, accusing Ukrainians of making insane mistakes and having no real interests opposed to Russia's (i.e., they don't view the war through Conflict Theory!), but if anything that makes them more willing to cheer on the killing of Ukrainians, not less. It's not a universal perspective among Putin's faithful, but it seems to be quite common.
The dynamic seems to be not that one side views the other with more charity ("oh, they're just honestly mistaken; they're still good people") but that one side views the other with more condescension ("oh, our enemies are stupid and ignorant as well as bad people").