If you've downvoted this question, I'd be grateful if you could explain why; please don't take it below 0 without an explanation. I say this partly because it is a question, so it's important to me that it is actually seen! Further, I struggle to understand how a question like this could be objectionable.
While I didn't downvote it, I have a potential explanation: I think that the ability to acausally communicate with other universes is either absent[1] or contrary to most humans' intuitions. As far as I understand acausal trade (e.g. coordination in The True One-Shot Prisoner's Dilemma)[2], it rests on the assumption that the other participant will think like us once it actually encounters the dilemma.
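That assumption can be made concrete with a toy sketch (this is an illustration, not any particular decision theory's formalism; the payoff numbers and the `twin_agent` procedure are my own assumptions). An agent that knows its opponent runs the exact same decision procedure only needs to compare the outcomes where both sides act identically:

```python
# Payoffs for a one-shot prisoner's dilemma: (my payoff, their payoff).
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def twin_agent():
    """An agent that knows its opponent runs this exact decision procedure.

    Because both runs of the procedure must return the same action, the only
    reachable outcomes are (C, C) and (D, D); the agent picks the better one.
    """
    correlated_outcomes = {a: PAYOFFS[(a, a)][0] for a in ("C", "D")}
    return max(correlated_outcomes, key=correlated_outcomes.get)

assert twin_agent() == "C"  # coordination without any causal channel
```

The "thinks like us" assumption is doing all the work here: if the two decision procedures could diverge, the off-diagonal outcomes become reachable and the argument for cooperating collapses.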
Additionally, the line about "theorems which say that the more complex minds will always output the same information as the simpler ones, all else (including their inputs, which is to say there sense-data) being equal" reminds me of Yudkowsky's case against Universally Compelling Arguments.
However, @Wei Dai's updateless decision theory could end up prescribing various hard-to-endorse acausal deals; see, e.g., his case for the possibility of superastronomical waste.
Unlike the one-shot dilemma, the iterated dilemma likely lets agents coordinate through evolution alone, with no explicit reasoning required. I have prepared a draft on the issue.
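For what it's worth, the standard illustration of that point (a toy sketch in the spirit of Axelrod's tournaments; the strategies and payoff numbers below are my own assumptions, not anything from the draft) is that conditionally cooperative strategies out-earn unconditional defectors when paired with their own kind, so payoff-driven selection alone can push a population towards coordination:

```python
# Payoffs for one round of the prisoner's dilemma: (player A, player B).
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def play(strategy_a, strategy_b, rounds=100):
    """Play an iterated dilemma; each strategy sees the opponent's history."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)
        move_b = strategy_b(history_a)
        pa, pb = PAYOFFS[(move_a, move_b)]
        score_a += pa
        score_b += pb
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opponent: opponent[-1] if opponent else "C"
always_defect = lambda opponent: "D"

# A pair of tit-for-tat players earns more than a pair of defectors, so
# selection on payoff alone can favour conditional cooperation.
assert play(tit_for_tat, tit_for_tat)[0] > play(always_defect, always_defect)[0]
```

No agent here models the other's mind at all; the coordination emerges purely from repeated interaction and differential payoffs, which is what makes the iterated case disanalogous to the one-shot, acausal case.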
It's possible to imagine two separate sub-universes, causally isolated from one another, each containing a complex, conscious, intelligent creature whose mind consists of interacting spinor fields and potentials, as well as another, computationally simpler creature interacting with it.
For the purpose of this thought experiment, it's helpful to assume that physics is fundamentally continuous within these universes, and that, given the continuity of their physical substrate, these creatures operate as analogue computers.
Suppose that there exist theorems which say that the more complex minds will always output the same information as the simpler ones, all else (including their inputs, which is to say their sense-data) being equal. This is because the more complex creatures' minds turn out to be mathematically redundant in some way: their states are particular members of infinite, continuous equivalence classes which correspond to the states of the simpler creatures' minds. Perhaps there are gauge transformations which would modify the conscious experience of one of the complex creatures when applied to the potential inside its mind, but its output is invariant with respect to these transformations, because it depends only on the result of somehow differentiating this potential. Or maybe its mind consists of spinors, mathematical objects like vectors whose signs flip when they undergo a complete 360° rotation, and the relationship between its sensory inputs and its externally visible behaviours is not contingent on their overall sign, even though it can feel this sign. [1]
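Both kinds of redundancy can be sketched numerically (a minimal illustration under my own simplifying assumptions: a one-dimensional potential and an SU(2)-style rotation, nothing meant to model an actual mind). Shifting the potential by a constant leaves its derivative untouched, and a full 2π rotation flips a spinor's sign while leaving sign-insensitive observables unchanged:

```python
import numpy as np

# Gauge-style redundancy: the creature's output depends only on the
# derivative of the potential inside its mind, so adding a constant offset
# (a toy "gauge transformation") changes the internal state but not the output.
x = np.linspace(0.0, 1.0, 200)
potential = np.sin(2 * np.pi * x)
shifted = potential + 7.0                   # a different internal state...
output_a = np.gradient(potential, x)
output_b = np.gradient(shifted, x)
assert np.allclose(output_a, output_b)      # ...with identical visible output

# Spinor-style redundancy: a rotation through a full 2*pi angle acts on a
# two-component spinor as multiplication by -1, but any behaviour that
# depends only on squared amplitudes cannot see that sign.
theta = 2 * np.pi
rotation = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])
psi = np.array([0.6, 0.8])
rotated = rotation @ psi
assert np.allclose(rotated, -psi)           # the overall sign has flipped...
assert np.allclose(rotated ** 2, psi ** 2)  # ...but sign-blind observables agree
```

In both cases the "experienced" internal state ranges over an equivalence class (all constant offsets of the potential; both signs of the spinor) whose members are indistinguishable from outside, which is exactly the redundancy the hypothetical theorems exploit.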
We assume that each simpler creature exists in the sub-universe which does not contain its complex counterpart, but does contain the other, more complex creature.
By directly exchanging information with one another's counterparts within each sub-universe, is it reasonable to assert that each more complex creature acausally communicates with the other? Or are the simpler creatures communicating through the more complex ones? I would interpret this as communication, but am unsure which of these two things actually happens.
If you have read and agree with the post about Homomorphically encrypted consciousness and its implications, then, if I understand it correctly, you might be inclined to think that this is not possible. However, it seems likely that arbitrarily complex phenomena could play out within the degrees of freedom contained in the redundancy between different mathematical objects of this kind. To me, this suggests that these phenomena could themselves be conscious, which would necessarily make the conscious experiences of these minds differ.