Stephen

Eli: It seems like it would be much better to use the original name "relative state" rather than "many worlds". The word "many" suggests that the worlds can be counted. However, in standard QM we are usually talking about particles whizzing around in the continuum, which gives us an infinite-dimensional Hilbert space. Even if we restrict ourselves to Hilbert spaces of finite dimension, for example the states of some spins, naively counting worlds remains bogus, because the number of "worlds" (i.e. entries of the state vector) with nonzero amplitude depends entirely on the choice of basis. I suppose in a finite-dimensional Hilbert space we could make a sensible definition of world counting as follows: the answer to "how many worlds am I in?" is the rank of my reduced density matrix. However, this seems far removed from the main point of the "MWI".

Furthermore, the term "many worlds" does actually lead people astray in practice. In the posts, many people keep referring to counting the worlds in which something happens in order to assess probability. This is wrong. The probabilities arise from squaring amplitudes, not from counting. If the probabilities arose from counting, then in a finite-dimensional Hilbert space all the probabilities would be rational numbers. Standard QM does not have this property.
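The basis-dependence of naive world counting is easy to see numerically. A minimal sketch (the particular state and bases are just illustrative choices):

```python
import numpy as np

# A single qubit in the computational-basis state |0>.
state = np.array([1.0, 0.0])

# In the computational basis, only one component is nonzero:
# naive "world counting" says there is 1 world.
worlds_z = np.count_nonzero(~np.isclose(state, 0.0))

# Rewrite the *same* physical state in the Hadamard (|+>, |->) basis.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
state_x = H @ state  # components: (1/sqrt(2), 1/sqrt(2))

# Now both components are nonzero: the same state "counts" as 2 worlds.
worlds_x = np.count_nonzero(~np.isclose(state_x, 0.0))

print(worlds_z, worlds_x)  # 1 2
```

Nothing about the state changed; only the choice of basis did, yet the nonzero-amplitude count went from 1 to 2.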

Mitchell Porter: "There is no relativistic formulation of Many Worlds; you just trust that there is...You also haven't said anything about the one version of Many Worlds which does produce predictions - the version Gell-Mann favors, "consistent histories" - which has a distinctly different flavor to the "waves in configuration space" version."

I think you are mistaken. It seems to me that consistent histories is basically just many worlds seen from a different point of view: both are standard QM with no collapse. In consistent histories you look at things from the point of view of path integrals instead of a wave equation, but these are just two equivalent mathematical formalisms. Path integrals adapt more easily to the relativistic case, but it doesn't seem to me that the interpretational issues are any different. Secondly, I'm not sure what you mean when you say consistent histories "produces predictions." I'm pretty sure that consistent histories does not make any quantitative prediction that differs from standard quantum mechanics and quantum field theory.

"If you *didn't* know squared amplitudes corresponded to probability of experiencing a state, would you still be able to derive "nonunitary operator -> superpowers?""

Scott looks at a specific class of models where you assume that your state is a vector of amplitudes, and then you use a p-norm to get the corresponding probabilities. If you demand that the time evolutions be norm-preserving then you're stuck with permutations. If you allow non-norm-preserving time evolution, then you have to readjust the normalization before calculating the probabilities in order to make them add up to 1. This readjustment of the norm is nonlinear. It results in superpowers. The paper in pdf and other formats is here.
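The nonlinearity of that readjustment can be checked directly. A minimal sketch, with an arbitrary invertible, non-norm-preserving matrix standing in for the time evolution:

```python
import numpy as np

# A hypothetical non-norm-preserving (but invertible) "time evolution".
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])

def evolve_and_renormalize(v, p=2):
    """Apply A, then rescale so the p-norm is 1 again."""
    w = A @ v
    return w / np.linalg.norm(w, ord=p)

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# If the map were linear, f(a + b) would equal f(a) + f(b). It doesn't:
lhs = evolve_and_renormalize(a + b)          # ~ (0.894, 0.447)
rhs = evolve_and_renormalize(a) + evolve_and_renormalize(b)  # (1, 1)
print(np.allclose(lhs, rhs))  # False
```

The evolution A itself is linear; it is the rescaling step that breaks linearity, and that nonlinearity is what the superpowers come from.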

Psy-Kosh:

"Or did I completely and utterly misunderstand what you were trying to say?"

No, you are correctly interpreting me and noticing a gap in the reasoning of my preceding post. Sorry about that. I looked up Scott's paper again to see what he actually said. If, as you propose, you allow invertible but non-norm-preserving time evolutions and just readjust the norm afterwards, then you get FTL signalling, as well as obscene computational power. The paper is here.

I feel a twinge of guilt for having spoken of "ratios of amplitudes". It makes the proposal sound more specific and fully worked-out than it is. Let me just replace that phrase in my previous post with the vaguer notion of "relative amplitudes".

Psy-Kosh:

Good example with the Lorentz metric.

Invariance of norm under permutations seems a reasonable assumption for state spaces. On the other hand, I now realize the answer to my question about whether permutation invariance narrows things down to p-norms is no. A simple counterexample is a linear combination of two different p-norms.
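For concreteness, here is a quick numerical check of that counterexample (the specific vectors and weights are arbitrary):

```python
import numpy as np

def mixed_norm(v):
    """An equal mix of the 1-norm and the 2-norm: a convex combination
    of norms, hence itself a norm, and permutation-invariant, but not
    equal to any single p-norm."""
    return 0.5 * np.linalg.norm(v, 1) + 0.5 * np.linalg.norm(v, 2)

v = np.array([3.0, -1.0, 2.0])
perm = v[[2, 0, 1]]  # a permutation of the components

print(np.isclose(mixed_norm(v), mixed_norm(perm)))  # True
```

Positivity, homogeneity, and the triangle inequality all survive convex combination, so permutation invariance alone cannot single out the p-norms.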

I think there might be a good reason to think in terms of norm-preserving maps. Namely, suppose the norms can be anything, but the individual amplitudes don't matter; only their ratios do. That is, states are identified not with vectors in the Hilbert space but with rays in the Hilbert space. This is the way von Neumann formulated QM, and it is equivalent to the now more common norm=1 formulation. This also seems to be the formulation Eli was implicitly using in some of his previous posts.

The usual way to formulate QM these days is not to ignore the normalizations of the state vectors, but instead to decree that the norms must always have a certain value (specifically, 1). Then we can assign meaning to the individual amplitudes rather than only to their ratios. It seems likely to me that theories in which only the ratios of the "amplitudes" matter can generically be reformulated equivalently as theories with fixed norm. Thinking that only ratios matter seems like a more intuitive starting point.

"I will point out, though, that the question of how consciousness is bound to a particular branch (and thus why the Born rule works like it does) doesn't seem that much different from how consciousness is tied to a particular point in time or to a particular brain when the Spaghetti Monster can see all brains in all times and would have to be given extra information to know that my consciousness seems to be living in *this* particular brain at *this* particular time."

Agreed!

More generally, it seems to me that many objections people raise about the foundations of QM apply equally well to classical physics when you really think about it.

However, I think Eli's objection to the Born rule is different. The special weird thing about quantum mechanics as currently understood is that Born's rule seems to suggest that the binding of qualia is a separate rule in fundamental physics.

"Given the Born rule, it seems rather obvious, but the Born rule itself is what is currently appears to be suspiciously out of place. So, if that arises out of something more basic, then why the unitary rule in the first place?"

While not an answer, I know of a relevant comment. Suppose you assume that a theory is linear and preserves some norm. What norm might it be? Before addressing this, let's say what a norm is. In mathematics, a norm is defined to be a function on vectors that is zero only for the all-zeros vector, scales homogeneously (the norm of c times a vector is |c| times the norm of the vector), and obeys the triangle inequality: the norm of a+b is no more than the norm of a plus the norm of b. The functions satisfying these axioms seem to capture everything that we would intuitively regard as some sort of length or magnitude.

The Euclidean norm is obtained by summing the squares of the absolute values of the vector components and then taking the square root of the result. The other norms that arise in mathematics are usually of the type where you raise each of the absolute values of the vector components to some power p, sum them up, and then take the pth root. The corresponding norm is called the p-norm. (Does somebody know: are all the norms invariant under permutation of the indices p-norms?)

Scott Aaronson proved that for any p other than 1 or 2, the only norm-preserving linear transformations are the permutations of the components. If you choose the 1-norm, then the sum of the absolute values of the components is preserved, and the norm-preserving transformations correspond to the stochastic matrices. This is essentially probability theory. If you choose the 2-norm, then the Euclidean length of the vectors is preserved, and the allowed linear transformations correspond to the unitary matrices. This is essentially quantum mechanics. (Scott always hastens to add that his theorem about p-norms and permutations was probably known to mathematicians for a long time. The new part is the application to the foundations of QM.)
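The two familiar cases can be checked numerically. A small sketch (the particular random matrices are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-norm / stochastic case: a column-stochastic matrix maps
# probability vectors to probability vectors, preserving the 1-norm.
S = rng.random((3, 3))
S /= S.sum(axis=0)            # make each column sum to 1
p = rng.random(3)
p /= p.sum()                  # a probability vector (1-norm = 1)
print(np.isclose(np.linalg.norm(S @ p, 1), 1.0))  # True

# 2-norm / unitary case: Q is orthogonal (a real unitary), so it
# preserves the Euclidean length of any vector, not just positive ones.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
v = rng.standard_normal(3)
print(np.isclose(np.linalg.norm(Q @ v, 2), np.linalg.norm(v, 2)))  # True
```

Note the asymmetry: the stochastic case preserves the 1-norm only on nonnegative vectors (probability distributions), whereas unitaries preserve the 2-norm of every vector.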

Nick: I don't understand the connection to quantum mechanics.

The argument that I commonly see relating quantum mechanics to anthropic reasoning is deeply flawed. Some people seem to think that many worlds means there are many "branches" of the wavefunction and that we find ourselves in them with equal probability. In that case, they argue, we should expect to find ourselves in a disorderly universe. However, this is exactly what the Born rule (and experiment!) does not say. Rather, the Born rule says that we are likely to find ourselves only in states with large amplitude. Also, standard quantum mechanics allows the probabilities to fall on a continuum. They aren't arrived at by counting, so the whole concept of counting branches is not standard QM anyway.

(I don't know whether you hold this view, but it is a common misconception that should be addressed at some point anyway.)

I have also found Eliezer's series of posts worthwhile, and would like to thank him for writing them. They have improved my thinking on certain topics. I also do not object to his writing on quantum mechanics. First, I don't believe he has been wrong about any major point, and that fact trumps any considerations of his qualifications. Second, to a large extent his QM posts are about thought processes by which one can reach certain conclusions about quantum mechanics. Such cognitive science stuff is squarely within Eliezer's claimed area of expertise. The conclusions themselves are fairly mainstream. (As far as I can tell, among the physicists who have bothered to think about it, very few these days would claim that measurements are somehow special processes that collapse wavefunctions, in contrast to ordinary processes that do not. Whether they describe their beliefs using the term "many worlds" is another matter.)