Are imaginary and complex numbers of decibans meaningful?


After these first two responses, I'm tempted to write something like this: "While complex numbers are theoretically possible here, unless quantum superpositions and eigenstates are important to what is being asserted, the imaginary component will be so small as to be negligible, and should be discarded."

No, in accordance with whatchamacallit's law.

If you end up with complex probabilities, you won't be able to plug them into an expected utility formula to get a preference ordering. This has always been the knockdown argument for quantitatively scaled real-number subjective probabilities in my book. Even if underlying physics turns out to use complex-numbered reality fluid, I don't see how I can make choices if my degree of anticipation for something happening to me is not a real number - I don't know of any complex analogue of the von Neumann-Morgenstern theorem which yields actual decision outputs.
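To see the problem concretely, here is a minimal Python sketch (the belief and utility numbers are invented for illustration): a complex "degree of belief" produces a complex expected utility, and complex numbers admit no total order, so no preference ordering falls out.

```python
# Hypothetical complex "degree of belief" and made-up utilities,
# purely to illustrate that complex expected utilities can't be ranked.
p_rain = 0.3 + 0.4j
u_with_umbrella = 5.0   # utility if you carry an umbrella and it rains or not
u_without = 2.0         # utility if you don't

# Expected utility of carrying an umbrella, computed the usual way:
eu = p_rain * u_with_umbrella + (1 - p_rain) * u_without

# Complex numbers have no total order, so no preference ordering exists:
try:
    eu > u_without
    comparable = True
except TypeError:
    comparable = False   # Python, like mathematics, refuses the comparison
```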

After the first dozen responses, I'm currently thinking of writing something along the lines of: "While the unusual math of noncommutative probabilities allows for complex probabilities, which have applications in quantum superpositions and eigenstates, there is little likelihood of any practical application involving (the protocol). A (protocol) statement may be written with a complex number for its confidence, but a (protocol) reader or interpreter need only concern itself with the real portion of that number."

Either that, or just stating 'real numbers only'.
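A reader-side rule like the one sketched above could be as simple as this hypothetical Python helper (the function name and the use of Python's complex type are my own illustration, not part of any spec):

```python
def read_confidence(value):
    """Return the real-valued confidence in decibans for a statement field.

    Hypothetical reader behavior: if a writer supplies a complex
    confidence, silently discard the imaginary component.
    """
    if isinstance(value, complex):
        return value.real
    return float(value)
```

For example, `read_confidence(10 + 20j)` and `read_confidence(10)` would both yield `10.0`.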

(PS: I've never written anything which has even a chance at being an Internet Draft, let alone an RFC; but if the tag: URI made it in, nym: just might pass muster, too - and I would welcome any and all advice.)

A little while back, Behavioral and Brain Sciences showcased a paper arguing that quantum probabilities might be useful for cognitive modeling. I still haven't read more than a bit of the beginning, but it being in BBS means that it can't be entirely bad.

Do note that they explicitly do **not** suggest that quantum mechanics would have anything to do with the way the brain functions:

We note that this article is not about the application of quantum physics to brain physiology. This is a controversial issue (Hammeroff 2007; Litt et al. 2006) about which we are agnostic. Rather, we are interested in QP theory as a mathematical framework for cognitive modeling. QP theory is potentially relevant in any behavioral situation that involves uncertainty. For example, Moore (2002) reported that the likelihood of a “yes” response to the questions “Is Gore honest?” and “Is Clinton honest?” depends on the relative order of the questions. We will subsequently discuss how QP principles can provide a simple and intuitive account for this and a range of other findings.

Rather, they take the general mathematical framework of quantum probability and apply it to cognitive phenomena:

But what are the features of quantum theory that make it a promising framework for understanding cognition? It seems essential to address this question before expecting readers to invest the time for understanding the (relatively) new mathematics of QP theory.

Superposition, entanglement, incompatibility, and interference are all related aspects of QP theory, which endow it with a unique character. Consider a cognitive system, which concerns the cognitive representation of some information about the world (e.g., the story about the hypothetical Linda, used in Tversky and Kahneman’s [1983] famous experiment; sect. 3.1 in this article). Questions posed to such systems (“Is Linda feminist?”) can have different outcomes (e.g.,“Yes, Linda is feminist”). Superposition has to do with the nature of uncertainty about question outcomes. The classical notion of uncertainty concerns our lack of knowledge about the state of the system that determines question outcomes. In QP theory, there is a deeper notion of uncertainty that arises when a cognitive system is in a superposition among different possible outcomes. Such a state is not consistent with any single possible outcome (that this is the case is not obvious; this remarkable property follows from the Kochen–Specker theorem). Rather, there is a potentiality (Isham 1989, p. 153) for different possible outcomes, and if the cognitive system evolves in time, so does the potentiality for each possibility. In quantum physics, superposition appears puzzling: what does it mean for a particle to have a potentiality for different positions, without it actually existing at any particular position? By contrast, in psychology, superposition appears an intuitive way to characterize the fuzziness (the conflict, ambiguity, and ambivalence) of everyday thought.

Entanglement concerns the compositionality of complex cognitive systems. QP theory allows the specification of entangled systems for which it is not possible to specify a joint probability distribution from the probability distributions of the constituent parts. In other words, in entangled composite systems, a change in one constituent part of the system necessitates changes in another part. This can lead to interdependencies among the constituent parts not possible in classical theory, and surprising predictions, especially when the parts are spatially or temporally separated.

In quantum theory, there is a fundamental distinction between *compatible* and *incompatible* questions for a cognitive system. Note that the terms compatible and incompatible have a specific, technical meaning in QP theory, which should not be confused with their lay use in language. If two questions, A and B, about a system are compatible, it is always possible to define the conjunction between A and B. In classical systems, it is assumed by default that all questions are compatible. Therefore, for example, the conjunctive question “are A and B true” always has a yes or no answer and the order between questions A and B in the conjunction does not matter. By contrast, in QP theory, if two questions A and B are incompatible, it is impossible to define a single question regarding their conjunction. This is because an answer to question A implies a superposition state regarding question B (e.g., if A is true at a time point, then B can be neither true nor false at the same time point). Instead, QP defines conjunction between incompatible questions in a sequential way, such as “A and then B.” Crucially, the outcome of question A can affect the consideration of question B, so that interference and order effects can arise. This is a novel way to think of probability, and one that is key to some of the most puzzling predictions of quantum physics. For example, knowledge of the position of a particle imposes uncertainty on its momentum. However, incompatibility may make more sense when considering cognitive systems and, in fact, it was first introduced in psychology. The physicist Niels Bohr borrowed the notion of incompatibility from the work of William James. For example, answering one attitude question can interfere with answers to subsequent questions (if they are incompatible), so that their relative order becomes important. Human judgment and preference often display order and context effects, and we shall argue that in such cases quantum theory provides a natural explanation of cognitive process.
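As a toy illustration of incompatibility (my own sketch, not from the paper): model two yes/no questions as projectors in a two-dimensional space. When the projectors don't commute, the probability of "yes to A, then yes to B" differs from "yes to B, then yes to A" — an order effect of exactly the kind the Gore/Clinton example describes.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])            # "yes to A" basis state
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)      # "yes to B" state, at 45 degrees

P_A = np.outer(ket0, ket0)             # projector for "yes" to question A
P_B = np.outer(plus, plus)             # projector for "yes" to question B

psi = ket1                             # initial state: definitely "no" to A

# Sequential "yes" probabilities via the Born rule (squared norms):
p_A_then_B = np.linalg.norm(P_B @ P_A @ psi) ** 2   # 0.0
p_B_then_A = np.linalg.norm(P_A @ P_B @ psi) ** 2   # 0.25
```

Asking A first annihilates the state (it was definitely "no" to A), while asking B first rotates it into a state that can then answer "yes" to A a quarter of the time.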

Taking the log moves nonzero complex numbers to complex numbers, so it makes exactly as much sense as expecting to observe complex odds.

Odds are the ratios of outcomes. How might you get complex outcomes like that?

Well, if the outcomes are quantum states, you could get something like 0.5 |a> + 0.5(1+i) |b> - 0.5|c> in which case the odds of each outcome are 1:2:1 if you make them decohere right then. But let's suppose you don't decohere them and work with the amplitudes.
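As a quick check of those numbers (variable names are my own): the 1:2:1 odds come from the Born rule, i.e. the squared magnitudes of the amplitudes.

```python
# Amplitudes from the example state 0.5|a> + 0.5(1+i)|b> - 0.5|c>
amps = {"a": 0.5, "b": 0.5 * (1 + 1j), "c": -0.5}

# Born rule: probability of each outcome on decoherence is |amplitude|^2
probs = {k: abs(v) ** 2 for k, v in amps.items()}
# a -> 0.25, b -> 0.5, c -> 0.25 (up to rounding), i.e. odds 1:2:1
```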

What would taking the logarithm do to these things?

If your states are energy eigenstates, taking the log has the interesting effect of extracting the time dependence, which would manifest as an oscillation between various cases of interference between the states.

If the states are angular momentum eigenstates, taking the log would extract the angle dependence of the outcome. What this means depends more on your context than the first case. Also, you're not going to be able to predict the time dependence from this (the combined state might already be an energy eigenstate, which would help).

And so on, with complementary pairs of observables. If they're eigenstates of one observable operator in a pair of complementary observables, this will encode some information about the dependence on the complementary operator.

If your choice of states doesn't imply (as far as you can tell) a complementary operator, you're out of luck.

In short:

The real part of your deciban figure would be the log of the amplitude. That acts pretty much like you're used to it acting for normal real quantities. Just throw in a correction factor of 2 to use the squared amplitude instead of the amplitude.

To make any use of that imaginary part, you'd need to know a lot of details about the exact definitions of your states and do a lot of math. If you don't, then you just throw out the imaginary part of that log and use the real part.
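In Python's `cmath`, this split falls out directly: the complex log of an amplitude has the log-magnitude as its real part and the phase as its imaginary part. A sketch of the summary above, reusing the earlier amplitude, with the factor of 2 for squared amplitudes:

```python
import cmath
import math

amp = 0.5 * (1 + 1j)            # amplitude for |b> from the earlier example

log_amp = cmath.log(amp)
log_magnitude = log_amp.real    # == math.log(abs(amp))
phase = log_amp.imag            # == cmath.phase(amp); here pi/4

# Correction factor of 2: work with the squared amplitude (a probability)
log_prob = 2 * log_magnitude    # == math.log(abs(amp) ** 2) == log(0.5)
```

Throwing out the imaginary part means discarding `phase` and keeping only `log_magnitude` (times 2).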

x decibans means 10^(x/10):1 odds. By that definition, a+bi decibans means 10^((a+bi)/10):1 = 10^(a/10)·10^(bi/10):1 = 10^(a/10)·e^(i b ln(10)/10):1 = 10^(a/10)·(cos(b ln(10)/10) + i sin(b ln(10)/10)):1. That's just a complex number, unless b is a multiple of 20π/ln(10), in which case it's the same as just using the real part. A non-real deciban is no more meaningful than a non-real probability.
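The same conversion in Python, extending the 10^(x/10):1 definition to complex x (the `period` constant is my own name for the point at which the imaginary part stops mattering):

```python
import cmath
import math

def decibans_to_odds(x):
    """Odds-for ratio implied by x decibans; x may be complex."""
    return 10 ** (x / 10)

# An imaginary component rotates the odds ratio in the complex plane.
# It cancels out exactly when it is a multiple of 20*pi/ln(10):
period = 20 * math.pi / math.log(10)

same = cmath.isclose(decibans_to_odds(10 + period * 1j),
                     decibans_to_odds(10))          # True: full rotation
different = cmath.isclose(decibans_to_odds(10 + 5j),
                          decibans_to_odds(10))     # False: partial rotation
```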

Beyond the complex numbers are a whole host of fun number systems, the 'hypercomplex' numbers, of which the most popular seem to be the quaternions (with four dimensions) and octonions (with eight).

It seemed a good idea to ask about just ordinary, two-dimensional complex numbers, and find out if there was any use for them in this context, before inquiring about anything more complicated. :)

It's well-established that 0 decibans means 1:1 odds or 50% confidence; that 10 decibans means 10:1 odds; that -10 decibans means 1:10 odds; and that fractional numbers of decibans have similar meaning.
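For reference, those real-number conversions in Python (function names are my own):

```python
import math

def decibans_to_odds(db):
    """Odds-for ratio implied by db decibans."""
    return 10 ** (db / 10)

def decibans_to_confidence(db):
    """Probability implied by db decibans: p = odds / (1 + odds)."""
    odds = decibans_to_odds(db)
    return odds / (1 + odds)

# 0 db  -> 1:1 odds, 50% confidence
# 10 db -> 10:1 odds; -10 db -> 1:10 odds
```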

Does it make sense to talk about "i decibans", or "10 + 20i decibans"? If so, what does that actually mean?

I'm currently roughing out what may eventually become a formal specification for a protocol. It includes a numerical field for a level of confidence, measured in decibans. I'd like to know if I should simply define the spec as only allowing real numbers, or if there could be some purpose in allowing for complex numbers, as well.