The ongoing transformation of quantum field theory

by Mitchell_Porter · 4 min read · 29th Dec 2012 · 19 comments

Physics
Personal Blog

Quantum field theory (QFT) is the basic framework of particle physics. Particles arise from the quantized energy levels of field oscillations; Feynman diagrams are the simple tool for approximating their interactions. The "standard model", the success of which is capped by the recent observation of a Higgs boson lookalike, is a quantum field theory.

But just like everything mathematical, quantum field theory has hidden depths. For the past decade, new pictures of the quantum scattering process (in which particles come together, interact, and then fly apart) have incrementally been developed, and they presage a transformation in the understanding of what a QFT describes.

At the center of this evolution is "N=4 super-Yang-Mills theory", the maximally supersymmetric QFT in four dimensions. I want to emphasize that from a standard QFT perspective, this theory contains nothing but scalar particles (like the Higgs), spin-1/2 fermions (like electrons or quarks), and spin-1 "gauge fields" (like photons and gluons). The ingredients aren't something alien to real physics. What distinguishes an N=4 theory is that the particle spectrum and the interactions are arranged so as to produce a highly extended form of supersymmetry, in which particles have multiple partners (so many LWers should be comfortable with the notion).

In 1997, Juan Maldacena discovered that the N=4 theory is equivalent to a type of string theory in a particular higher-dimensional space. In 2003, Edward Witten discovered that it is also equivalent to a different type of string theory in a supersymmetric version of Roger Penrose's twistor space. Those insights didn't come from nowhere: they explained algebraic facts that had been known for many years, and they have led to a still-accumulating stockpile of discoveries about the properties of the N=4 field theory.

What we can say is that the physical processes appearing in the theory can be understood as taking place in either of two dual space-time descriptions. Each space-time has its own version of a particular large symmetry, "superconformal symmetry", and the superconformal symmetry of one space-time is invisible in the other. And now it is becoming apparent that there is a third description, which does not involve space-time at all, in which both superconformal symmetries are manifest, but in which space-time locality and quantum unitarity are not "visible" - that is, they are not manifest in the equations that define the theory in this third picture.

I cannot provide an authoritative account of how the new picture works. But here is my impression. In the third picture, the scattering processes of the space-time picture become a complex of polytopes - higher-dimensional polyhedra, joined at their faces - and the quantum measure becomes the volume of these polyhedra. Where you previously had particles, you now just have the dimensions of the polytopes; and the fact that in general, an n-dimensional space doesn't have n special directions suggests to me that multi-particle entanglements can be something more fundamental than the separate particles that we resolve them into.
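
As a toy illustration of "the quantum measure becomes a volume" (this is not the actual N=4 construction, only the elementary geometry such constructions build on), here is how the volume of a simplex, the basic cell from which a complex of polytopes is glued together face-to-face, can be computed from its vertices:

```python
from fractions import Fraction

def determinant(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = Fraction(0)
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * determinant(minor)
    return total

def simplex_volume(vertices):
    """Volume of an n-simplex from its n+1 vertices, via
    vol = |det(v1 - v0, ..., vn - v0)| / n!."""
    v0 = vertices[0]
    n = len(v0)
    # Edge vectors from the first vertex, as exact rationals
    edges = [[Fraction(v[i]) - Fraction(v0[i]) for i in range(n)]
             for v in vertices[1:]]
    fact = 1
    for k in range(2, n + 1):
        fact *= k
    return abs(determinant(edges)) / fact

# Unit tetrahedron: volume should be 1/6
print(simplex_volume([(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]))  # 1/6
```

A higher-dimensional polytope can be triangulated into simplices and its volume obtained by summing theirs; the real constructions involve far more structure (Grassmannians, positivity conditions) than this sketch.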

It will be especially interesting to see whether this polytope combinatorics, which can give back the scattering probabilities calculated with Feynman diagrams in the usual picture, can work solely with ordinary probabilities. That was Penrose's objective, almost fifty years ago, when he developed the theory of "spin networks" as a new language for the angular momentum calculations of quantum theory; spin networks were a step towards the twistor variables now playing an essential role in these new developments. If the probability calculus of quantum mechanics can be obtained from conventional probability theory applied to these "structures" that may underlie familiar space-time, then that would mean that superposition does not need to be regarded as ontological.

I'm talking about this now because a group of researchers around Nima Arkani-Hamed, who are among the leaders in this area, this week released their first paper in a year. It's very new, and so arcane that, among physics bloggers, only Lubos Motl has talked about it.

This is still just one step in a journey. Not only does the paper focus on the N=4 theory - which is not the theory of the real world - but the results only apply to part of the N=4 theory, the so-called "planar" part, described by Feynman diagrams with a planar topology. (For an impressionistic glimpse of what might lie ahead, you could try this paper, whose author has been shouting from the wilderness for years that categorical knot theory is the missing piece of the puzzle.)

The N=4 theory is not reality, but the new perspective should generalize. Present-day calculations in QCD already employ truncated versions of the N=4 theory; and Arkani-Hamed et al. specifically mention another supersymmetric field theory (known as ABJM after the initials of its authors), a deformation of which is holographically dual to a theory-of-everything candidate from 1983.

When it comes to seeing reality in this new way, we still only have, at best, a fruitful chaos of ideas and possibilities. But the solid results - the mathematical equivalences - will continue to pile up, and the end product really ought to be nothing less than a new conception of how physics works.

Comments

Hmm, I wish someone would summarize this summary in language accessible to a physics PhD in an area other than string theory.

TL;DR: you can make a lot of maps between physically unrealistic theories. There are hopes that in one of these mappings, scattering might be easier to compute, or at least easier to comprehend. If this works, there are further hopes that it can be generalized to actual physical theories.

First, the summary begins by summarizing some dualities. If you take a toy model of particle physics with a whole lot of symmetries (super-symmetric Yang-Mills theory) in 3+1 dimensions, you can play some mathematical games. It turns out super-symmetric Yang-Mills without gravity is equivalent to a different 5-dimensional theory WITH gravity (a type of string theory).

Similarly, there is a duality between some "simplified" string theories (of a different type) and twistor theory. (Twistor theory maps the geometric objects in Minkowski space onto different geometric objects in 'twistor space', a space with a metric of (2,2) signature.)

Finally, the recent paper proposes a new dual structure, which like twistor theory maps the geometric objects in Minkowski space onto another sort of space. In this new space scattering events can be described by polytopes.

Of course, this is all probably worthless to phenomenologists and theorists who actually want to predict the results of particle experiments: super-symmetric Yang-Mills theory doesn't describe any actual physical system.

The twistor string gave rise to "BCFW recursion relations" for gauge theories that are now the basis of many practical calculations, notably to model QCD processes at the LHC, the background against which anything new will be detected.

The Grassmannian reformulation of gauge theory in the new paper is a continuation of that research program, and the authors expect it to be generally valid - see page 137, third paragraph.

The only calculations I've seen referenced in actual releases from the LHC are either parton-shower calculated backgrounds (pythia), leading-order backgrounds (Madgraph, CalcHEP), or at most NLO (MCFM, etc.). The automated NLO stuff will probably soon be done with BlackHat, which uses the standard unitarity method that Dixon and Kosower came up with to do the loops. So as far as experiments go, the BCFW relations aren't really used to do QCD backgrounds. Please point me to a reference, if I've missed it.

I left physics for greener pastures after my postdoc and have been working as a statistician for a few years now, but certainly in the first several years of the BCFW recursions, people weren't doing that much with them. A few fun results for pure gluon amplitudes that were difficult to integrate into the messy world of higher-order QCD calculations (how do you consistently parton-shower when your gluon processes are at all orders, and your quark processes are at NLO?), and that was about it.

The most practical use of BCFW that I have found is in arxiv:1010.3991; if you read pages 3 and 4 closely, you'll see that BCFW was used to construct N=4 amplitudes which were then transposed to QCD and used to calculate a "W + 4 jets" background. I take your point that, although there are theorists using BCFW to model LHC physics, including some from BlackHat, the LHC teams themselves still do their in-house calculations using other methods. Though I think of unitarity cuts as another part of the same big transformation as BCFW.

An article about the underlying math with a very accessible introduction (using lots of illustrative graphs) can be found on Arxiv:
"Scattering Amplitudes and the Positive Grassmannian" by N. Arkani-Hamed et al.: http://arxiv.org/pdf/1212.5605v1.pdf

Quantum mechanical amplitudes are almost exactly as mathematically simple as classical probabilities. (Despite the fact that we developed math while observing classical probabilities but not quantum amplitudes! That seems like pretty robust evidence.)
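
The parallel can be made concrete with a two-path toy example (the amplitude values below are arbitrary illustrative numbers): the two calculi differ only in whether you sum amplitudes before squaring, or sum the squared magnitudes directly.

```python
import cmath

# Two indistinguishable paths with complex amplitudes (hypothetical values)
a1 = 0.6 * cmath.exp(1j * 0.0)        # path 1, phase 0
a2 = 0.8 * cmath.exp(1j * cmath.pi)   # path 2, relative phase pi

# Quantum rule: add the amplitudes, then square the modulus
p_quantum = abs(a1 + a2) ** 2

# Classical rule: add the probabilities of the two paths
p_classical = abs(a1) ** 2 + abs(a2) ** 2

print(p_quantum)    # ~0.04: destructive interference
print(p_classical)  # ~1.0: no interference term
```

The difference between the two outputs is the interference term, which classical probability theory has no room for.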

A theory that gets rid of quantum mechanical amplitudes and replaces them by classical probability doesn't get simpler; it just gets more like human experience. We are already pretty certain that particle physics does obey quantum probability, and we see pretty well how humans' experience of classical probability emerges from the picture (modulo the hard problem of consciousness), so we know that humans' experience of classical probability is irrelevant as evidence about the underlying physical facts (as are the metaphysical intuitions derived from this experience).

(I mention these considerations both as evidence against the prospects of clean physical theories based on classical probability, and as arguments against the worldview according to which such theories are metaphysically attractive, which seems to run through your writing much more generally.)

At any rate, superposition already doesn't need to be regarded as fundamental, if you are happy to give up notions like "locality." There is room for interesting math here, but I don't see the metaphysical relevance.

At any rate, superposition already doesn't need to be regarded as fundamental, if you are happy to give up notions like "locality."

Locality is implicit in special relativity: in a sense, being happy to give up notions like "locality" is the same as being happy to ignore the way that all modern physical theories work (relativity and quantum field theory).

Quantum mechanical amplitudes are almost exactly as mathematically simple as classical probabilities.

Yes, but this is about physics: the big question isn't whether things are mathematically simple. The big question is why quantum amplitudes show up in our experiments only as classical probabilities. Answering that question would almost certainly revolutionize our understanding of physics.

Yes, my point was that giving up on locality is fairly ridiculous, but is necessary to get rid of superposition.

We understand why the evolution of classical systems is governed by classical probabilities---just churn through the quantum mechanics. Decoherence is very simple (e.g., if you make your reversible computation irreversible by creating heat, the computer looks like it is governed by classical probabilities). As to why we experience branches with high L2 mass, or why we experience anything at all, indeed that looks like a hard question (though I strenuously object to finding "high L2 mass" more surprising than "high probability," since that is quite clearly an artifact of human intuitions).

We understand why the evolution of classical systems is governed by classical probabilities---just churn through the quantum mechanics. Decoherence is very simple...

Decoherence isn't actually enough to show why quantum amplitudes show up as classical probabilities: if it were, the Born-amplitude problem in many worlds would be solved. You need assumptions to turn "the wavefunction looks like this" into "the wavefunction looks like this, so we expect to see result A with probability whatever." Decoherence tells us off-diagonal elements in the density matrix aren't likely to survive interaction with a larger system; that's not enough to connect to experimental values.
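
A minimal numerical sketch of this point, using a hypothetical exponential-decay toy model for a single qubit: decoherence drives the off-diagonal terms to zero while leaving the diagonal entries untouched, and nothing in the matrix itself tells you that those diagonal entries are to be read as experimental probabilities.

```python
import math

def decohered_density_matrix(p0, coherence, t, rate):
    """2x2 density matrix for a qubit whose off-diagonal ("coherence")
    terms decay exponentially while the populations stay fixed.
    Illustrative toy model, not a derivation of the Born rule."""
    c = coherence * math.exp(-rate * t)
    return [[p0, c],
            [c, 1.0 - p0]]

# Initially coherent superposition ...
rho_early = decohered_density_matrix(0.5, 0.5, t=0.0, rate=1.0)
# ... after interacting with the environment for a while
rho_late = decohered_density_matrix(0.5, 0.5, t=10.0, rate=1.0)

# Diagonal entries are unchanged; off-diagonal entries have decayed to ~0
print(rho_early)
print(rho_late)
```

The late-time matrix is diagonal, i.e. it looks like a classical probability distribution over the two basis states; the extra step of identifying those diagonal entries with observed frequencies is exactly the assumption at issue.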

You jump straight from decoherence to "experiencing branches" without defining what you mean by "branch."

So it's enough to establish a quasi-classical preferred basis, but you still have the Born Rule problem? But one is solved?

My point was that we have no uncertainty about any physical processes involved, only about why we experience what we do. We aren't uncertain about why classical computers are classical, or why L2-typical observers would experience classical probabilities.

The fact that you experience the outcomes of physical experiments at all requires explanation. I don't see why that explanation is easier if you use probabilities or counting measure (especially given an infinite universe) rather than amplitude. It seems like bad form to absorb confusion about the hard problem of consciousness into confusion about physics, given that I at least cannot imagine any physics that would resolve my confusion about consciousness.

I really like academic summary articles; I would love to see more of these in physics and math. I think there are a good number in cognitive science and occasionally philosophy or computer science or biotech, and it's one of the things I regularly enjoy reading here. Expanding the purview too much might be slightly off topic for Less Wrong; I don't know if people would mind. Learning lots about science seems reasonably connected to rationality, but most specifics will be less closely related.

Also, the link to Motl's blog is broken (it puts the Less Wrong domain in front).

Can anyone here recommend blogs or other sources for posts similar to this one? Not necessarily in physics - math, biology, etc. would be welcome too. I'd love to find more places that don't mind giving real details while still making some points that can be picked up by an audience that is generally well-informed but without huge domain knowledge.

Research Blogging is an aggregator for science blog posts that discuss peer-reviewed research in some detail. One downside is that it only collects blog posts that explicitly ping it, so it misses out bloggers who don't do this (like this economics paper summarizer).

[anonymous]

Physics is cool and all, but what has this got to do with rationality?

Mitchell's post is about new developments in the foundations of physics (i.e. what the foundation ontology of the physical world should be). This is very much relevant to Eliezer Yudkowsky's latest sequence on epistemology: for instance, if Mitchell Porter is right, we should be able to reconcile Eliezer's viewpoint that everything should in principle be reducible to physics with the traditional neo-positivist POV that everything should be reducible to experience. All things considered, this is a fairly big deal.

[anonymous]

That sounds sensible, but even on rereading, it is entirely unclear to me where the OP said anything about that. Maybe I just lack the physics grooves in my brain that would help me grok this, but I can't be the only one.

If that is in fact the point of this post, I think it could be vastly improved by saying at the outset "here's this new hypothetical physics that physicists are playing with that is interesting to us because X" and then discussing the implications rather than just the physics and math.

And if X really is something about reconciling physical reductionism with positivism, then I'm totally confused, because those things don't seem to need reconciling.

I think bogus is talking about this passage:

If the probability calculus of quantum mechanics can be obtained from conventional probability theory applied to these "structures" that may underlie familiar space-time, then that would mean that superposition does not need to be regarded as ontological.

which seems to be suggesting that the approach may eventually solve the puzzle of Born probabilities. I agree with you that it would have been nice to have this at the top of the post, if it's supposed to be the main point.