Review article on Bayesian inference in physics

by dvasya · 1 min read · 19th Sep 2011 · 2 comments

9

Personal Blog

A nice article just appeared in Reviews of Modern Physics. It offers brief coverage of the fundamentals of Bayesian probability theory, practical numerical techniques, a diverse collection of real-world applications of Bayesian methods to data analysis, and even a section on Bayesian experimental design. The PDF is available here.

The abstract:

Rev. Mod. Phys. 83, 943–999 (2011)

Bayesian inference in physics

Udo von Toussaint
Max-Planck-Institute for Plasmaphysics, Boltzmannstrasse 2, 85748 Garching, Germany

Received 8 December 2009; published 19 September 2011

Bayesian inference provides a consistent method for the extraction of information from physics experiments even in ill-conditioned circumstances. The approach provides a unified rationale for data analysis, which both justifies many of the commonly used analysis procedures and reveals some of the implicit underlying assumptions. This review summarizes the general ideas of the Bayesian probability theory with emphasis on the application to the evaluation of experimental data. As case studies for Bayesian parameter estimation techniques examples ranging from extra-solar planet detection to the deconvolution of the apparatus functions for improving the energy resolution and change point estimation in time series are discussed. Special attention is paid to the numerical techniques suited for Bayesian analysis, with a focus on recent developments of Markov chain Monte Carlo algorithms for high-dimensional integration problems. Bayesian model comparison, the quantitative ranking of models for the explanation of a given data set, is illustrated with examples collected from cosmology, mass spectroscopy, and surface physics, covering problems such as background subtraction and automated outlier detection. Additionally the Bayesian inference techniques for the design and optimization of future experiments are introduced. Experiments, instead of being merely passive recording devices, can now be designed to adapt to measured data and to change the measurement strategy on the fly to maximize the information of an experiment. The applied key concepts and necessary numerical tools which provide the means of designing such inference chains and the crucial aspects of data fusion are summarized and some of the expected implications are highlighted.

© 2011 American Physical Society
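The model comparison the abstract describes — quantitatively ranking models by how well they explain a data set — comes down to comparing marginal likelihoods. Here is a toy sketch of that idea; the coin-flip setup and numbers are invented for illustration, not taken from the review:

```python
from math import comb

# Toy Bayesian model comparison: is a coin fair (M0) or biased with an
# unknown rate (M1)? Rank the models by their marginal likelihoods.
# (Made-up data: 17 heads in 20 flips.)
k, n = 17, 20

# M0: p fixed at 0.5, so the marginal likelihood is just the binomial mass.
ev_m0 = comb(n, k) * 0.5 ** n

# M1: uniform prior on p; integrate the likelihood over a grid of p values.
grid = [i / 1000 for i in range(1, 1000)]
ev_m1 = sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for p in grid) / len(grid)

# Bayes factor > 1 favors the biased-coin model.
bayes_factor = ev_m1 / ev_m0
```

Under a uniform prior the exact marginal likelihood for M1 is 1/(n+1), so the grid sum is easy to check; with 17 heads in 20 flips the Bayes factor comes out strongly in favor of the biased-coin model.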


2 comments
[anonymous] · 10y · 5 points

I haven't read it yet, but the abstract reminded me of something I was thinking about recently: Share Likelihoods, Not Posterior Beliefs really, really needs to get published somewhere mainstream.
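The "share likelihoods" point can be seen in a small sketch. Everything below (the grid prior, the flip counts) is made up for illustration, not taken from the post being linked:

```python
from math import comb

# Toy illustration of "share likelihoods, not posterior beliefs": two
# observers flip the same coin independently.
grid = [i / 100 for i in range(1, 100)]   # candidate biases p
prior = [1.0 / len(grid)] * len(grid)     # uniform prior over the grid

def binom_lik(p, k, n):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def normalize(w):
    s = sum(w)
    return [x / s for x in w]

lik_a = [binom_lik(p, 8, 10) for p in grid]  # A saw 8 heads in 10 flips
lik_b = [binom_lik(p, 2, 10) for p in grid]  # B saw 2 heads in 10 flips

# Sharing likelihoods: multiply both into one prior. This equals the
# posterior from the pooled data (10 heads in 20 flips), peaked at 0.5.
post_shared = normalize([pr * la * lb
                         for pr, la, lb in zip(prior, lik_a, lik_b)])

# Averaging each observer's posterior instead leaves a bimodal mixture
# with bumps near 0.8 and 0.2 and a dip at the pooled answer, 0.5.
post_a = normalize([pr * la for pr, la in zip(prior, lik_a)])
post_b = normalize([pr * lb for pr, lb in zip(prior, lik_b)])
post_avg = [(a + b) / 2 for a, b in zip(post_a, post_b)]

idx = grid.index(0.5)
```

The pooled (likelihood-sharing) posterior concentrates at the value both data sets jointly support, while the averaged posteriors keep each observer's mode and put little mass there — which is the basic reason likelihoods, not posteriors, are the right thing to pass around.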

Glad to see that Hamiltonian / Hybrid Monte Carlo is gaining more interest, though I guess it's supposed to have had more interest in physics than statistics for a while. Given how much better these algorithms scale, I think they should get a lot more attention. The description of parallel tempering was pretty good; I had heard it described before but hadn't gotten it, and now I get a sense for why it's exciting to people. The section on Reversible Jump MCMC made me realize I don't understand model selection problems very well at all.
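For anyone who hasn't seen parallel tempering in action, here's a rough sketch of the idea — chains at several temperatures with occasional state swaps, so the hot chains ferry the cold one between modes. The bimodal toy target, temperatures, and step sizes are my own choices, not from the review:

```python
import math
import random

random.seed(0)

def log_target(x):
    # Bimodal toy density: equal-weight mixture of Gaussians at -3 and +3,
    # computed with a log-sum-exp for numerical safety.
    a = -0.5 * (x - 3.0) ** 2
    b = -0.5 * (x + 3.0) ** 2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def parallel_tempering(n_steps=20000, temps=(1.0, 2.0, 4.0, 8.0), step=1.0):
    x = [0.0] * len(temps)  # one chain per temperature
    samples = []            # we keep samples only from the cold (T=1) chain
    for _ in range(n_steps):
        # Metropolis update within each chain, tempered by 1/T
        for k, T in enumerate(temps):
            prop = x[k] + random.gauss(0.0, step * math.sqrt(T))
            log_alpha = (log_target(prop) - log_target(x[k])) / T
            if math.log(random.random()) < log_alpha:
                x[k] = prop
        # Propose swapping states between a random adjacent temperature pair
        k = random.randrange(len(temps) - 1)
        log_swap = (1.0 / temps[k] - 1.0 / temps[k + 1]) * \
                   (log_target(x[k + 1]) - log_target(x[k]))
        if math.log(random.random()) < log_swap:
            x[k], x[k + 1] = x[k + 1], x[k]
        samples.append(x[0])
    return samples

samples = parallel_tempering()
# A plain Metropolis chain at T=1 would tend to get stuck in one mode;
# the tempered ladder lets the cold chain visit both.
frac_positive = sum(s > 0 for s in samples) / len(samples)
```

A single cold chain with this step size almost never crosses the low-density barrier at 0, so the fraction of time spent near each mode is the quickest check that the swaps are doing their job.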