
Created by Vladimir_Nesov

A likelihood ratio is the ratio of two probabilities. It is often used to compare two hypotheses or models to measure the relative strength of evidence supporting them.

In the odds form of Bayes' theorem, the likelihood ratio is the relative probability of observing B if hypothesis A is true, versus observing B if hypothesis ¬A is true. A Bayesian update can therefore be calculated by converting the prior probability to odds, multiplying by the likelihood ratio, and converting the posterior odds back to a probability. Knowing the absolute probabilities of observing the evidence is unnecessary; only how many times more likely it is under one hypothesis than under the other matters.
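The conversion steps above can be sketched in Python (the function names here are illustrative, not from the original):

```python
def to_odds(p):
    """Convert a probability to odds in favor: p / (1 - p)."""
    return p / (1 - p)

def to_probability(odds):
    """Convert odds in favor back to a probability: odds / (1 + odds)."""
    return odds / (1 + odds)

def bayes_update(prior, likelihood_ratio):
    """Posterior probability via the odds form of Bayes' theorem:
    posterior odds = prior odds * likelihood ratio."""
    return to_probability(to_odds(prior) * likelihood_ratio)

# Example: prior P(A) = 0.5, and B is three times as likely under A as under ¬A.
# Prior odds 1:1 become posterior odds 3:1, i.e. probability 0.75.
print(bayes_update(0.5, 3))  # 0.75
```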

If the likelihood ratio is known, Bayesian updates are faster and more intuitive to calculate using the odds form. For example, if you know that A being true makes the observation of B twice as likely as when ¬A is true, the update can be calculated by converting the prior to odds, multiplying by two, and converting back. Additionally, if the prior is low, probability and odds are approximately equal (p = 0.1 corresponds to odds of 0.111, and p = 0.01 to odds of 0.0101), so the posterior probability can be approximated by skipping the conversions and simply multiplying the prior probability by two.
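A quick check of this low-prior shortcut, assuming a prior of 0.01 and a likelihood ratio of 2 (values chosen for illustration):

```python
def to_odds(p):
    return p / (1 - p)

def to_prob(o):
    return o / (1 + o)

prior = 0.01   # low prior: P(A) = 1%
lr = 2         # B is twice as likely under A as under ¬A

exact = to_prob(to_odds(prior) * lr)   # full odds-form update
approx = prior * lr                    # shortcut: multiply probability directly

print(round(exact, 4))  # 0.0198
print(approx)           # 0.02
```

The exact posterior is about 0.0198 versus 0.02 from the shortcut, so for small priors the error is negligible; it grows as the prior (or the product) approaches 1.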

## Talk:Likelihood ratio

Wikipedia article

Wikipedia:likelihood ratio is overly broad, and it starts with "In the frequentist statistics method of statistical hypothesis testing, the likelihood ratio...". I think including a link there is misleading for this concept. Maybe there is a subsection or another article on Wikipedia that fits better. --Vladimir Nesov 20:41, 12 June 2009 (UTC)