Carlos Javier Gil Bellosta

This counterexample saga reminds me of Lakatos's "Proofs and Refutations". You have a result that is "essentially true" but you can still find some "counterexamples" to it by conveniently stressing the "obvious setting" in which the result was originally formulated. Note in any case that although Euler has been "refuted", he is still credited for his original V - E + F = 2 formula.

I would only like to note that in the conception of probability of Jaynes, Keynes, and others, it makes no sense to talk about P(A). They all assume that probabilities do not happen in a void and that you are always "conditioning" on some previous knowledge, B. So they would always write P(A|B) where other authors/schools just write P(A).

What I find most shocking about all this exponential vs linear discussion is how easily it gets us trapped in a [necessarily false] dichotomy. As a mathematician, I am surprised that the alternative to an exponential curve is a line (why not a polynomial curve somewhere in between?).

The article mentions Lazard's levelized cost of energy report. I reverse engineered the spreadsheets on which the report results are based and put them online here in case somebody wants to recreate the scenarios, stress them differently or create new ones.

I believe that what Jaynes does is quite standard: start with a minimalistic set of axioms (or principles, or whatever) and work your way to the intuitive results later on. Euclidean geometry is just like that!

I just skimmed over the details of the proofs (and I am a mathematician by training!). I did not read Jaynes for such details. I just assume that if they were wrong, somebody would already have reported it. The meaty part is elsewhere.

I believe this entry could have been written in much more general terms, i.e., asking why one should use [Gaussian] approximations at all nowadays. There is one answer: to obtain general, asymptotic results. But in practice, given the current state of computers and statistical software, there is little point in using approximations, particularly since they do not work for small samples, such as the ones you mention. And, in practice, we need to deal with small samples as well.

The general advice would then be: if you need to model some random phenomenon, use the tools that allow you to model it best. If beta, Poisson, gamma, etc. distributions seem more adequate, just do not use normal approximations at all.
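To illustrate how far a normal approximation can drift for a small sample, here is a minimal sketch (my own example, not from the post) comparing an exact binomial tail probability with its normal approximation for n = 10:

```python
import math

def binom_tail(n: int, p: float, k: int) -> float:
    """Exact P(X >= k) for X ~ Binomial(n, p), via math.comb."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(k, n + 1))

def normal_tail(n: int, p: float, k: int) -> float:
    """Normal approximation to P(X >= k), with continuity correction."""
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    z = (k - 0.5 - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

exact = binom_tail(10, 0.5, 9)   # 11/1024, about 0.0107
approx = normal_tail(10, 0.5, 9)
```

For this tail event the approximation misses the exact value by a relative error of roughly 25%, which is the kind of small-sample breakdown the comment refers to.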

First, I want to dispute the statement that a 50% probability is uninformative. It can be very informative depending on the value of the outcomes. E.g., if I am analyzing transactions looking for fraud, a transaction with a 50% predicted probability of being fraudulent is "very informative": most fraudulent transactions may have fraud probabilities much, much lower than that.

Second, it is true that beliefs about probabilities need not be "sharp". The Bayesian approach to the problem (which is in fact the very problem that Bayes originally discussed!) would require you to provide a distribution over your "expected" (I want to avoid the terms "prior" or "subjective" explicitly here) probabilities. Such a distribution could be more or less concentrated. The beta distribution could be used to encode such uncertainty; actually, it is the canonical distribution for doing so. The question would remain how to operationalize it in a prediction market, particularly from the UX point of view.
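A minimal sketch of that idea (the function names are my own, purely illustrative): two beta distributions with the same 50% point estimate can encode very different degrees of confidence, controlled by the concentration α + β.

```python
def beta_from_mean(mean: float, concentration: float) -> tuple[float, float]:
    """Beta(alpha, beta) with the given mean; a larger `concentration`
    (alpha + beta) encodes a sharper belief."""
    alpha = mean * concentration
    return alpha, concentration - alpha

def beta_variance(alpha: float, beta: float) -> float:
    """Variance of a Beta(alpha, beta) distribution."""
    m = alpha / (alpha + beta)
    return m * (1 - m) / (alpha + beta + 1)

# Same 50% point estimate, very different sharpness of belief:
vague = beta_from_mean(0.5, 2)    # Beta(1, 1): uniform over [0, 1]
sharp = beta_from_mean(0.5, 200)  # Beta(100, 100): tightly peaked at 0.5
```

Both beliefs would be reported as "50%" in a prediction market, which is exactly the distinction a single number cannot convey.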

There are many things to say about this result by N. Taleb. To start with, a minor detail: I would have written $\hat{p} = I^{-1}_{1/2}(m+1, n - m)$, which is much more consistent with the fact that he is inverting the CDF.

He is inverting the CDF of a Beta distribution with parameters (m+1, n-m), which is the posterior in the Beta-Binomial model under a Beta(1, 0) prior (!!!), with no explanation at all! It would have made slightly more sense to use a Beta(1, 1) instead.

Note that all he does by selecting q = 1/2 is choosing as this "optimal estimate" the median of the Beta(m+1, n-m) distribution, i.e., the median of the posterior distribution.
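For integer parameters this estimate can be computed without special functions, since the regularized incomplete beta satisfies I_x(m+1, n-m) = P(Binomial(n, x) ≥ m+1); a sketch of the resulting CDF inversion by bisection (my own implementation, not Taleb's code):

```python
import math

def beta_cdf(x: float, m: int, n: int) -> float:
    """I_x(m+1, n-m) for integers 0 <= m < n, via the binomial-tail
    identity I_x(a, b) = P(Binomial(a+b-1, x) >= a)."""
    return sum(math.comb(n, j) * x**j * (1 - x)**(n - j)
               for j in range(m + 1, n + 1))

def taleb_estimate(m: int, n: int, q: float = 0.5) -> float:
    """Invert the CDF by bisection: the q-quantile of Beta(m+1, n-m).
    With q = 1/2 this is the posterior median discussed above."""
    lo, hi = 0.0, 1.0
    for _ in range(80):  # each step halves the bracket
        mid = (lo + hi) / 2
        if beta_cdf(mid, m, n) < q:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

Sanity check: with m = 0, n = 1 the posterior is Beta(1, 1) and the median comes out to exactly 1/2.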

Note that he completely ignores the base rate of 5%. Can he not make use of it at all? So, even better than a Beta(1, 1), I would have chosen the maximum-entropy distribution among the betas with mean .05, i.e., one with a large variance; in fact, Taleb complains that the Bayesian approach provides funny results with highly informative beta priors.

If I had been facing the problem, I would have inquired about the distribution of those historical records whose aggregate is the 5% average and used it as a prior to model this new doctor.
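A sketch of that approach, assuming the historical spread can be summarized by a mean and variance (the numbers below are hypothetical, chosen only to match the 5% base rate): fit a beta prior by the method of moments, then apply the conjugate Beta-Binomial update for the new doctor's record.

```python
def moment_match_beta(mean: float, var: float) -> tuple[float, float]:
    """Method-of-moments Beta fit to the mean/variance of historical rates.
    Requires var < mean * (1 - mean)."""
    common = mean * (1 - mean) / var - 1
    return mean * common, (1 - mean) * common

def update(alpha: float, beta: float, successes: int, trials: int):
    """Conjugate Beta-Binomial posterior after observing the new record."""
    return alpha + successes, beta + trials - successes

# Hypothetical historical spread around the 5% base rate:
a0, b0 = moment_match_beta(0.05, 0.002)
a1, b1 = update(a0, b0, successes=3, trials=20)
posterior_mean = a1 / (a1 + b1)
```

The posterior mean lands between the 5% base rate and the new doctor's raw 3/20 rate, which is exactly the pooling behavior a prior fitted to the historical records should give.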

All in all, I do not think Taleb wrote his best page on that day. But he has many other great ones to learn from!

This is a topic I have found myself thinking about a lot lately as well. I have found it useful to decompose a non-repeatable event (will X win the election?) into two parts: a combination of repeatable events and a "specific residual".

Let's start with a coin toss. It is, in a very Heraclitean sense, a one-time event, which we can decompose as a throw of an ideal coin plus a tiny, negligible "specific residual".

Let's go back to the problems in question. You could decompose them into combinations of events for which we have historical frequencies (how many times has an incumbent politician...? how many times has an election during an economic crisis...? how do the probabilities of winning an election relate to the polls three months before...?), plus a conceivably larger "specific residual" given the particularities of the question.

This approach is more useful than vague considerations about "probability is in your head" or "it just relates to information". It is actually how predictors work: decomposing the question into subquestions on which frequency considerations are easier to elicit, recombining them, and adding an extra layer of uncertainty on top.
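One way to sketch that recombination (entirely my own toy model, with made-up numbers): pool the component base rates in log-odds space, then mix the result toward 50% in proportion to the size of the "specific residual".

```python
import math

def combine(base_rates: list[float]) -> float:
    """Pool the evidence from each repeatable component in log-odds space."""
    logodds = [math.log(p / (1 - p)) for p in base_rates]
    avg = sum(logodds) / len(logodds)
    return 1 / (1 + math.exp(-avg))

def shrink(p: float, residual: float) -> float:
    """Mix in the 'specific residual': pull the estimate toward 50%
    in proportion to how unlike past cases this event is."""
    return (1 - residual) * p + residual * 0.5

# Hypothetical component frequencies for "will X win the election?":
components = [0.67, 0.30, 0.55]  # incumbency, crisis, polling base rates
estimate = shrink(combine(components), residual=0.2)
```

A residual of 0 treats the event as a pure combination of repeatable ones (the coin-toss case); a residual of 1 says the historical frequencies tell us nothing and we fall back to 50%.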

I would advise you to have a look at philosopher Joseph Heath's work. He has a book, "The Efficient Society", where, according to Wikipedia, "[h]e argues that Canada's successes as a nation are largely attributable to its commitment to efficiency as a value". In "Morality, Competition, and the Firm" he also discusses the central role of efficiency in ethics (business ethics and beyond).