You shouldn't do that based on lack of computing power. And if you aren't doing it based on lack of computing power, why involve randomness at all?

Well, it's partially because sampling-based approximate inference algorithms are massively faster than exact marginalization over large numbers of nuisance variables. It's also because sampling-based inference makes all the expectations behave correctly in the limit, while still yielding boundedly approximately correct reasoning even when computing power is very limited.
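A toy sketch of the speed claim (the model and test function here are invented for illustration): exact marginalization over n binary nuisance variables requires summing 2^n terms, while a Monte Carlo estimate averages over a fixed number of random draws, at a cost independent of 2^n.

```python
import random

# Estimating E[f(X)] where X is a vector of n independent biased coin
# flips (the "nuisance variables"). Exact marginalization enumerates
# all 2^n configurations; sampling just averages over random draws.

n = 16          # number of binary nuisance variables
p = 0.3         # P(X_i = 1) for each variable

def f(x):
    # arbitrary test function of the full configuration
    return sum(x) ** 2

def exact_expectation():
    # Sums over all 2^16 = 65536 configurations; 2^60 would be hopeless.
    total = 0.0
    for bits in range(2 ** n):
        x = [(bits >> i) & 1 for i in range(n)]
        ones = sum(x)
        prob = (p ** ones) * ((1 - p) ** (n - ones))
        total += prob * f(x)
    return total

def sampled_expectation(k=20000, seed=0):
    # Monte Carlo: k samples, regardless of how large 2^n is.
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(k):
        x = [1 if rng.random() < p else 0 for _ in range(n)]
        acc += f(x)
    return acc / k
```

With these numbers the exact answer is E[S^2] = Var(S) + E[S]^2 = 3.36 + 23.04 = 26.4, and the sampled estimate lands close to it with only 20,000 draws.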

So we beat the Mugging while still being able to have an unbounded utility function, because even in the limit, Mugging-level absurd possible worlds can dominate our decision-making only an overwhelmingly tiny fraction of the time: when the sample size exceeds the multiplicative inverse of their probability, which essentially never happens in practice.
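To make that concrete (all the numbers here are invented): a hypothesis with probability eps and an astronomically large payoff essentially never shows up in a sample of size k << 1/eps, so the sampled expected utility simply never sees it.

```python
import random

# A "mugging" hypothesis with probability eps promises a huge payoff;
# ordinary worlds pay out 1. With sample size k far below 1/eps, the
# mugging world is essentially never drawn, so the sampled expected
# utility ignores it.

eps = 1e-12            # probability of the mugging world
huge_payoff = 1e30     # its promised utility
normal_payoff = 1.0    # utility in ordinary worlds

def sampled_expected_utility(k, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(k):
        if rng.random() < eps:       # draws the mugging world
            total += huge_payoff
        else:
            total += normal_payoff
    return total / k
```

With k = 100,000 samples the estimate comes out to exactly the ordinary payoff; the mugging would only start to dominate once k approached 1/eps = 10^12 samples.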

Importance sampling wouldn't have you ignore Pascal's Muggings, though. At its most basic, 'sampling' is just a way of probabilistically computing an integral.
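A minimal sketch of that last point: plain Monte Carlo integration estimates an integral by averaging the integrand over random draws, and the sample mean converges to the integral by the law of large numbers.

```python
import random

# Monte Carlo integration: to compute the integral of f over [0, 1],
# draw uniform samples and average f over them.

def mc_integral(f, k=200000, seed=0):
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(k)) / k

# Example: the integral of x^2 over [0, 1] is 1/3.
estimate = mc_integral(lambda x: x * x)
```

Importance sampling refines this by drawing from a proposal distribution that concentrates samples where the integrand is large, then reweighting, which is exactly why it would not ignore a high-payoff region the way naive sampling does.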

Open thread, Dec. 15 - Dec. 21, 2014

by Gondolinian · 15th Dec 2014 · 309 comments

If it's worth saying, but not worth its own post (even in Discussion), then it goes here.

Previous Open Thread

Next Open Thread

Notes for future OT posters:

1. Please add the 'open_thread' tag.

2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)

3. Open Threads should be posted in Discussion, and not Main.

4. Open Threads should start on Monday, and end on Sunday.