riceissa

I am Issa Rice. https://issarice.com/

Comments

John Vervaeke

The EA Forum wiki has stubs for a bunch of people, including a somewhat detailed article on Carl Shulman. I wonder if you feel similarly unexcited about the articles there (if so, it seems good to discuss this with people working on the EA wiki as well), or if you have different policies for the two wikis.

Probability vs Likelihood

Ah ok, that makes sense. Thanks for clarifying!

Open & Welcome Thread – February 2021

It seems to already be on LW.

Edit: oops, looks like the essay was posted on LW in response to this comment.

Rationality Verification

I'm unable to apply this tag to posts (it doesn't show up when I search to add a tag).

Learn Bayes Nets!

For people who find this post in the future, Abram discussed several of the points in the bullet-point list above in Probability vs Likelihood.

Probability vs Likelihood

Regarding base-rate neglect, I've noticed that in some situations my mind seems to automatically do the correct thing. For example, if a car alarm or fire alarm goes off, I don't think "someone is stealing the car" or "there's a fire". L(theft|alarm) is high, but P(theft|alarm) is low, and my mind seems to naturally know this difference. So I suspect something more is going on here than just confusing probability and likelihood, though that may be part of the answer.
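
To make the gap concrete, here is a toy Bayes calculation. All of the specific rates below are made-up numbers I chose for illustration, not anything from the post:

```python
# Toy car-alarm example with assumed rates (not from the post).
p_theft = 0.001                 # prior: base rate of a theft at any given moment
p_alarm_given_theft = 0.95      # L(theft | alarm) = P(alarm | theft) -- high
p_alarm_given_no_theft = 0.01   # false-alarm rate

# Bayes' rule: P(theft | alarm) = P(alarm | theft) * P(theft) / P(alarm)
p_alarm = (p_alarm_given_theft * p_theft
           + p_alarm_given_no_theft * (1 - p_theft))
p_theft_given_alarm = p_alarm_given_theft * p_theft / p_alarm

print(f"L(theft | alarm) = {p_alarm_given_theft}")      # 0.95  -- high likelihood
print(f"P(theft | alarm) = {p_theft_given_alarm:.3f}")  # ~0.087 -- low probability
```

The likelihood is high because theft explains the alarm well, but the low base rate keeps the posterior probability low, which is the thing my mind seems to track automatically.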

Probability vs Likelihood

I understood all of the other examples, but this one confused me:

A scenario is likely if it explains the data well. For example, many conspiracy theories are very likely because they have an answer for every question: a powerful group is conspiring to cover up the truth, meaning that the evidence we see is exactly what they'd want us to see.

If the conspiracy theory really were very likely, then we should be updating on this to have a higher posterior probability on the conspiracy theory. But in almost all cases we don't actually believe the conspiracy theory is any more probable than when we started out. I think what's actually going on is the thing Eliezer talked about in Technical Explanation, where the conspiracy theory originally has its probability mass very spread out across different outcomes, but then, as soon as the actual outcome is known, it retroactively concentrates the probability mass on that outcome. So I want to say that the conspiracy theory is both unlikely (because it did not make an advance prediction) and improbable (very low prior combined with the unlikeliness). I'm curious if you agree with that or if I've misunderstood the example somehow.
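
Here is a toy version of the retroactive-concentration point, with outcomes and numbers that are purely my own assumptions:

```python
# Toy illustration of "spread out" vs. "advance prediction" hypotheses
# (all outcomes and probabilities are assumed for illustration).
outcomes = ["A", "B", "C", "D"]

# An honest conspiracy theory that "explains everything" must, in advance,
# spread its probability mass over every outcome it could explain.
honest_conspiracy = {o: 0.25 for o in outcomes}

# A theory making a specific advance prediction concentrates mass on one outcome.
specific_theory = {"A": 0.90, "B": 0.05, "C": 0.03, "D": 0.02}

observed = "A"
print(honest_conspiracy[observed])  # 0.25 -- modest likelihood for the actual data
print(specific_theory[observed])    # 0.90 -- high likelihood for the actual data

# The rhetorical move is to claim P(observed) ~ 1 only *after* seeing the data,
# which makes the conspiracy feel like it assigns high likelihood when it doesn't.
```

On this picture the conspiracy theory only looks likely because the probability mass is moved onto the observed outcome after the fact.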

John Vervaeke

Thanks, I like your rewrite and will post questions instead in the future.

I think I understand your concerns and agree with most of them. One thing that still feels "off" to me is this: given that there seems to be a lot of in-person-only discussion of "cutting edge" ideas and "inside scoop" material (which trickles out via venues like Twitter and random Facebook threads, and only much later gets written up as blog posts), how can people who primarily interact with the community online (such as me) keep up with it? I don't want to have to pay attention to everything that's out there on Twitter or Facebook; I would like a short document that gets to the point and links out to other things if I feel curious. (I'm willing to grant that my emotional experience might be rare, and that the typical user would instead feel alienated in just the way you describe.)
