EDIT 17 Nov 2022: Retracted, after someone reminded me that "both" is not merely an option, but one with at least some precedent. Oops.

The following is just here for historical purposes now:


Context: In a recent interview with Kelsey Piper, Sam Bankman-Fried was asked if his "ethics stuff" was a front for something else:

[Kelsey:] you were really good at talking about ethics, for someone who kind of saw it all as a game with winners and losers

[SBF:] ya ... I had to be. it's what reputations are made of, to some extent. I feel bad for those who get fucked by it, by this dumb game we woke westerners play where we say all the right shiboleths and so everyone likes us

One commenter, Eli Barrish, asked the question I'm now re-asking, to open a discussion:

The "ethics is a front" stuff: is SBF saying naive utilitarianism is true and his past messaging amounted to a noble lie? Or is he saying ethics in general (including his involvement in EA) was a front to "win" and make money? Sorry if this is super obvious, I just see people commenting with both interpretations. To me it seems like he's saying Option A (noble lie).

Let me be clear: this is an unusually important question that we should very much try to get an accurate, precise answer to.

EA as a movement is soul-searching right now, and we're trying to figure out how to prevent this, or something similar-but-worse, from happening again. We need to make changes, but which changes to make is still unknown.

To determine which changes to make, we need to figure out if this situation was: A. "naive utilitarian went too far", or B. "sociopath using EA to reputation-launder".

Both are extremely bad. But they require different corrections, lest we correct the wrong things (and/or neglect to correct the right things).

Note: I'm not using "sociopath" in the clinical sense (I haven't checked it against that usage), but more as the colloquial term for someone who is chronically incapable of empathy, of caring about others at the level of "feeling sad when they feel sad".

Comments (4):

Why not both?

I have wondered if that's true, and if so, whether it's more like a merge or more like a Jekyll/Hyde thing.

[epistemic status: amusing (to me) exploration. Not perfectly serious, but I don't completely disavow it either.]

The naivety in simplistic Utilitarian assumptions is quite compatible with sociopathy. The whole point is that the future is so much larger than the present that small changes in future probabilities are worth large sacrifices in current experience. This makes it VERY easy to convince yourself that your instrumental power increase benefits trillions of future humans, justifying 'most any behavior you want.

Part of what utilitarianism is about is that you are not supposed to make decisions because you feel sad when others feel sad; instead of following emotions, you are supposed to make rational calculations.

If your point is "most humans who are utilitarians don't really follow through on rational calculation but let emotions drive them, so there's no problem", it would make sense to make that more explicit.

There's no difference between applying the principles of utilitarianism by the book and being a sociopath in your definition.

Let's say you have the scenario: "There's one person who's alive and suffering; you can either help them or help 1 trillion future people, and you don't feel any emotional empathy toward those 1 trillion future people." Who should you help? Normal people decide to help the one person who's alive and suffering, because they have empathy for them. The longtermist pitch, on the other hand, suggests that this is wrong: future people should be valued just as much as present people, and people should not let themselves be blinded by that empathy.