Top posts
momom2
AIS student, self-proclaimed aspiring rationalist, very fond of game theory.
"The only good description is a self-referential description, just like this one."
Epistemic status: shower thoughts. I am currently going through the EA Introductory Course, and we discussed two arguments against longtermism which I have not seen elsewhere. The thought experiment goes like this: imagine you have toxic waste at hand, which you can process right now at the cost of 100 lives,...
TL;DR: A series of valid bounded arguments all supporting the same proposition can provide at most as much evidence as that proposition itself, so even if they look like they're piling up in favor of a stronger conclusion, they're only as good as the most central of them. Epistemic status:...
(Asking for a project which involves collecting data from the major AI companies.) Potential ideas found during a brainstorm: - Threshold on the number of parameters of the biggest model produced. - Threshold on the amount of money invested in AI research. - Threshold on revenue. We're looking for a criterion that is...
This document was made as part of my internship at EffiSciences. Thanks to everyone who helped review it, in particular Charbel-Raphaël Segerie, Léo Dana, Jonathan Claybrough and Florent Berthet! Introduction Clarifying AI X-risk is a summary of a literature review of AI risk models by DeepMind that (among other things)...
In Hero Licensing, Eliezer Yudkowsky states that in 2010, he would have given himself a 10% chance of HPMoR being very successful. He then goes on to explain why he thought that instead of something lower, but I don't understand why he thought that instead of something higher: given that HPMoR...
In "Adaptation-Executers, not Fitness-Maximisers", Eliezer Yudkowsky writes: "Fifty thousand years ago, the taste buds of Homo sapiens directed their bearers to the scarcest, most critical food resources—sugar and fat. Calories, in a word. Today, the context of a taste bud's function has changed, but the taste buds themselves have not....
When searching for ideas and arguments for a post I intend to write on theism, I came upon a comment by Scott Alexander who claimed that many theists would change their mind if you could convince them on a gut-level that there could exist a godless moral world. I introspected...