<p>Eliezer's novella provides a vivid illustration of the danger of promoting what should have stayed an instrumental value to the status of a terminal value. Eliezer likes to refer to this all-too-common mistake as <a href="http://www.overcomingbias.com/2007/11/lost-purposes.html" rel="nofollo...(read more)
<p>Anna, it takes very little effort to rattle off a numerical probability -- and then most readers take away an impression (usually false) of precision of thought.</p>
<p>At the start of <i>Causality</i> Judea Pearl explains why humans (should and usually do) use "causal" concepts rather than "sta...(read more)
<p>Instead of describing my normative reasoning as guided by the criterion of <i>non-arbitrariness</i>, I prefer to describe it as guided by the criterion of minimizing or pessimizing algorithmic complexity. And that is a reply to steven's question right above: there is nothing unstable or logicall...(read more)
<p>Let me clarify that what horrifies me is the loss of potential. Once our space-time continuum becomes a bunch of supermassive black holes, it remains that way till the end of time. It is the condition of maximum physical entropy (according to Penrose). Suffering on the other hand is impermanen...(read more)
<blockquote>Richard, I'd take the black holes of course.</blockquote>
<p>As I expected. Much of what you (Eliezer) have written entails it, but it still gives me a shock, because piling as much ordinary matter as possible into supermassive black holes is <i>the</i> most evil end I have been able to imagine...(read more)
<p>Question for Eliezer. If the human race goes extinct without leaving any legacy, then according to you, any nonhuman intelligent agent that might come into existence will be unable to learn about morality?</p>
<p>If your answer is that the nonhuman agent might be able to learn about morality i...(read more)
Speaking of compressing down nicely, that is a nice and compressed description of humanism. Singularitarians, question humanism.
<blockquote>trying to distance ourselves from, control, or delete too much of ourselves - then having to undo it.</blockquote>
<p>I cannot recall ever trying to delete or even control a large part of myself, so no opinion there, but "distancing ourselves from ourselves" sounds a lot like developing...(read more)
s/werewolf/Easter bunny/ IMHO.
<blockquote>Did that make sense?</blockquote>
<p>Yes, and I can see why you would rather say it that way.</p>
<p>My theory is that most of those who believe quantum suicide is effective assign negative utility to suffering and also assign a negative utility to death, but knowing that they will con...(read more)