I don't think so - I think Eliezer's just being sloppy here.
"God did a miracle" is supposed to be an example of something that [sounds simple in plain English](http://lesswrong.com/lw/o1/entropy_and_short_codes/) but is [actually complex](http://lesswrong.com/lw/jp/occams_razor/):
>One observes t...
Will this "Arbital 2.0" be an entirely unrelated microblogging platform, or are you simply re-branding Arbital 1.0 to focus on the microblogging features?
Off the top of my head: Fermat's Last Theorem, whether slavery is licit in the United States of America, and the origin of species.
>It's almost like having a third sex. In fact the winged males look far more like females than they look like wingless males.
That sounds like *exactly* the kind of situation Eliezer claims as the exception - the adaptation is present in the entire population, but only expressed in a subset based o...
>>Psy-Kosh: Hrm. I'd think "avoid destroying the world" itself to be an ethical injunction too.
>The problem is that this is phrased as an injunction over positive consequences. Deontology does better when it's closer to the action level and negative rather than positive.
>Imagine trying to gi...
Well, that and the differences in the setting/magic (there's no Free Transfiguration in canon, for instance, and the Mirror is different - there are fewer Mysterious Ancient Artefacts generally - and Horcruxes run on different mechanics ... stuff like that.)
And Voldemort is just inherently smarter ...
To be fair, we don't know when he wrote the note.
>I don't like the idea of it happening. But if it does, I can certainly disclaim responsibility since **it is by definition impossible that I can affect that situation** if it exists.
Actually, with our expanding universe you can get starships far enough away that the light from them will never rea...
Actually, they mention every so often that the Cold War turned hot in the Star Trek 'verse and society collapsed. They're descended from the civilization that rebuilt.
I'm no expert, but even Kurzweil - who, from past performance, is usually correct but over-optimistic by maybe five to ten years - doesn't expect us to pass the Turing Test until (*checks*) 2030, with full-on singularity hitting in 2045.
2020 is in *five years*. The kind of progress that would seem t...