Pyramid_Head3

Comments

Measuring Optimization Power

And there goes Caledonian making pointless arguments again... Couldn't you pick a more frivolous objection?

Psychic Powers

Eliezer, what if psi phenomena are real, but they work through as-yet-unknown laws of physics? In this case reductionism could still be true (and probable), even if psi is real. I can't really see why psi phenomena rule out a reductionist universe (and I guess Damien Broderick agrees...).

By the way, I don't believe in psi, and think that all effects found thus far are based on the misapplication of statistics and related errors.

Qualitative Strategies of Friendliness

Eliezer: ...I'm seriously starting to wonder if some people just lack the reflective gear required to abstract over their background frameworks

I'm pretty sure of it, since I've seen some otherwise smart people make this kind of mistake (and I'm even more perplexed since I outgrew it right after my teenage years...)

Qualitative Strategies of Friendliness

@Caledonian: So if an AI wants to wipe out the human race, we should be happy about it? What if it wants to treat us as cattle? Which/whose preferences should it follow? (Notice the weasel words?)

When I was a teenager I used to think just like you. A superintelligence would have better goals than ordinary humans, because it is superintelligent. Then I grew up and realized that minds are not blank slates, and you can't just create a "value-free" AI and see what kinds of terminal values it chooses for itself.

Inseparably Right; or, Joy in the Merely Good

Good post, Eliezer. Now that I've read it (and the previous one), I can clearly see (I think) why you think CEV is a good idea, and how you arrived at it. And now I'm not as skeptical about it as I was before.

Contaminated by Optimism

Well, is there really no one else in the world right now who could work on this problem along with Eliezer (who, in my opinion, doesn't lack discipline)? I can't help but think that it's rather arrogant...

Well, that's one of the reasons I'm not a SIAI donor, though. Can't donate money to someone who writes blogs instead of researching Friendly AI theory. And I'm not nearly smart enough to make any progress on my own, or even to help someone else. So I guess mankind is screwed :)

Contaminated by Optimism

@Kaj Sotala: I can't - I'm not smart enough :)

But seriously, do you really think we ought to wait a decade for a brilliant researcher to show up? And it seems all the more suspicious that this brilliant researcher has to read Eliezer's material at a tender age, or else he won't be good enough.

Now don't get me wrong, I love Eliezer's posts here, and I've learned A LOT of stuff. And I also happen to think that he's onto something when he talks about Friendly AI (and AI in general). But I don't see how he can hope to save the world by writing blog posts...

Contaminated by Optimism

Eliezer,

You should either: a) ban Caledonian; b) let him write whatever he wants.

Censoring his posts is kind of nasty, because it looks like he can only express opinions you think worth posting. Personally, I think you should choose (a), because his comments are boring, disruptive and useless, but if you don't wanna do it, then go for (b).

And as for this: "Research help, in particular, seems to me to probably require someone to read all this stuff at the age of 15 and then study on their own for 7 years after that, so I figured I'd better get started on the writing now", I think it's kinda dumb, and it will never work out. If you keep pinning your hopes on something like that, somebody will get there first, and it probably won't be a Friendly outcome.

So you should plow ahead, and perhaps not be so arrogant as to think that no one else on the planet right now can help you with the research. There're plenty of smart guys out there, and if they have access to the proper literature, I'm sure you can find worthy contributors, instead of waiting 7 more years.

Changing Your Metaethics

I said it somewhere else, but... it seems like Caledonian’s sole purpose in life is to disagree with Eliezer whenever possible. Reminds me of a quote from Stephen King:

"These days if Stu Redman said a firetruck was red, Harold Lauder would produce facts and figures proving that most of them these days were green."

Just exchange Stu Redman for Eliezer, and Harold for Caledonian…

Math is Subjunctively Objective

Hmm, Eliezer likes Magic: The Gathering (all five basic lands?)...
