Most of the papers from the AGI-11 conference are now available online, including Yudkowsky's new paper: 'Complex Value Systems are Required to Realize Valuable Futures.'

Enjoy.


Mostly stuff we've heard before, although I'm sure it's useful to see it all in one place.

Eliezer presented this at the Winter Intelligence Conference.

http://www.vimeo.com/26914859

My response from last month is here.

It doesn't mean that efficient optimisers don't exhibit boredom. They do exhibit boredom - when exploring mostly-unexplored search spaces.

Isn't that like saying efficient optimizers experience hunger if/when they need to eat to live?

Also, consider rewriting the last paragraph. I think I don't understand it.

The word "exhibit" was intended as a reference to the associated behavioural manifestations - to things you can observe - such as stopping repetitious behaviour. Feel free to substitute "boredom-behaviours" for "boredom" if you find the original wording difficult to digest.

I'm surprised you linked to your old comment. For various reasons it looks like personal notes for writing a good post, not like something to inform or persuade. Consider redoing it.

Also, one suspects that for positions generally resembling the one I think you're taking, there is an error of equivocation or similar non-exactness. For example, between boredom as an experience and boredom behavior, or regarding whether or not boredom being partly non-instrumental (by your admission) is enough to save Yudkowsky from problems you bring up that are caused by boredom's instrumentality.

For such positions, there is a high standard of how clear the argument must be because it's difficult for me to keep track otherwise.

Also, one suspects that for positions generally resembling the one I think you're taking, there is an error of equivocation or similar non-exactness. For example, between boredom as an experience and boredom behavior [...]

The issue is whether creatures with purely instrumental boredom will create a "boring, worthless, valueless future".

Whether they have subjective boredom or just behavioural boredom seems like an irrelevant side-issue to me.

I have purely-instrumental boredom (to the best of my ability, of course).

Suggesting that purely-instrumental boredom will create a "boring, worthless, valueless future" is just a baseless criticism.

Yudkowsky's idea that such creatures will explore first and then get on with tedious exploiting makes little sense. For example, "exploring" the task of expanding through the galaxy (which is a universal instrumental value) is inevitably going to take a very long time - due to the scale of the experiments required.

Strange that this was downvoted with no explanation.

[anonymous]

Comment on Accelerating Future (This is a claim I haven't encountered before):

Is anybody seriously arguing at this point that simple (trivial) goal systems will suffice for an AGI to work the way we want it to? Yet this is the straw man that EY keeps attacking. Even Hibbard had complex goals in mind when he meant to keep humans "happy", although he did not communicate this well.