Comments

Yep, but of course the common opinion on Hacker News is that this is horrible.

I also find the wording of the saying unclear, and usually say, "eat your cake and still have it".

I don't usually comment on here, but I wanted to mention that a friend of mine had his entire connected bank account drained through PayPal (by a third party; PayPal did nothing about it), and that simply not holding a balance within PayPal is not enough. You have to close the PayPal account.

I can't provide evidence of this, but you can see similar stories online.

I suspect a sufficiently intelligent, unaligned artificial intelligence would kill us all immediately and simultaneously start expanding its reach in all directions of space at near light speed. There is no reason for it to be an either-or.

A different measure than IQ might be useful at some point. An IQ of X effectively means you would need a population of at least Y humans to expect to find one person with an IQ of X. As IQ gets larger, say over 300, the population you would need becomes ridiculous. And since there are intelligence levels that will not be found in a human population of any size, the minimum population needed to expect to find someone with IQ X tends to infinity as IQ approaches some fixed value (say, 1000). IQ above that point is undefined.
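
To make the growth rate concrete, here is a rough sketch that assumes the textbook statistical definition of IQ (normally distributed with mean 100 and standard deviation 15); it only illustrates how fast the required population explodes, not the ceiling point above, and the function name is just for illustration:

```python
from scipy.stats import norm

def population_needed(iq, mean=100.0, sd=15.0):
    """Expected population size needed to find at least one person at or above `iq`,
    assuming IQ is normally distributed with the given mean and standard deviation."""
    tail = norm.sf((iq - mean) / sd)  # P(a random person has IQ >= iq)
    return float("inf") if tail == 0.0 else 1.0 / tail

for iq in (145, 160, 200, 300):
    print(iq, f"{population_needed(iq):.3g}")
# roughly: 145 -> 7.4e2, 160 -> 3.2e4, 200 -> 7.7e10, 300 -> 1.3e40
```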

It would be nice to find a new measure of intelligence that could capture differences between humans, and also differences between humans and AIs. But can we design such a measure? I don't think raw computing power works (how do you compare humans to other humans? Or humans to an AI with great hardware but terrible software?)

Could you design a questionnaire that you know the correct answers to, that a very intelligent AI (500 IQ?) could not score perfectly on, but an extremely intelligent AI (1000+ IQ) could score perfectly on? If not, how could we design a measure of intelligence that goes beyond our own intelligence?

Maybe we could define an intelligence factor x as something like this: the average x value for humans is zero, and if your x value is 1 greater than mine, you will outwit me and get what you want 90% of the time, given that our utility functions are in direct conflict (only one of us can get what we want), we have equal capabilities, and the environment is sufficiently complex. On this scale, I suspect humans range in x-factor from about -2 to 2, or -3 to 3 if we're being generous. The scale would let us talk about superintelligences as having an x-factor of 5, or 10, and so on. For example, a superintelligence with an x-factor of 5 has some chance of winning against one with an x-factor of 6, but is basically outmatched by one with an x-factor of 8.
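
If it helps to picture the scale, here is a rough sketch that assumes (my assumption, not implied by the definition above) that each extra point of x multiplies the odds of winning by 9, Elo-style, so that a 1-point gap gives exactly 90%:

```python
def win_probability(x_self, x_opponent):
    """Probability that the agent with x_self outwits the agent with x_opponent,
    assuming each point of x difference multiplies the odds of winning by 9."""
    odds = 9.0 ** (x_self - x_opponent)
    return odds / (1.0 + odds)

for gap in (1, 2, 3):
    print(gap, round(win_probability(gap, 0), 4))
# 1 -> 0.9, 2 -> 0.9878, 3 -> 0.9986
```

Under that assumption, the 5-versus-6 matchup gives the weaker side a 10% chance, while 5-versus-8 gives it about 0.14% (one in 730), which roughly matches "some chance" versus "basically outmatched".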

The reason the "sufficiently complex environment" clause exists is that superintelligences with x-factors of 10 and 20 may both find the physically optimal strategy for success in the real world, in which case who wins may simply come down to chance. We can say an environment where there ceases to be a difference between the strategies of intelligences with an x-factor of 5 and an x-factor of 6 has a complexity factor of 5. I would guess the real world has a complexity factor of around 8, but I have no idea.

I would be terrified of any AI with an x-factor of 4-ish, and Yudkowsky seems to be describing an AI with an x-factor of 5 or 6.

It's funny, the generated text reminds me of babbling.

The dividend part made more sense: when people have more money, they can spend it on whatever is most urgent, and they know what that is better than anyone else.

Yes! I apologize that my writing was a bit unclear; I didn't mean to advocate for specific legal rights such as a right to a decent home, but rather to advocate for a system under which everyone can afford a decent home, if they choose to buy one. That said, I'm not against some of these rights being enforced legally (Canadian here, and a huge fan of our healthcare system, except that it doesn't include dental or eye care. Are my eyes and teeth not a part of my body?)

It is possible that other solutions would work for the problems I outline. Taxing companies more could help, though it does create a drag on the economy; also, companies can move overseas, but land cannot. I don't think tying the minimum wage to some measure of inflation would work, because landlords can simply absorb the extra wages. I think rents are determined by what people can afford, AKA what landlords can get away with charging.

You're probably right about the word 'Manifesto'. I've changed this now.

(What is a spam list?)

A link would be nice, in case people reading the original post haven't read that.

Good idea!
