Wiki Contributions


I suspect a sufficiently intelligent, unaligned artificial intelligence would both kill us all and immediately start expanding its reach in all directions of space at near light speed. There is no reason for it to be an either-or.

A different measure than IQ might be useful at some point. An IQ of X effectively means you would need a population of at least Y humans to expect to find one person with an IQ of X. As IQs get larger, say over 300, the population required becomes ridiculous. And since there are intelligence levels that will not be found in a human population of any size, the minimum population needed to find someone with IQ X tends to infinity as IQ approaches some fixed ceiling (say, 1000). IQ above that point is undefined.
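For concreteness, here is a minimal sketch of that population-size calculation, assuming the standard definition where IQ is Normal(100, 15) by construction. (Note that under this idealized model the required population stays finite for every IQ, which is itself part of why the measure breaks down at the extremes.)

```python
import math

def min_population(iq, mean=100.0, sd=15.0):
    """Expected population size needed to find at least one person with
    an IQ of at least `iq`, assuming IQ is Normal(mean, sd) by construction."""
    z = (iq - mean) / sd
    p_at_least = 0.5 * math.erfc(z / math.sqrt(2.0))  # P(IQ >= iq)
    return 1.0 / p_at_least

for iq in (130, 160, 200, 300):
    print(f"IQ {iq}: population of about {min_population(iq):.2e}")
```

Even IQ 200 already demands a population far larger than has ever lived, and IQ 300 demands one far larger than the number of atoms in a human body, which is the "ridiculous" regime described above.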

It would be nice to find a new measure of intelligence that could capture differences between humans, and also differences between humans and AIs. But can we design such a measure? I think raw computing power doesn't work (how do you compare humans to one another? Or humans to an AI with great hardware but terrible software?).

Could you design a questionnaire that you know the correct answers to, that a very intelligent AI (500 IQ?) could not score perfectly on, but an extremely intelligent AI (1000+ IQ) could score perfectly on? If not, how could we design a measure of intelligence that goes beyond our own intelligence?

Maybe we could define an intelligence factor x like this: the average x value for humans is zero, and if your x value is 1 greater than mine, you will outwit me and get what you want 90% of the time whenever our utility functions are in direct conflict (only one of us can get what we want), assuming we have equal capabilities and the environment is sufficiently complex. On this scale, I suspect humans range in x-factor from -2 to 2, or -3 to 3 if we're being generous. It would also let us talk about superintelligences as having an x-factor of 5, or 10, and so on. For example, a superintelligence with an x-factor of 5 has some chance of winning against one with an x-factor of 6, but is basically outmatched by one with an x-factor of 8.
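One hypothetical way to extend this to arbitrary gaps is an Elo-like curve, chosen so that a 1-point gap gives exactly a 90% win rate. A minimal sketch (the base of 9 is my assumption; the definition above only pins down the gap-of-1 point):

```python
def win_probability(x_a, x_b):
    """Probability that agent A outwits agent B in a direct conflict,
    under an Elo-style model calibrated so that a 1-point x-factor
    gap gives a 90% win rate (hypothetical functional form)."""
    return 1.0 / (1.0 + 9.0 ** (x_b - x_a))

print(win_probability(5, 5))  # equal x-factors: a coin flip
print(win_probability(6, 5))  # a 1-point gap: ~0.9
print(win_probability(8, 5))  # a 3-point gap: near-certain victory
```

Like Elo, this makes win probability depend only on the gap, which matches the intuition that a 5-vs-6 match is winnable but a 5-vs-8 match basically isn't.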

The reason for the "sufficiently complex environment" clause is that superintelligences with x-factors of 10 and 20 may both find the physically optimal strategy for success in the real world, so who wins may simply come down to chance. We can say an environment where there ceases to be a difference between the strategies of intelligences with an x-factor of 5 and an x-factor of 6 has a complexity factor of 5. I would guess the real world has a complexity factor of around 8, but I have no idea.

I would be terrified of any AI with an x-factor of 4-ish, and Yudkowsky seems to be describing an AI with an x-factor of 5 or 6.

It's funny: the generated text reminds me of babbling.

The dividend part made more sense: when people have more money, they can spend it on what's most urgent. And they know what that is better than anyone else.

Yes! I apologize that my writing was a bit unclear; I didn't mean to advocate for specific legal rights, such as a right to a decent home, but rather for a system under which everyone can afford a decent home, if they choose to buy one. That said, I'm not against some of these rights being enforced legally (Canadian here, and a huge fan of our healthcare system, except for the fact that we don't include dental or eye care. Are my eyes and teeth not a part of my body?).

It is possible that other solutions would solve the problems I outline. Taxing companies more could help, though it does create a drag on the economy; also, companies can move overseas, but land cannot. I don't think tying the minimum wage to some measure of inflation would work, because landlords can simply eat the extra wages. I think rents are determined by what people can afford, which is to say, what landlords can get away with charging.

You're probably right about the word 'Manifesto'. I've changed this now.

(What is a spam list?)

A link would be nice, in case people reading this (the original post) haven't read that.

Good idea!

Sure, but it's a question of magnitudes. My claim is that what Joe Rogan is saying on his podcast has less impact on your life than the fact that the value of the land is being sucked up by landlords, rather than being shared by all. Of course, this isn't the only important issue facing our society, I just think it's one of the most important (aside from existential risks, probably), and that much less important issues only serve as a distraction from the more important ones.

Software: emacs

Need: Code editor (and personal information management system, and the only good git ui, and an email client, and...)

Other programs I've tried: Sublime Text, Atom, VS Code, vim

Why emacs is the best: Emacs can be whatever you want it to be. It can do anything and everything, all in one unified space where all your keybindings work, all your plugins work, etc. There is literally nothing about it you can't change, and people have created many "modes" for it that do a lot of things. In particular, org-mode renders all of those todo apps pointless, because it's far better, and really the only viable option for personal information management. If you would rather use a UI for git than the command line, magit (an emacs mode) is likewise your only viable option.

Don't bother with it, though, if you don't have some time to invest in learning it (the same goes for any powerful tool). I also use evil-mode, because more thought went into vim's keybindings than emacs's. Honestly, emacs feels like an accident that has evolved over time into something amazing (think JavaScript), so there are some terrible defaults and rough edges, but all of that can be changed. I'd recommend starting with Doom emacs, because they've already done the job of creating a good set of defaults.

There is, in fact, a sedative level, and higher doses aren't less effective, they just induce more side effects, from what I understand. I tried every dose under the sun, including tiny ones. The effect was always weak at best.
