
Aren’t turned off by perceived arrogance

One hypothesis I've had is that people with more MIRI-like views tend to be more arrogant themselves. A possible mechanism is that the idea that the world is going to end and that they are the only ones who can save it is appealing in a way that shifts their views on certain questions and changes how they think about AI (e.g., they need less convincing that they are some of the most important people ever, so they spend less time considering why AI might go well by default).

[ETA: In case it wasn't clear, I am positing subconscious patterns correlated with arrogance that lead to MIRI-like views]

How'd this go? I just searched LW for "neurofeedback" since I recently learned about it.

We are very likely not going to miss out on alignment by a 2x productivity boost, that’s not how things end up in the real world. We’ll either solve alignment or miss by a factor of >10x.

Why is this true?

the genome can’t directly make us afraid of death

It's not necessarily direct, but in case you aren't aware of it, prepared learning is a relevant phenomenon, since apparently the genome does predispose us to certain fears.

Seems like this guy has already started trying to use GPT-3 in a videogame: GPT3 AI Game Prototype

Not sure if it was clear, but I asked because it seems like if you think the fraction changes significantly before AGI, then the claim Thane quotes in the top-level comment wouldn't be true.

Don't timelines change your views on takeoff speeds? If not, what's an example piece of evidence that updates your timelines but not your takeoff speeds?

Same - also interested if John was assuming that the fraction of deployment labor that is automated changes negligibly over time pre-AGI.

Humans can change their action patterns on a dime, inspired by philosophical arguments, convinced by logic, indoctrinated by political or religious rhetoric, or plainly because they're forced to.

I'd add that action patterns can change for reasons other than logical or deliberative ones. For example, adapting to a new culture means you might adopt new reactions to objects, gestures, etc. that are considered symbolic in that culture.
