clone of saturn

Comments

Why is it assumed that there's a dichotomy between expressing strength or creative genius and helping others? It seems like the truly excellent would have no problem doing both, and if the only way you can express your vitality is by keeping others in poverty, that actually seems kind of sad and pathetic and not very excellent.

Note that the continuity you feel is strictly backwards-looking; we have no way to call up the you of a year ago to confirm that he still agrees that he's continuous with the you of now. In fact, he is dead, having been destructively transformed into the you of now. So what makes one destructive transformation different from another, as long as the resulting being continues believing he is you?

From what I understand, they are using a forked version of Nitter which uses fully registered accounts rather than temporary anonymous access tokens, and sourcing those accounts from various shady websites that sell them in bulk.

Based on this comment I guess by "existing" you mean phenomenal consciousness and by "awareness" you mean behavior? I think the set of brainlike things that have the same phenomenal consciousness as me is a subset of the brainlike things that have the same behavior as me.

There seems to generally be a ton of arbitrary path-dependent stuff everywhere in biology that evolution hasn't yet optimized away, and I don't see a reason to expect the brain's implementation of consciousness to be an exception.

If it's immediate enough that all the copies end up indistinguishable, with the same memories of the copying process, then uniform, otherwise not uniform.

Answer by clone of saturn

I think the standard argument that quantum states are not relevant to cognitive processes is The importance of quantum decoherence in brain processes. This is enough to convince me that going through a classical teleporter or copying machine would preserve my identity, and in the case of a copying machine I would experience an equal subjective probability of coming out as the original or the copy. It also seems to strongly imply that mind uploading onto some kind of classical artificial machine is possible, since it's unlikely that all or even most of the classical properties of the brain are essential. I agree that there's an open question about whether mind emulation on an arbitrary substrate (for instance, software running on CMOS computer chips) preserves identity even if it shows the same behavior as the original.

Answer by clone of saturn

You missed what I think would be by far the largest category, regulatory capture: jobs where the law specifically requires a human to do a particular task, even if it's just putting a stamp of approval on an AI's work. There are already a lot of these, but it seems like it would be a good idea to create even more, and add rate limits to existing ones.

A big difference is that, assuming you're talking about futures in which AI hasn't caused catastrophic outcomes, no one will be forcibly mandated to do anything.

Why do you believe this? It seems to me that in the unlikely event that the AI doesn't exterminate humanity, it's much more likely to be aligned with the expressed values of whoever has their hands on the controls at the moment of no return than with an overriding commitment to universal individual choice.
