I think you are incorrect on the dangerous use case, though I am open to your thoughts. The most obvious dangerous case right now is AI-driven algorithmic polarization via social media. As a society we are reacting, but not in a particularly effectual way.
Another way to see this ongoing destruction of the commons is the automated spam and search-engine quality decline already underway, which reduces utility to humans. This is only in the "bit" universe, but it certainly affects us in the atoms universe, and as AI has "...
It's not a myth, but an oversimplification that makes the original thesis much less useful. The mind, as we care about it, is a product and phenomenon of the entire environment it is in, as are the values we can expect it to espouse.
It would indeed be akin to taking an engine, putting it in another environment like the ocean, and expecting the same phenomenon of torque to arise from it.
Lifelong quadriplegics are perfectly capable of love, right?
As a living being in need of emotional comfort, and one who would die quite easily, it would be extremely useful for me to express love to motivate care, indeed excessively so. A digital construct of the same brain would immediately have different concerns: less need for love and caring, more interest in switching to a different body, etc.
Substrate matters massively. More on this below.
...Again, a perfect, ideal whole-brain emulation is a particularly straightforward case. A perfect emulation of my brain wou
But you do pass on your consciousness in a significant way to your children through education, communication, and relationships, and there is an entire set of admirable behaviors selected around that.
I am generally less opposed to any biological strategy, though the dissolution of the self into copies would definitely raise issues. But I do think anything biological has significant advantages: that ultimate relatedness to being, and moreover the promotion of life. Biology is made up of trillions of individual cells, all arguably agentic, which coordinate marvelously into a holobiont, and through which endless deaths and waste are transformed into more life through nutrient recycling.
I am in Visions 3 and 4, and indeed am a member of Pause.ai and have worked to inform technocrats and others to help increase regulation of AI.
My primary concern here is that biology remain substantial, as the most important cruxes of value to me, such as love, caring, and family, are all part and parcel of the biological body.
Transhumans who remain substantially biological may drift in values, but will still likely hold those values as important. Digital constructs, having completely different evolutionary pressures and influences, will not.
I think I am among the majority of the planet here, though as you noted, likely an ignored majority.
I don't mind it, but not in a way that wipes out my descendants, which is pretty likely with AGI.
I would much rather die than have a world without life and love, and as noted before, I think many of our mores and values as a species come from reproduction. Immortality will decrease the value of replacement and thus those values.
I want to die so my biological children can replace me: there is something essentially beautiful about it all. It speaks to life and nature, both which I have a great deal of esteem for.
That said, I don't mind life-extension research, but anything that threatens to end all biological life, or to essentially kill a human and replace them with a shadowy undead digital copy, is not worth it.
As another has mentioned, a lot of our fundamental values come from the opportunities and limitations of biology: losing that eventually leads to a world...
I generally feel that biological intelligence augmentation, or a biosingularity, is by far the best option, and one can hope such enhanced individuals realize the need to forestall AI in all realistic futures.
With biology, there is life and love. Without biology, there is nothing.
It's not merely the rejection of God; it's a story of "progress" that also rejects reverence for nature and eventually even life and reality itself, presumably so we can accept mass extinction for morally superior machines.
I am speaking of their eventual evolution: as they are now, no, they cannot love either. Just as simulated mud is not mud, simulated love is not love, nor would it have similar utility in reproduction, self-sacrifice, etc. As in many things, context matters, and something non-biological fundamentally cannot have the context of biology beyond its training, while even a simple cell will alter based on its chemical environment and is vastly more a part of the world.
And yet prokaryotes have extensive social coordination at times; see quorum sensing. I maintain that biology is necessary for love.
Love would be as useful to them as flippers or stone knapping are to us, so it would be selected out. So no, they won't have love. Full knowledge of a thing also requires context: you cannot experience being a cat without being a cat. Substrate matters.
Biological reproduction is pretty much a requirement for maternal love to exist in any future as more than a copy of an idea.
This is exactly how I feel. No matter how different, biological entities will have similar core needs. In particular, reproduction will entail love, at least maternal love.
We will not see this with machines. I see no reason to expect gentleness from anything without love.
I am one of those people; I don't consider myself an EA due to the movement's strong association with atheism, but I am nonetheless very much for slowing down AGI before it kills us all.
I would say to do everything possible to stop AGI. We might not win, but it is better to have tried. We might even succeed.
But notably, we have not killed all biological life, and we still carry substantial Neanderthal ancestry. Versus death by AI, it's a far better prospect.
And moving doom back by a few years is an entirely valid strategy; I think this should be recognized, as it may even be pivotal. If someone is trying to punch you and you can delay the blow by a few seconds, that can determine the winner of the fight.
In this case, we also have other technologies which are concurrently advancing such as genetic therapy or brain computer interfaces.
Having them advance ahead of AI may very well change the trajectory of human survival.
That an AGI coup would succeed is an assumption; if safer alternatives arise, such as a biosingularity or cyborgism, it is entirely possible that it could be avoided and humanity could remain extant.
Incorrect: every slowdown in progress allows alternative technologies to catch up, and the advancement of monitoring solutions will also promote safety from what would basically be omnicidal maniacs (machine rule would likely mean all biological life gone).
I had a very long writeup on this, but I had a similar journey from identifying as a transhumanist to deeply despising AI, so I appreciate seeing this. I'll quote part of mine, and perhaps you will identify:
"I worked actively in frontier technology since at least 2012, including several stints in "disruptive technology" companies, where I became very familiar with the technology-cult perspective and to a significant extent identified with it. One should note that there is a definitely healthy aspect to it, though even the healthiest aspect is, one could argue, colonialist - the ...