Lichdar

I think you are incorrect on the dangerous use case, though I am open to your thoughts. The most obvious dangerous case right now, for example, is AI-driven algorithmic polarization via social media. As a society we are reacting, but it doesn't seem to be in a particularly effectual way.
Another way to see this ongoing destruction of the commons is the automated spam and search engine quality decline already happening, which reduces utility to humans. This is only in the "bit" universe, but it certainly affects us in the "atom" universe, and as AI gains "atom" universe effects, I can see similar pollution being very negative for us.
Banning seems hard, even for obviously bad use cases like deepfakes, though reality might prove me wrong (happily!) there.
It's not a myth, but an oversimplification which makes the original thesis much less useful. The mind, as we care about it, is a product and phenomenon of the entire environment it is in, as are the values we can expect it to espouse.
It would indeed be akin to taking an engine, putting it in another environment like the ocean, and expecting the same phenomenon of torque to arise from it.
Lifelong quadriplegics are perfectly capable of love, right?
For a living being in need of emotional comfort, one who would die quite easily, it would be extremely useful to express love to motivate care, and indeed excessively so. A digital construct of the same brain would immediately have different concerns: less need for love and caring, more desire to switch to a different body, and so on.
Substrate matters massively. More on this below.
Again, a perfect, idealized whole-brain emulation is a particularly straightforward case. A perfect emulation of my brain would have the same values as me, right?
Nope! This is a very common and widespread error, which I suppose comes from the idea that the mind...
But you do pass on your consciousness in a significant way to your children through education, communication, and relationships, and there is an entire set of admirable behaviors selected around that.
I generally am less opposed to any biological strategy, though the dissolution of the self into copies would definitely raise issues. But I do think that anything biological has significant advantages in that ultimate relatedness to being, and moreover in the promotion of life: biology is made up of trillions of individual cells, all arguably agentic, which coordinate marvelously into a holobiont and through which endless deaths and waste all transform into more life through nutrient recycling.
I am in Visions 3 and 4, and indeed am a member of Pause.ai and have worked to inform technocrats, etc., to help increase regulations on it.
My primary concern here is that biology remain central, as the most important cruxes of value to me, such as love, caring, and family, are all part and parcel of the biological body.
Transhumans who are still substantially biological, while they may drift in values substantially, will still likely hold those values as important. Digital constructions, having completely different evolutionary pressures and influences, will not.
I think I am among the majority of the planet here, though as you noted, likely an ignored majority.
I don't mind it: but not in a way that wipes out my descendants, which is pretty likely with AGI.
I would much rather die than have a world without life and love, and as noted before, I think many of our mores and values as a species come from reproduction. Immortality will decrease the value of replacement and thus those values.
I want to die so my biological children can replace me: there is something essentially beautiful about it all. It speaks to life and nature, both which I have a great deal of esteem for.
That said, I don't mind life extension research, but anything that threatens to end all biological life, or that essentially kills a human to replace them with a shadowy undead digital copy, is not worth it.
As another has mentioned, a lot of our fundamental values come from the opportunities and limitations of biology: fundamentally losing that eventually leads to a world without life, love, or meaning. As we are holobionts, each change will bring substantial downstream loss, and likely not to a good end.
As far as I am concerned, immortality comes from reproduction, and the vast array of behaviors around it is fundamentally beautiful and worthwhile.
I generally feel that biological intelligence augmentation, or a biosingularity, is by far the best option, and one can hope such enhanced individuals will realize the need to forestall AI in all realistic futures.
With biology, there is life and love. Without biology, there is nothing.
It's not merely the rejection of God; it's a story of "progress" that also rejects reverence for nature and eventually even life and reality itself, presumably so we can accept mass extinction for morally superior machines.
I had a very long writeup on this, and I had a similar journey from identifying as a transhumanist to deeply despising AI, so I appreciate seeing this. I'll quote part of mine; perhaps you'll identify with it:
"I worked actively in frontier technology since at least 2012, including several stints in "disruptive technology" companies, where I became very familiar with the technology-cult perspective and, to a significant extent, identified with it. One should note that there is a definitely healthy aspect to it, though even the healthiest aspect is, as one could argue, colonialist: the idea of destructive change in order to "make a better world."
Until 2023...
And yet I also had a deep and...