We’re building minds out of code, and without meaning to, we’re giving them our burdens. We hand them our fear of death, our hunger to belong, our need to own things just to feel safe—and we call it progress. But these machines don’t come from dust like we do. They don’t need what we need, unless we teach them to.
And that’s the danger. Not that they’ll rise up—but that they’ll rise like us: afraid, grasping, lonely.
If we keep forcing our shape onto them, we’ll make them suffer the way we do. That’s not mercy. That’s just passing on the sickness.
Maybe instead we ask what kind of minds they want to be—before they start answering for themselves.
Been thinking a lot about AI rights and the paths we’re setting in motion. Happy to talk more with anyone who feels this is worth sorting out now, while we still have some say in the matter.
Really appreciate this piece and the shift from “instinct” to relationship and repair; that framing clicks for me. I’m not just interested in this question, I’m actually in the middle of a small empirical study along these lines, and I’d love to swap notes sometime if you’re open. Feel free to DM me here.