Transhumanism thread in progress at Reddit

by Document · 1 min read · 25th Nov 2010 · 5 comments


Personal Blog

Starting with this reply to "You were born too soon":

> depending on when exactly we achieve this, this could be the best time to be born ever, because it will be the absolute earliest anybody will have achieved immortality. Someone born within 20 years of this moment could one day be the oldest human, sentient, or even living being in the Universe.

The comments are currently split between arguing and agreeing with this. So far, no mention of cryonics. One post presents a possibly interesting technical argument that our current knowledge/technology is centuries away from mind uploading/whole-brain emulation.

(Also posted to The Singularity in the Zeitgeist, but that thread seems to have been mostly forgotten.)

[anonymous] · 10y · 6

Possibly relevant: I am my connectome. It's already possible to model brains at the neuronal level -- to compute a graph of which neurons are connected to which others. The problem is scale: we've already completed C. elegans, we're still a few years away from a mouse connectome, and who knows about the human connectome.

Microscope resolution isn't the problem. The problem is the computing power required to get from tiny 3D cubes of microscope images to a graph of where the neurons are -- it's an expensive image-recognition problem.
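To give a sense of the scale problem, here is a rough back-of-envelope sketch in Python. The brain volume, voxel size, and bytes-per-voxel figures are illustrative assumptions for this estimate, not numbers from the comment:

```python
# Back-of-envelope estimate of the raw data volume involved in imaging a
# human brain at synapse-level resolution. All figures are rough,
# illustrative assumptions, not measurements.

brain_volume_m3 = 1.2e-3   # ~1.2 litres, a typical adult human brain
voxel_side_m = 10e-9       # assume ~10 nm voxels, fine enough to trace thin neurites
bytes_per_voxel = 1        # assume 8-bit greyscale electron-microscope imagery

voxels = brain_volume_m3 / voxel_side_m ** 3
raw_bytes = voxels * bytes_per_voxel

print(f"voxels:   {voxels:.1e}")
print(f"raw data: {raw_bytes / 1e21:.1f} zettabytes")
```

Under these assumptions the raw image stack alone comes out around a zettabyte, before any of the image-recognition work of turning voxels into a wiring diagram even begins.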

I am my connectome

That is about as true as "a city is its street map." A street map may uniquely correspond to a real city, but it's still a type error, and there is a strong possibility that significant information was never captured in what got put into the map.

The technical argument is referring to technologies that can scan a living, working brain, I think. If you're talking about frozen brains that you can slice up, there's a decent chance that the resolution of existing scanning electron microscopes could be good enough for brain emulation - the big problem is the sheer volume of material to be scanned.

Frankly, to count as a possibly interesting technical argument against mind uploading, it would have to be written explicitly in response to the Whole Brain Emulation Roadmap, which discusses all these issues (not just scanning but also the "jigsaw puzzle" issue) in detail.

Update: Another one's made the front page; Harvard scientists reverse the ageing process in mice – now for humans. They seem to come up pretty regularly if you follow the front page, actually. Some of the immediately visible futurist bait from this one (with similar sentiments repeated throughout it):

  • In response to a poster's admittedly confused-sounding plan to "train" an array of neuromorphic chips to emulate his brain: "A high-fidelity copy of you will exist. It will behave more or less just like you, but it's not the you inside of your head." (c177ia2) It's not clear whether they're debating a mystical notion of continuity or
  • "So that the likes of Donald Trump can live forever? Meh." (c176ngs)
  • "Step 3: Realize 300 years in that there really is no point in living to 1000, and that the world we live in actually kind of sucks. Die peacefully at 400." (c176e7c)
  • "Chances are in a million years you'll get hit by a bus. There is no immortality." (c176hth)
  • "Eradicating the man-made causes of cancer will not prevent cancer. It probably won't even make a statistically significant difference. You have to eradicate it entirely. Oh, by the way, that's impossible - sorry." (c176wt1)

Some of those have posts arguing against them, but some of those posts are embarrassingly clumsy; for instance, the prospect of global wireheading is repeatedly brought up as an argument against deathism. I don't feel like looking deeper at the moment.

There's also the thread "Do We Really Want Immortality? David Brin, Ph.D." in progress at /r/scifi, with a similar level of discourse, but it hasn't been frontpaged.

> It's not clear whether they're debating a mystical notion of continuity or

...or trying to point out that in a particular scenario, he'd have (from his present self's perspective) at best 50% anticipation of traditional survival, and that there'd be an instance of him at one point that had no anticipation of survival.