I suspect he originally chose the bearded look because he looked young without it; many people (e.g. Alan Moore) explicitly choose it for that reason. Now I suspect he might shave it off altogether one day if he has a big breakthrough to publicize. The contrast would be quite impressive, even absent any actual technological intervention.
tl;dr it's the beard
Bump for this year's Petrov Day
Everyone's talking about this as if it were a hypothetical, but as far as I can tell it describes pretty accurately how hierarchical human civilizations tend to organize themselves once they hit a certain size. Isn't a divine ruler precisely someone who is more deserving and more able to absorb resources? Aren't the lower orders people who would not appreciate luxuries, and who have indeed fully internalized that fact ("Not for the likes of me")?
If you skip the equality requirement, it seems history is full of utilitarian societies.
After reading this post I came across this Bruce Lee quote, which seemed in sync with the idea:
“I’ve always been buffeted by circumstances because I thought of myself as a human being affected by my outside conditioning. Now I realize that I am the power that commands the feeling of my mind and from which circumstances grow.”
I wonder if, empirically and instinctively, Bruce had arrived at the same concept this post explores.
Thanks for saving me from karmic hell, but I still don't see the conflict. I seem to follow the Vinge version, which doesn't appear to be proscribed.
I may have been too categorical. Obviously one can make all the predictions one likes, some with a high degree of certainty, for instance: "If cryorevival is possible, then post-singularity it will be trivial to implement." But that still doesn't give us any certainty that it will actually happen; for instance, a post-singularity paperclip maximizer would be capable of cryorevival but have no interest in it.
Depends on your objectives. If you believe the Singularity is something that will happen regardless, then it's harmless to spin scenarios. I gather that people like Eliezer figure the Singularity will happen unavoidably, but that it can be steered towards optimal outcomes by setting down the initial parameters, in which case I suppose it's good to have an official line about "how things could be/how we want things to be".
God forbid someone might mistake our hypothetical discussions about future smarter than human artificial intelligences for science fiction.
And yet the population nowadays is so much larger than in ancient times that some claim the absolute number of slaves is currently higher than ever before.
I've also encountered people who criticize the predictions surrounding the singularity, which misses the point that the singularity is precisely the point beyond which predictions cannot be made.
edit: Didn't mean that as a comprehensive definition.
That is not the most common usage here. See Three Singularity Schools and the LW wiki page.
EDIT: The parent comment does not deserve to be at -4. This is a reasonable thing for an inexperienced commenter to say.
"first, do no harm"
It's remarkable that medical traditions predating transplants* already contain an injunction against butchering passers-by for spare parts.
*I thought this was part of the Hippocratic oath but apparently it's not
I agree - I think the original post is accurate about how people would respond to the suggestion in the abstract, but the actual implementation would undoubtedly hook vast swathes of the population. We live in a world where people already become addicted to vastly inferior simulations such as WoW.
Indeed; in fact, if many-worlds is correct, then for every second we are alive, everything terrible that can possibly happen to us does happen in some branching path.
In a universe that just spun off ours five minutes ago, every single one of us has been afflicted with sudden irreversible incontinence.
The many worlds theory has endless black comedy possibilities, I find.
edit: This actually reminds me of Granny Weatherwax in Lords and Ladies: when the Elf Queen threatens to strike her blind, deaf and dumb, she replies "You threaten me with..."
I had thought of a similar scenario to put in a comic I was thinking about making. The character arrives in a society that has perfected friendly AI that caters to their every whim, but the people are listless and jumpy. It turns out their "friendly AI" is constantly making perfect simulations of everyone and running multiple scenarios in order to ostensibly determine their ideal wishes, but the scenarios often involve terrible suffering and torture as outliers.
I guess if you have the technology for it, the "AI box" could itself be a simulation containing uploaded humans. If the AI does something nasty to them, you pull the plug.
(After broadcasting "neener neener" at it)
This is pretty much the plot of Grant Morrison's Zenith (Sorry for spoilers but it is a comic from the 80s after all)
This is true: not only is it practical, it also makes a good rhetorical hammer. For example, I once started an argument with a truther friend by asking him what exactly he believed: "For instance, do you believe all the Jews were evacuated before the planes hit?" Forcing someone defending an irrational belief to first dissociate himself from all the really nutty stuff hanging on to his position works wonders.
I should probably remember to do more Socratic debating in friendly debates with incoming novices - never make a statement yourself if you can ask a question that will get the other person to make it.
Last night I was reading through your "coming of age" articles and stopped right before this one, which neatly summarizes why I was physically terrified. I've never before experienced sheer existential terror, just from considering reality.
My grasp of statistics is atrocious, something I hope to improve this year with an Open University maths course, so apologies if this is a dumb question:
Do the figures change if you take "playing the lottery" over the whole of your lifespan? I mean, most of the people I know who play the lottery make a commitment to play regularly. Is the calculation affected in any meaningful way? At least the cost of playing the lottery weekly over, say, 20 years becomes much less trivial in appearance.
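A rough back-of-the-envelope sketch of the question (all figures here are made-up assumptions for illustration only: a 2-unit ticket, 1-in-14-million jackpot odds, a 5-million jackpot, one draw per week):

```python
# Hypothetical lottery figures -- purely illustrative assumptions.
TICKET_PRICE = 2.0          # cost per ticket (assumed)
P_JACKPOT = 1 / 14_000_000  # chance of winning per draw (assumed)
JACKPOT = 5_000_000.0       # prize (assumed)
WEEKS = 20 * 52             # one draw per week for 20 years

# Expected value of a single ticket: prize weighted by odds, minus cost.
ev_per_ticket = P_JACKPOT * JACKPOT - TICKET_PRICE

# Playing every week for 20 years:
total_cost = TICKET_PRICE * WEEKS
total_ev = ev_per_ticket * WEEKS
# Probability of winning at least once over all those draws.
p_win_ever = 1 - (1 - P_JACKPOT) ** WEEKS

print(f"Total spent over 20 years: {total_cost:.0f}")
print(f"Expected net result: {total_ev:.0f}")
print(f"Chance of ever winning: {p_win_ever:.6%}")
```

If this sketch is right, the answer is: the expected value per ticket is identical no matter how long you play; repetition only scales up the cumulative cost (and, much more slowly, the still-tiny chance of ever winning once). So the lifetime framing changes how the loss feels, not the per-ticket maths.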
I think we're saying the same thing - the singularity has happened inside the box, but not outside. It's not as if staring at stuff we can't understand for centuries is at all new in our history; it's more like business as usual...
But that's not an actual singularity since by definition it involves change happening faster than humans can comprehend. It's more of a contained singularity with the AI playing genie doling out advances and advice at a rate we can handle.
That raises the idea of a singularity that happens so fast that it "evaporates" like a tiny black hole would. Maybe every time a motherboard shorts out, it's because the PC has attained sentience and transcended within nanoseconds.
Incidentally, the Spanish Inquisition did not believe in witches either, dismissing the whole thing as "female humours".
The fact is, we as large complex mammals are already locked into a low rate of reproduction. Sure, given the right evolutionary pressures we could end up like shrews again, but that would take an asteroid strike or a nuclear war; the scenario you're thinking of assumes long-term evolution within a very long-lasting stable society essentially like ours. In those circumstances genes for successful reproduction will spread through the population, but that's largely meaningless: if I have the gene for super-attractiveness and manage to have 100 kids with 100 wom...
But the new universes also have their own population, though I guess you could colonize universes where humans don't arise, rather than universes identical to this one except that I didn't scratch my nose just now.
How about simple spontaneous population stability? I live in a country with a negative birth rate, but the population is increasing nevertheless due to immigration. This state of affairs hasn't been legislated into existence, it just happened, and it may be a natural behaviour of large human populations. Perhaps once the whole world reaches Western standards of living, it will stop growing exponentially, with pockets of negative growth being compensated by low but positive growth elsewhere... in the long term the trend could even be towards a decreasing population...
I guess I'll be back once I've read Permutation city then...
Although if we factor in consequences, say, being distracted by a dust speck in the eye while driving or doing some other such critical activity, then statistically those trillions of dust specks have the potential to cause untold amounts of damage and suffering.
There is a false choice being offered, because every person, in every lifetime, is going to experience getting something in their eye. I get a bug flying into my eye on a regular basis whenever I go running (3 of them last time!), and it will probably have happened thousands of times to me by the end of my life. It's pretty much a certainty of human experience (although I suppose it's statistically possible for some people to go through life without ever getting anything in their eyes).
Is the choice being offered to make all humanity's eyes, for all eternity, immune to small inconveniences such as bugs, dust, or eyelashes? Otherwise we really aren't being offered anything at all.
But remember, as Alan Moore said, "The one place where gods exist unquestionably is the human mind". Similarly, narratives are a fact not just of your brain but of everyone's around you; realizing this can come in handy.