All of Bugle's Comments + Replies

Interview with Aubrey de Grey, chief science officer of the SENS Research Foundation

I suspect he chose the bearded look originally because he looked young without it; many people (e.g. Alan Moore) explicitly choose it for that reason. Now, I suspect he might shave it off altogether one day if he has a big breakthrough to publicize. The contrast would be quite impressive, even absent any actual technological intervention.

tl;dr it's the beard

emanuele ascani (2y, +1): Absolutely no one had thought of that in the YouTube comment section under his interview with JRE
9/26 is Petrov Day

Bump for this year's Petrov Day

SMBC comic: poorly programmed average-utility-maximizing AI

Everyone's talking about this as if it were a hypothetical, but as far as I can tell it describes pretty accurately how hierarchical human civilizations tend to organize themselves once they hit a certain size. Isn't a divine ruler precisely someone who is more deserving and more able to absorb resources? Aren't the lower orders people who would not appreciate luxuries, and who have indeed fully internalized that fact ("Not for the likes of me")?

If you skip the equality requirement, it seems history is full of utilitarian societies.

Attention control is critical for changing/increasing/altering motivation

After reading this post I came across this Bruce Lee quote, which seemed in sync with the idea:

“I’ve always been buffeted by circumstances because I thought of myself as a human being affected by my outside conditioning. Now I realize that I am the power that commands the feeling of my mind and from which circumstances grow.”

I wonder whether, empirically and instinctively, Bruce had arrived at the same concept this post explores.

A survey of anti-cryonics writing

Thanks for saving me from karmic hell, but I still don't see the conflict. I seem to follow the Vinge version, which doesn't appear to be proscribed.

I may have been too categorical. Obviously one can make all the predictions one likes, some with a high degree of certainty, for instance "If cryorevival is possible, then post-singularity it will be trivial to implement." But that still doesn't give us any certainty that this will be so: a post-singularity paperclip maximizer, for instance, would be capable of cryorevival but have no interest in it.

A survey of anti-cryonics writing

Depends on your objectives. If you believe the singularity is something that will happen regardless, then it's harmless to spin scenarios. I gather that people like Eliezer figure the Singularity will happen unavoidably, but that it can be steered towards optimal outcomes by setting down the initial parameters, in which case I suppose it's good to have an official line about "how things could be / how we want things to be".

A survey of anti-cryonics writing

God forbid someone might mistake our hypothetical discussions about future smarter than human artificial intelligences for science fiction.

A survey of anti-cryonics writing

And yet the population nowadays is so much larger than in ancient times that there are claims the absolute number of slaves is currently higher than ever before.

kragensitaker (12y, +6): The number given in that article is 27 million slaves. Yet Wikipedia claims [http://en.wikipedia.org/wiki/World_population#Population_figures] that 55 million people lived in the Roman Empire in AD 300-400. Were fewer than half of them slaves? (And that's ignoring the slaves in the rest of the world at the time.) The same page claims that in 1750, the world population was almost 800 million. Were 29 out of 30 people at the time really free? Surely slavery was more widespread among the hierarchical city cultures that left written records than among the "barbarians", but it's hard to imagine that the number has always been less than 27 million. During the Middle Ages, throughout all of Europe, the vast majority of the people were serfs, bound to their land, living and dying at the mercy of their lords. If it's true that 249 out of every 250 people today are free, that sounds like a huge improvement over almost all of human history.
Kutta (12y, +2): Moral standards in general can improve regardless of the number of people involved. Besides, one could argue that having more slaves is outweighed by having many more non-slaves living good lives. In regard to cryonics: all else being equal, I'd prefer to be reanimated in a world with a low slave to non-slave ratio if I wanted the probability of my becoming a slave to be as low as possible.
A survey of anti-cryonics writing

I've also encountered people who criticize the predictions surrounding the singularity, which misses the point that the singularity is the point beyond which predictions cannot be made.

edit: Didn't mean that as a comprehensive definition.

timtyler (12y, -1): There is no "point beyond which predictions cannot be made". That is a SF fantasy.

That is not the most common usage here. See Three Singularity Schools and the LW wiki page.

EDIT: The parent comment does not deserve to be at -4. This is a reasonable thing for an inexperienced commenter to say.

wedrifid (12y, 0): I dispute that point.
ata (12y, +7): If that were true about the Singularity, then wouldn't it be correct to criticize the people who make predictions about it?
The Least Convenient Possible World

"first, do no harm"

It's remarkable that medical traditions predating transplants* already contain an injunction against butchering passers-by for spare parts.

*I thought this was part of the Hippocratic oath but apparently it's not

thomblake (12y, 0): An injunction to do no harm is part of the Hippocratic oath, and the actual text has multiple translations, so I don't think it's too far-fetched to attribute "first, do no harm" to the oath.
A Much Better Life?

I agree - I think the original post is accurate about how people would respond to the suggestion in the abstract, but the actual implementation would undoubtedly hook vast swathes of the population. We already live in a world where people become addicted to vastly inferior simulations such as WoW.

Shae (12y, +1): I disagree. I think that even the average long-term tortured prisoner would balk and resist if you walked up to him with this machine. In fact, I think fewer people would accept in real life than those who claim they would in conversations like these. The resistance may in fact reveal an inability to properly conceptualize the machine working, or it may not. As others have said, maybe you don't want to do something you think is wrong (like abandoning your relatives or being unproductive) even if later you're guaranteed to forget all about it and live in bliss. What if the machine ran on tortured animals? Or tortured humans that you don't know? That shouldn't bother you any more than if it didn't, if all that matters is how you feel once you're hooked up. We have some present-day corollaries. What about a lobotomy, or suicide? Even if these can be shown to be a guaranteed escape from unhappiness or neuroses, most people aren't interested, including some really unhappy people.
The AI in a box boxes you

Indeed, in fact if many worlds is correct then for every second we are alive everything terrible that can possibly happen to us does in fact happen in some branching path.

In a universe that just spun off ours five minutes ago, every single one of us has been afflicted with sudden irreversible incontinence.

The many worlds theory has endless black comedy possibilities, I find.

edit: this actually reminds me of Granny Weatherwax in Lords and Ladies; when the Elf Queen threatens to strike her blind, deaf and dumb, she replies "You threaten me with..."

The AI in a box boxes you

I had thought of a similar scenario to put in a comic I was thinking about making. The character arrives in a society that has perfected friendly AI that caters to their every whim, but the people are listless and jumpy. It turns out their "friendly AI" is constantly making perfect simulations of everyone and running multiple scenarios in order to ostensibly determine their ideal wishes, but the scenarios often involve terrible suffering and torture as outliers.

Document (12y, +2): For the record, EY considers that a legitimate danger [http://lesswrong.com/lw/x4/nonperson_predicates/].
Nisan (12y, +1): As long as the simulations which involve terrible suffering constitute a tiny proportion of the simulations, your response ought to be the same as if there were only one copy of you and it had a tiny probability of suffering terribly – which is just like real life. ETA: What you ought to worry about is what will happen to you after the AI is done with the simulation.
Open Thread: February 2010

I guess if you have the technology for it, the "AI box" could itself be a simulation with uploaded humans. If the AI does something nasty to them, then you pull the plug.

(After broadcasting "neener neener" at it)

This is pretty much the plot of Grant Morrison's Zenith (Sorry for spoilers but it is a comic from the 80s after all)

Logical Rudeness

This is true: not only is it practical, it also makes a good rhetorical hammer. For example, I once started an argument with a truther friend by asking him what exactly he believed: "For instance, do you believe all the Jews were evacuated before the planes hit?" Forcing someone defending an irrational belief to first disassociate himself from all the really nutty stuff hanging on to his position works wonders.

I should probably remember to do more Socratic debating in friendly debates with incoming novices - never make a statement yourself if you can ask a question that will get the other person to make it.

Beyond the Reach of God

Last night I was reading through your "coming of age" articles and stopped right before this one, which neatly summarizes why I was physically terrified. I've never before experienced sheer existential terror, just from considering reality.

The Prediction Hierarchy

My grasp of statistics is atrocious, something I hope to improve this year with an Open University maths course, so apologies if this is a dumb question:

Do the figures change if you take "playing the lottery" as extending over your whole lifespan? I mean, most of the people I know who play the lottery make a commitment to play regularly. Is the calculation affected in any meaningful way? At least the costs of playing the lottery weekly over, say, 20 years appear much less trivial.

RobinZ (12y, 0): As mattnewport and LucasSloan point out, it doesn't change the actual numbers - a bad bet multiplied a thousandfold is still a bad bet - but it does change the wrong numbers: buying a thousand tickets for a 0.01% chance of a million dollars is a losing bet again.* More evidence that the ignorance argument fails. * How I calculate this (changes in italics):
LucasSloan (12y, +1): Your odds of winning once go up as you increase the number of tickets you buy (# of tickets purchased × chance of winning per ticket). The expected value of a given ticket remains the same. All you are doing is funneling more money away from other possibilities. If you buy 5 tickets a week for your entire life, and the odds of winning are 1 in 100 million, then you have a 0.000169 chance of winning the lottery, but you could have spent your 16 thousand on a new TV or a vacation.
mattnewport (12y, +2): If by 'do the figures change' you mean 'does it ever become a good bet' then no.
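The arithmetic in this thread can be checked in a few lines. This is a minimal sketch: the per-ticket odds and the 5-tickets-a-week lifetime come from LucasSloan's figures, while the $1 ticket price and $10M jackpot are assumptions added purely for illustration.

```python
# Lifetime lottery play: buying more tickets scales the chance of
# winning at least once, but never improves the value per dollar.
p_per_ticket = 1 / 100_000_000   # 1 in 100 million, per the thread
tickets = 5 * 52 * 65            # 5 tickets/week for ~65 years = 16,900
ticket_price = 1.0               # assumed $1 per ticket
jackpot = 10_000_000             # assumed prize, for illustration only

p_at_least_one_win = 1 - (1 - p_per_ticket) ** tickets
total_cost = tickets * ticket_price
expected_winnings = tickets * p_per_ticket * jackpot

print(round(p_at_least_one_win, 6))   # 0.000169, matching LucasSloan
print(round(total_cost))              # 16900, the "16 thousand"
print(round(expected_winnings, 2))    # 1690.0: ten cents back per dollar
```

The key point is the last pair of numbers: the probability of winning grows roughly linearly with tickets bought, but the expected return stays pinned at the same (losing) fraction of the stake, so the bet never becomes favourable.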
Reference class of the unclassreferenceable

I think we're saying the same thing - the singularity has happened inside the box, but not outside. It's not as if staring at stuff we can't understand for centuries is at all new in our history; it's more like business as usual...

Reference class of the unclassreferenceable

But that's not an actual singularity, since by definition it involves change happening faster than humans can comprehend. It's more of a contained singularity, with the AI playing genie, doling out advances and advice at a rate we can handle.

That raises the idea of a singularity that happens so fast that it "evaporates" like a tiny black hole would; maybe every time a motherboard shorts out it's because the PC has attained sentience and transcended within nanoseconds.

DanArmak (12y, 0): A Singularity doesn't necessarily mean change too fast for us to comprehend. It just means change we can't comprehend, period - not even if it's local and we sit and stare at it from the outside for 100 years. That would still be a Singularity.
Rationality Quotes: October 2009

Incidentally, the Spanish Inquisition did not believe in witches either, dismissing the whole thing as "female humours".

Non-Malthusian Scenarios

The fact is that we, as large complex mammals, are already locked into a low rate of reproduction. Sure, given the right evolutionary pressures we could end up like shrews again, but that would take an asteroid strike or nuclear war; the scenario you're thinking of assumes long-term evolution within a very long-lasting stable society essentially like ours. In those circumstances genes for successful reproduction will spread through the population, but that's largely meaningless - if I have the gene for super attractiveness and manage to have 100 kids with 100 wom...

wedrifid (12y, 0): Apart from the aforementioned confusion regarding investment in male vs female children, this isn't a new balance either.
DanArmak (12y, +1): If a woman maximizes reproduction (and so fitness) by having more sons than daughters, then why doesn't the population tilt towards a male:female ratio > 1?
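DanArmak's question is the one Fisher's principle answers: because every child has exactly one father and one mother, the rarer sex has higher average reproductive success, so any skew is self-correcting. A minimal sketch of the accounting (the population sizes below are made-up numbers for illustration):

```python
def expected_grandchildren(sons, daughters, males, females, next_gen):
    """Expected grandchildren for a parent, given the sex ratio of
    the children's generation. Every child in the following generation
    has exactly one father (among `males`) and one mother (among
    `females`), so per-capita success is inversely proportional to
    how common each sex is."""
    per_male = next_gen / males      # average offspring per male
    per_female = next_gen / females  # average offspring per female
    return sons * per_male + daughters * per_female

# Suppose the children's generation were skewed to 600 males : 400
# females, and that generation produces 1000 children in total.
two_sons = expected_grandchildren(2, 0, 600, 400, 1000)       # ~3.33
two_daughters = expected_grandchildren(0, 2, 600, 400, 1000)  # 5.0
print(two_sons, two_daughters)
```

Daughters out-reproduce sons whenever males are in excess (and vice versa), so selection pushes the population back towards a 1:1 ratio, which is why the tilt the question asks about doesn't persist.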
Non-Malthusian Scenarios

But the new universes also have their own population, though I guess you could colonize universes where humans don't arise, rather than universes identical to this one except that I didn't scratch my nose just now.

gwern (12y, 0): Or just enter the universe early. Suppose our universe were created that way, and for some reason the creators just had to have a metallic terrestrial planet. Even waiting for all the necessary supernovae, they would still have a good billion or two years to exploit the Earth before we arose. And given the Great Silence, it might just be that exploiters don't need to worry about competition in the new universe.
Non-Malthusian Scenarios

How about simple spontaneous population stability? I live in a country with a negative birth rate, but the population is increasing nevertheless due to immigration. This state of affairs hasn't been legislated into existence; it just happened, and may be a natural behaviour of large human populations. Perhaps once the whole world reaches Western standards of living it will stop growing exponentially, with pockets of negative growth compensated by low but positive growth elsewhere... in the long term the trend could even be towards a decreasing population...

Neil (12y, +3): In the long term (and I mean the very long term) people will evolve to get around the obstacles that stop them producing the children they could. If contraception decouples sex from reproduction, people will evolve to be less interested in sex and more directly interested in babies. If entertainment proves more compelling than having kids, people will evolve to be less entertainable. If being a responsible, well-adjusted person is limiting family size, people will evolve to be irresponsible, poorly adjusted people.
James_K (12y, 0): This occurred to me too, but on second reflection it seems like a special case of the "Selfish Memes" scenario, the meme being "the costs of (marginal) children are not worth the benefits". For what it's worth, I also think wealth-induced population stability is a real possibility.
The Finale of the Ultimate Meta Mega Crossover

I guess I'll be back once I've read Permutation City, then...

Torture vs. Dust Specks

Although if we factor in consequences, say being distracted by a dust speck in the eye while driving or doing some other critical activity, then statistically those trillions of dust specks have the potential to cause untold amounts of damage and suffering.

Torture vs. Dust Specks

There is a false choice being offered, because every person in every lifetime is going to experience getting something in their eye. I get a bug flying into my eye on a regular basis whenever I go running (3 of them the last time!), and it'll probably have happened thousands of times to me by the end of my life. It's pretty much a certainty of human experience (although I suppose it's statistically possible for some people to go through life without ever getting anything in their eyes).

Is the choice being offered to make all humanity's eyes immune for all eternity to small inconveniences such as bugs, dust or eyelashes? Otherwise we really aren't being offered anything at all.

Why You're Stuck in a Narrative

But remember, as Alan Moore said, "The one place where gods exist unquestionably is the human mind". Similarly, narratives are a fact not just of your brain but of everyone's around you; realizing this can be convenient.