The following should sound familiar:

A thoughtful and observant young protagonist dedicates their life to fighting a great world-threatening evil unrecognized by almost all of their short-sighted elders (except perhaps for one encouraging mentor), gathering a rag-tag band of colorful misfits along the way and forging them into a team by accepting their idiosyncrasies and making the most of their unique abilities, winning over previously neutral allies, ignoring those who just don't get it, obtaining or creating artifacts of great power, growing and changing along the way to become more powerful, fulfilling the potential seen by their mentors/supporters/early adopters, while becoming more human (greater empathy, connection, humility) as they collect resources to prepare for their climactic battle against the inhuman enemy.

Hmm, sounds a bit like SIAI!  (And while I'm throwing stones, let me make it clear that I live in a glass house, since the same story could just as easily be adapted to TSI, my organization, as well as many others.)

This story is related to Robin's Abstract/Distant Future Bias:

Regarding distant futures, however, we’ll be too confident, focus too much on unlikely global events, rely too much on trends, theories, and loose abstractions, while neglecting details and variation.  We’ll assume the main events take place far away (e.g., space), and uniformly across large regions.  We’ll focus on untrustworthy consistently-behaving globally-organized social-others.  And we’ll neglect feasibility, taking chances to achieve core grand symbolic values, rather than ordinary muddled values.

More bluntly, we seem primed to confidently see history as an inevitable march toward a theory-predicted global conflict with an alien united them determined to oppose our core symbolic values, making infeasible overly-risky overconfident plans to oppose them.  We seem primed to neglect the value and prospect of trillions of quirky future creatures not fundamentally that different from us, focused on their simple day-to-day pleasures, mostly getting along peacefully in vastly-varied uncoordinated and hard-to-predict local cultures and life-styles. 

Living a story is potentially risky.  Tyler Cowen, for example, warns us to be cautious of stories: there are far fewer stories than there are real scenarios, so stories must oversimplify.  Our view of the future may be colored by a "fiction bias", which leads us to expect outcomes like those we see in movies (climactic battles, generally interesting events following a single plotline).  Thus stories threaten both epistemic rationality (we assume the real world is more like stories than it is) and instrumental rationality (we assume the best actions to effect real-world change are those which story heroes take).

Yet we'll tend to live stories anyway because it is fun - it inspires supporters, allies, and protagonists.  The marketing for "we are an alliance to fight a great unrecognized evil" can be quite emotionally evocative - including in our own self-narrative, which means we'll be tempted to buy into a story whether or not it is correct.  So while living a fun story is a utility benefit, it also means that story causes are likely to be over-represented among all causes, as they are memetically attractive.  This is especially true for the story that there is a risk of great, world-threatening evil, since those who believe it are inclined to shout it from the rooftops, while those who don't believe it get on with their lives.  (There are, of course, biases in the other direction as well).

Which is not to say that all aspects of the story are wrong - advancing an original idea to greater prominence (scaling) will naturally lead to some of these tropes - most people disbelieving, a few allies, winning more people over time, eventual recognition as a visionary.  And Michael Vassar suggests that some of the tropes arise as a result of "trying to rise in station beyond the level that their society channels them towards".  For these aspects, the tropes may contain evolved wisdom about how our ancestors negotiated similar situations.

And whether or not a potential protagonist believes in this wisdom, the fact that others do will surely affect marketing decisions.  If Harry wishes to not be seen as Dark, he must care what others see as the signs of a Dark Wizard, whether or not he agrees with them.  If potential collaborators have internalized these stories, skillful protagonists will invoke them in recruiting, converting, and team-building.  Yet the space of story actions is constrained, and the best strategy may sometimes lie far outside them.

Since this is not a story, we are left with no simple answer.  Many aspects of stories are false but resonate with us, and we must guard against them lest they contaminate our rationality.  Others contain wisdom about how those like us have navigated similar situations in the past - we must decide whether the similarities are real or superficial.  The most universal stories are likely to be the most effective in manipulating others, which any protagonist must do to amplify their own efforts in fighting for their cause.  Some of these universal stories are true and generally applicable, like scaling techniques, yet the set of common tropes seems far too detailed to reflect universal truths rather than the arbitrary biases of humanity and our evolutionary history.

May you live happily ever after (vanquishing your inhuman enemy with your team of true friends, bonded through a cause despite superficial dissimilarities).

The End.


So far as SIAI is concerned, I have to say that the storylike qualities of the situation have provided no bonus at all to our PR rolls, just a penalty we have to be careful to avoid because of all the people who perceptually recognize it as a mere story. In other words, we lose because of all the people trying to compensate for a bias that doesn't actually seem to be there. People who really are persuaded by stories, go off and become religious or something, find people with much better-refined attractive stories than us. Our own target audience, drawn from the remainder, tends to see any assertion classifiable as storylike as forbidden-in-reality because so many stupid people believe in them.

And of course much of your story simply and obviously isn't applicable. There will be no robot war, there don't seem to be any hostile human parties trying to bring about the apocalypse either, the question gets resolved in a research basement somewhere and then there's no final battle one way or another.

But then if you'd left out the part about the Robot War and the final battle, your opening paragraph wouldn't have sounded as clever. And this is also something that happens to us a LOT, which is that people arguing against us insist on mapping the situation very exactly onto a story outline, at the expense of accuracy, so that it looks stupider.

All the bias here seems to be in the overcompensation.

I can see how for your audience, the story-like qualities would be a minus. On the other hand, I think the story bias has to do with how people cognitively process information and arguments. If you can't tell your mission & strategy as a story, it's a lot harder to get across your ideas, whatever your audience.

The battle was meant to be metaphorical - the battle to ensure that AI is Friendly rather than Unfriendly. And I didn't say anything about hostile humans - the problem is indifferent humans not giving you resources.

Also, I'm not arguing against SIAI, I just find it amusing how well the futurist sector maps onto a story outline - various protagonists passionate about fighting some great evil that others don't see and trying to build alliances and grow resources before time runs out. You can squiggle, but that's who you are. Instrumental rationality means figuring out how to make best positive use of it and avoid it biasing you.

The battle was meant to be metaphorical - the battle to ensure that AI is Friendly rather than Unfriendly.

By this standard, just about anything worth taking as one's life work will involve a metaphorical battle.

Actually, I will comment (for the purposes of authenticity and from the belief that being more transparent about my motivations will increase mutual truth-finding) that while I'm not arguing "against" SIAI, this post is to some degree emerging from me exploring the question of SIAI's organizational instrumental rationality. I have the impression from a variety of angles/sources that it's pretty bad. Since I care about SIAI's success, it's one of the things I think about in the background - why, and how you could be more effective.

When discussing SIAI's instrumental rationality, it's important to remember what its actual goals are. Speaking of story-bias, it's all too easy to pattern-match to "organization promoting some cause they think is important", in which case one easily concludes that SIAI has been a miserable failure because FAI hasn't become a trendy academic research discipline, and Vice Presidents aren't making films about paperclip maximizers.

However, the picture changes somewhat if instead you think in terms of the following (more accurate) caricature of SIAI's actual objectives:

(1) To persuade a dozen or so Putnam Fellows to collaborate with Eliezer on FAI instead of pursuing brilliant careers in academic mathematics;

(2) To dissuade people like Ben Goertzel from trying to build AGI without solving the FAI problem first.

If you look at it like this (still admittedly oversimplified), then yes, SIAI still has a way to go in achieving its goals, but they don't seem to be quite as hopelessly underequipped for the task as one might have thought.

(Disclaimer: I certainly don't speak for SIAI; my association with the organization is that of a former visitor, i.e. about as loose as it's possible to get while still having to answer "yes" to the question "Are you, or have you ever been, affiliated with the Singularity Institute for Artificial Intelligence?" if ever called to testify before Congress....)

this post is to some degree emerging from me exploring the question of SIAI's organizational instrumental rationality. I have the impression from a variety of angles/sources that it's pretty bad. Since I care about SIAI's success, it's one of the things I think about in the background - why, and how you could be more effective.

I've had similar thoughts. I would be interested in hearing what room for improvement you see in SIAI's organizational instrumental rationality. I have my own thoughts on this (which have evolved somewhat as I've learned more since making my posts about SIAI back in August). Feel free to PM me if you'd prefer to communicate privately.

It's actually a disaster story, not a battle story, which is something I'm surprised Patrissimo missed in the opening paragraph.*

*(Possibly because disaster movie protagonists tend to be older, so it wouldn't fit that bit quite as well)

Your "enemies" are those too shortsighted to realise the true dangers, and your aim is to reveal the dangers to them, and save the world.

But as others have stated, with sufficient attention any story can be mapped to a combination of tropes (inverted, subverted and played straight).

Possible dangers of thinking you're living in a story:

  • Believing that if you're right and/or virtuous, you're going to win
  • Underestimating the power and usefulness of large organizations
  • Believing that what you're doing is the main story

Of course, this is assuming that the story you think you're in is a certain kind of popular fiction.

If you thought that you were living in a realistic/naturalistic story, you'd be underestimating your chance of making a significant difference. I have no idea what thinking you were living in a tragedy would do to your presuppositions.

After Douglas Adams' death, I read a fair number of tributes which said his books had a large emotional effect-- people absorbed the attitude of expecting things to be absurd. I don't know what effect that's had on their lives.

More:

  • Overestimating the agency of everyone and everything
  • Role-playing instead of trying to achieve goals
  • Expecting too clean a distinction between protagonists and antagonists
  • Underestimating the number, and overestimating the cohesiveness, of protagonists
  • Overly anticipating unlikely but dramatic events

(Most of these are more accurately described as "errors almost everyone makes all the time" than "dangers of thinking you're in a story", but thinking of them that way seems pretty useful for identifying them.)

Edit: After reading this comment, I'll amend that to "Assuming villainy is the usual explanation for apparent bad behaviour."

This is so common as to be an adage: "Never attribute to malice that which is adequately explained by stupidity." (http://en.wikipedia.org/wiki/Hanlon's_razor)

horrifyingly deep and complex behaviour generated by perfectly normal blithering stupidity

So what is villainy, if it's not that?

They mean well, rather than being out to deliberately fuck you up.

If you thought that you were living in a realistic/naturalistic story, you'd be underestimating your chance of making a significant difference. I have no idea what thinking you were living in a tragedy would do to your presuppositions.

I don't know if you have peasant ancestry, especially peasants living in a non- or partially-democratic society; I do, and when I talked to my grandparents and granduncles, as a young boy raised on hero stories and supported by his family, I was struck by how tremendously fatalistic their outlook on life was. While they worked hard and smart, and aimed at steadily improving their lot in life (and, eventually, succeeded - my father was able to get an engineering degree), the idea that they were and would always be part of the background scenery of history was very fixed in their mind; no matter how proud they were of their debt-free house and comfortable retirement money, they never even considered that it might be possible for them to be concerned about matters beyond their family's immediate needs - that was "gentlemen's stuff", while they were but "poor people".

To cut it short, my point is that that type of literature is called "Realism" or "Naturalism" for a reason - because to a significant degree, and especially when compared with Romantic literature, that was actually how nineteenth-century peasants thought and lived (and not just peasants, I believe - Joseph Roth's Radetzkymarsch deals with the middle class but it's one of the most depressing novels I've ever read). The answer to your question is in those books themselves.

That's interesting. My background is various Russian and eastern European Jewish, and the default assumption seems to be that you can build a decent life, but playing on the larger stage just isn't thought of. It's not even viewed as "gentlemen's stuff", it's just a blank spot.

If you don't mind, I'll post your comment above to my livejournal-- at this point, I'd like to get more people's take on their family culture and ambition.

Sure, no problem.

Anything can be mapped to tropes, about the same way anything can be mapped to one or another prophecy of Nostradamus if you try hard enough. Aside from that it is possible to map the Singularity to tropes, what evidence is there that this is actually happening?

Five months ago, someone asked the question "What kind of evidence would convince you that you were in a story?". Will Newsome answered:

If something totally crazy seemed like it was about to happen and the world was at stake, like a technological singularity was about to occur or something, and I was called to work for the team of great minds that were trying their hardest to stop the destruction of the entire universe, dropping out of high school in the process, and meeting a beautiful girl who had been living literally a few houses down from me for the last 4 years without my knowing about it, who just so happened to be doing an essay on transhumanism for her English class and would just love to interview someone who was doing work for the Singularity Institute.

Oh wait...

Anything can be mapped to tropes, but not all tropes are the same. It matters what tropes your life, mission, or organization is mapped to! To skillfully navigate the world (I guess the LW term is "to win") you must know what tropes are being mapped onto you, and what tropes your brain sees your identity as fitting into. That way you can manipulate others' perception of you (what stories are they telling about you? How are they telling those stories? Do they gain you status and resources?), as well as make sure you aren't fooling yourself.

While I was deeply amused the first time you wrote that, I'm not sure it is evidence of everything being mappable to tropes so much as evidence of how much common tropes are informed and shaped by actual history.

My actual experience is that people who partly think they live in a story are more psychologically stable than the other classes of people who attempt difficult things autonomously. My biggest concern about them relates to my skepticism that such people can cooperate effectively for an extended period with other people similar to themselves. I would want to work for such a person because they would work hard, but I'd worry about working for a partnership of two such people.

Given the breadth of TVTropes, you could make a description like that about almost any group or any idea, independent of the truth of the idea. The love for narrative cuts both ways: it might make people think they're living in a story, but story-like elements are also what we are most prone to pick out of other people's lives.

Nobody talks about what Eliezer had for breakfast because that's not part of "the story we tell". But telling a story is just another way of saying "picking out relevant details".

Can we please get back to substantive arguments against the scary idea?

First, I'm not claiming a connection between truth and tropism, but this idea that everything is equally tropish seems wrong. Not everyone has the role of a protagonist fighting for humanity against a great inhuman evil that only they foresee, and struggling to gather allies and resources before time runs out. Yet Eliezer has that role.

Second, even though tropes apply to everyone's lives to some degree, it matters which tropes they are. For example, someone who sees themselves as a fundamentally misunderstood genius who deserves much more than society has given them is also living a trope - but it's a very different trope with very different results. Identifying the tropes you are living is useful - it helps in your personal branding, can teach you lessons about strategies for achieving your goal, and may show you pitfalls.

For example, I live a very similar trope set to Eliezer, which is why I notice it, and it poses many challenges in being effective, because it's tempting to (as Nick alluded to above) play the role rather than doing the work.

Not everyone has the role of a protagonist fighting for humanity against a great inhuman evil that only they foresee, and struggling to gather allies and resources before time runs out. Yet Eliezer has that role.

No. The UFAI is nonexistent, and therefore noncombatant. I'm not sure Eliezer has even tried to make the case that UFAI is the most likely existential risk. Lots of people see serious huge risks in our future. To say nothing of the near-constant state of death. EY certainly wasn't first with the concept of world-killing UFAI in general, arguably he's late to the game.

I can't think of a story in which the protagonist spends lots of time trying to do things the majority doesn't want to try or doesn't think are hard, but it sounds like a comedy.

I'm sure I'm not the only one tempted here to make up some top-level post about how Eliezer chooses what he had for breakfast in a completely rational manner, and carefully avoids biases such as how his mother raised him on boiled eggs or how delicious chocolate chips taste, but thinks things out well enough that he does not have to jump through the same logical hoops every single morning...

I'm sure I'm not the only one tempted here to make up some top-level post about how Eliezer chooses what he had for breakfast in a completely rational manner

It sounds silly, but that's actually a nontrivial and important decision, where rationality has high returns, and where the default is to get it wrong. Everyone should consider the what-to-have-for-breakfast question explicitly, and rationally.

The more that I think about it, the more that I like the idea. It could actually be a relatively amusing cliff notes on various facets of rationality, a kind of who's who of the memes around here (e.g. "Luminosity Bella consults her notes on how she liked what she had last week, seeing what kind of impact her introduction of cougar blood is having").

Or it could just be like one of those angry essays Eliezer does sometimes, with such lines as "and they just THROW THE BUTTER ONTO THEIR POTATOES like it's a friend of the family", in which case healthy eating would be a metaphor for cryonics.

Or it could just be like one of those angry essays Eliezer does sometimes, with such lines as "and they just THROW THE BUTTER ONTO THEIR POTATOES like it's a friend of the family", in which case healthy eating would be a metaphor for cryonics.

I find almost everything about this sentence baffling.

Several of Eliezer's essays have a somewhat angry tone to them, particularly those arguing that humans need to develop means by which we no longer have to die, expressing a frustration with the typical human's resignation to death. Similarly, they are resigned to eating butter because it's a normal kind of thing and their parents did it and just a little can't be that bad for you.

Butter's a friend of my family.

This would probably be exactly the kind of thing that would be being opposed -- i.e. that we have deep sentimental attachments to things like butter, partly due to their nature as superstimuli (although as superstimuli go, butter is pretty mild). And even if butter feels like a friend of the family, a sentimental attachment is probably not paying a ton of rent.

Eh, a sentimental attachment doesn't necessarily have to be irrational or untested. My attachment to butter thoroughly pays rent every time it correctly predicts that butter is delicious.

You'd better just stay away from the bacon subreddit.

Do you have any particular tips? I'm not advanced in the field of nutrition, but this seems like as good a place as any to start.

Not all carbs for breakfast; some or all of it should be fat and protein. Pay attention to when you get hungry and adjust the serving size to ensure it's not before you plan to eat next. Have a portable, nonperishable backup in case you run out of an ingredient or are in a hurry, since skipping a breakfast entirely is much worse than eating the wrong thing.

Try some different breakfast foods and pay attention to (or better, take notes on) how you feel later in the day after having each. (Individual biochemistries vary). Pay special attention to headaches and mind fog, no matter how minor; these are almost always caused by diet somehow, though there are many mechanisms and finding the specific causes may require some experimentation.

Avoid sugar water, even if it's fruit-themed or fruit-derived. Take a standard multivitamin/multimineral (very important, but it doesn't matter which meal it's with); pay no attention to whether other foods contain vitamins too. Take vitamin D unless you live close to the equator, spend lots of time outside, or confirm with a blood test that you don't need it.

If you use caffeine, have a policy for when and how much, and be disciplined about it. Don't get your caffeine and hydration from the same source (eg, only drinking caffeinated soda); that'll make your intake unpredictable, which is bad. If you don't know what the early stages of caffeine withdrawal feel like, induce it under controlled circumstances and pay attention, so you'll know if you've messed up that way.

pay no attention to whether other foods contain vitamins too.

Do, however, pay attention to whether other foods contain micro-nutrients not rated as vitamins. Unless, of course, your multivitamin includes them. Obviously this is less, um, vital.

I just assumed Eliezer Yudkowsky eats dementors for breakfast.


And that's a fact!

It would be nice to have stories about heroes who are aware of and resist the temptation to make their lives into stories. Yes there are a few superficial moves in this direction, but I'd love to see something more systematic.

Have you read Terry Pratchett's Witches Abroad?

And, of course, Granny Weatherwax.

That's basically the Reluctant Hero archetype. Of course, from the writer's viewpoint, the outstanding feature of that kind of hero is that you have to make sure his attempt to keep a non-story life fails or, well, you won't have a good story.

It would be nice to have stories about heroes who are aware of and resist the temptation to make their lives into stories.

Somehow I don't think those would make very good stories.

If the heroes were successful at preventing their lives from becoming stories, then certainly, their lives would not make good stories (or any other sort). I'd be rather amused to see a case where one failed.

There's this book called The Hobbit.

;-)

That reminds me of the movie "Stranger Than Fiction".

Mieville's Un Lun Dun goes somewhat in that direction.

Znva punenpgre tbrf qverpgyl gb fbyivat ceboyrz engure guna pbyyrpgvat cybg pbhcbaf. Nyfb, cebcurpvrf ghea bhg abg gb jbex.

I'm not a protagonist. I'm a minor side character.

I'd prefer to be the comic relief, but I worry I'm a redshirt!

So part of winning is being able to deal with human susceptibility to think in stories.

Stories are powerful memes. But they're always open to subversion. What would the story look like if it were rewritten ten times the size by Alan Moore, for example? I mention Alan Moore here because of his method of taking the simple story and working out the full implications - he works primarily in stories about stories. Imagine the story of SIAI written by Alan Moore. What will he do with it? What should you do with your perceptions of the story?

(I've been reading a lot of Alan Moore this weekend, alternating between the entire runs of Promethea, Supreme and Albion and LessWrong and RationalWiki. The Alan Moore thought experiment is jolly good fun to apply to the world of people.)

"So part of winning is being able to deal with human susceptibility to think in stories."

Exactly! It is especially relevant if you are trying to grow a following around an idea, which SIAI is. Winning requires wearing your Slytherin hat sometimes, and an effective Slytherin will manipulate the stories that they tell and the stories that are told about them.

I read an interesting self-help book recently where the main idea is that you can tell better stories ("hero stories" instead of "victim stories") about yourself, others, and your situation. Be the Hero by Noah Blumenthal.

SIAI's story by Frank Miller would be less insightful, but probably much more colorful.

Actually, I suspect SIAI by Frank Miller would be almost colorless, with the occasional shocking splash of red etc. ;)

SIAI by Frank Miller would be in three parts, the first two in black and white, and then (after the dystopian Singularity) it would be in full colour.

I have attempted a relationship with someone who I would characterise as having a script in their head which they expected me to follow. When I failed to play my expected character role, they would explode in anger. Life was not to be sensible or reasonable, but to meet stupid expectations that might make a story with them at the centre. The person thought of themself as a good person and tried to be a good person, but their modelling of other people was relentlessly disastrous and they pretty clearly had something along the lines of histrionic personality disorder or some related dramatic cluster personality disorder - though I must of course stress that this was undiagnosed and that I am not a psychiatrist, psychologist or even counselor. ALL DRAMA! ALL THE TIME! There's a reason I quite enjoy the dullness of quiet domesticity.


If nothing else this post will be useful to link to whenever this subject is brought up. We can link here, link to the useful responses, and avoid wasting our time engaging with repetitious, deep-sounding yet superficial posturing.