Epistemic status: highly confident in the basic distinction, though it's not at all profound. Further details, models, and advice are somewhat speculative despite being drawn from varying amounts of observation. Essay a little rushed, since otherwise I’d be unlikely to publish at all.

When setting out on a venture, one faces some choices.

  • Optimize reality: How much do you optimize for the success of your stated goal? I define a goal to be some desired state of reality, such that success or failure is assessed with respect to reality.
  • Optimize stories: How much do you optimize for the appearance of success at your stated goal, i.e. stories? I define a story as a collection of facts about reality asserted to be true and relevant, usually presented to others but sometimes only to oneself.

Ideally, optimizing for one would be the same as optimizing for the other. Very often, however, optimizing for one is not the same as optimizing for the other. What’s worse, the two ends compete for the same set of limited resources.

A concrete example might be someone who sets out to develop and sell a medicinal tea which relieves hangover symptoms. Overall success is selling your tea and making money. Investing in optimizing reality would mean investing in experiments to develop and improve the tea: introducing variants, running randomized controlled trials, etc., etc. Optimizing the story means having a really good explanation for why your tea is able to do what you claim it does, plus having a great website with good copy, publishing testimonials, publishing your methodology and experimental results, etc.

Success can be attained by both, in combination, but also possibly each on its own. If you have a tea which works well, word will spread and you’ll end up with many customers seeking your genuinely palliative tea. Alternatively, if your marketing materials are persuasive, you might accrue many customers too, even if your tea is no better than placebo. Of course, truly effective tea plus a well-conveyed story about its great properties will generate more sales than effective tea or a good story alone.

Optimizing Reality

The following points are worth noting:

  • When optimizing reality, success depends on reality at large being a certain way.
  • There is only one reality when it comes to what you’re optimizing for.
  • You can’t fool reality. It is what it is no matter what you say or do.
  • Optimizing for reality is easiest to do when outcomes are easy to measure and feedback is quick.
  • Engineers dealing with concrete, measurable phenomena are likely to be optimizing reality.
  • Genuinely creating value, though often expensive, is a good strategy for capturing value to yourself. Likely it’s the best long-term strategy in many domains.
    • I would claim that every major successful company is producing clear value to someone. Facebook, Google, AirBnB, ExxonMobil, United Airlines, Starbucks. At the end of the day, there’s something real there.

Optimizing Stories

  • For any goal, there might be many possible stories which will lead to success.
  • Stories don’t have to be true or accurate to be effective.
  • It can be rational, or even outright necessary for one’s success, to optimize for stories without regard to reality.
    • If your success ultimately depends on someone else being convinced, then reality at large doesn’t matter.
  • Salespeople, marketers, politicians, startup founders and generally those whose success depends on persuading others will spend much of their effort optimizing stories.
  • Very often your “customer” only cares that you have a “good story” and not at all that your story matches reality.
    • Consider a reseller of your medicinal tea. If they’re unscrupulous, it doesn’t matter to them whether or not the tea works. It only matters to them that your story is good enough to persuade end-consumers.
    • Consider a hospital IT employee whose job is to purchase software which helps doctors. The supposed goal of improved doctor efficiency is not measured in any way. The hospital IT employee’s success will be judged by how impressive the software they select seems, i.e. how good the story it comes with is, not by any actual reality. In other words, the IT employee, as a customer, is incentivized to shop for good stories and only good stories.
      • But you needn't assume people only purchase stories for the sake of others! Many people, having a loose relationship with reality, are happy to purchase a story which makes them feel good. "Oh, those crystals from the mystery mountains have the right frequencies to bring me good fortune? Yes please!"
  • In cases where those hearing the stories lack the expertise to assess reality, e.g. non-experts trying to assess an expert, "dumbed-down" stories comprehensible to the non-experts will completely outweigh reality itself. Cf. Overconfident talking down, humble or hostile talking up.
  • If it’s only stories which matter, yet you split your efforts between stories and reality, then you will likely be outcompeted by someone who spent all of their resources on crafting good stories. Cf. Moloch.
  • Even those who care a lot about reality itself can slide into a focus on stories if empirical feedback is absent or very slow.
  • Notwithstanding all of the above, stories will sometimes reach the end of their tether and the lack of a good reality to support your stories will catch up with you.
  • We don’t just tell stories to others, we also tell them to ourselves.
    • The danger of being too much in the habit of telling stories is that we don’t merely risk fooling others, but also ourselves.

“Story Economies”

I conjecture that our modern world gives rise to “economies” of stories, whereby people buy and sell stories, often without regard for reality.

Another example: imagine an analyst working at a startup crafts a report which highlights all the ways in which the company is rapidly improving. The analyst's manager isn’t too worried about whether the report is a bit biased towards the positive - they know the CEO will be pleased. The CEO doesn’t mind if the report is a bit biased towards the positive - they know the board will be pleased. The board doesn’t mind if the report is a bit biased - they know that the next round of investors won’t really be able to tell the difference, it will just make the company look good.

Here you have a whole chain of people who only care about the story. At the very end there’s someone who cares about the reality, but they’re very often not in a great position to evaluate it themselves. They probably don’t even know the right questions to ask. All they’ve got is the story which has been placed before them.

Some people gravitate more towards stories than others, e.g. salespeople and politicians. Some of them might readily admit that they chiefly deal in stories somewhat tenuously linked to reality, yet I wager that many, if not most, won’t. The most persuasive stories are those you devoutly believe yourself. Hence the vast overconfidence of startup founders. And, in the immortal words of George Costanza: it's not a lie... if you believe it.

Stories about yourself ...to yourself and others

If there’s one domain where we’re endlessly crafting and broadcasting stories, it’s the stories we tell about ourselves. I’m this kind of person. I might decide that the story I want to tell is that I’m a "science nerd". So I read science books and science magazines. I have my answer ready when people ask me what I do for fun and I know exactly what to post on social media. My Instagram is full of homemade volcanoes and photos from my personal backyard telescope.

The above fictional example might have the redeeming feature that at least this fictional person is creating a genuine reality to match the story. They are learning a tonne of science genre facts. Still, I wager there’s a tradeoff. Doing science-y things which are easily communicable and demonstrable introduces a constraint. Possibly leveling up as a scientist would mean reading textbooks with facts that are incomprehensible and boring to those not at that level. By trying to have the best story to tell, they’ve handicapped their own excellence. (However, if the story is primarily for oneself, this constraint is avoided. “I know just how science-y I am!”)

If you want to know, I tell myself the story that I’m a person who’s afraid of losing myself to trying to craft myself into someone optimized for impressing others. Though I do it. I’m doing it right now. There may be no escape.

Cf. Elephant in the Brain.

You can’t escape stories

At this point you might be thinking, “gee, stories are awfully deceitful and non-cooperative, I want to be cooperative and honest and I’m just going to provide direct facts!” and “I really, really don’t want to deceive myself with stories!”

I don’t think you can escape stories entirely. I would claim that as soon as you summarize your facts or data, the mere selection of which facts to present or summarize is the crafting of a story. Even dumping all your data and every observation is likely to be biased by which data you collected and what you paid attention to, i.e. what you thought were the relevant things to report to another person.

That said, I think there’s storytelling which makes an honest effort to share reality as it is, so that someone else can make an informed judgment. It’s challenging if one’s success is threatened by less scrupulous competitors, but it’s possible to choose domains where measurable feedback favors those who’ve optimized actual reality.

You’re not always doing others a favor if you try to give them raw facts with no biased conclusions. The world is large and messy and confusing, such that people usually like to be handed a story about who you are and how you will behave. They want you to be a nerdy, bookish type, or an outdoorsy type, or a foodie. If you give them a story and promise to act in accordance with it, that makes things simple: it’s clear what to talk to you about, what to get you for your birthday, etc., etc.

At least for those spending much time out in mainstream culture, it helps to have one or two stories prepared about yourself. “Masks.” They function a bit like APIs, really. People often protest that they don’t like being put in boxes, but those boxes help you relate to people before you’ve spent the many, many hours to have absorbed the messy reality that any given human is.

What to do, what to do

Reality on the ground is complex, incentives are messy, things which work in the short run don’t necessarily work in the long run. I can't say “here’s my one simple recipe to determine the right allocation of resources between optimizing story vs optimizing reality.”

I proffer the obvious advice:

  • Notice the incentives for each domain you deal in - how much does your success depend on stories vs direct reality?
  • Having judged the incentives in the domains you deal in, assess how much you can trust the stories you are presented with.
  • Accept that the tradeoffs are hard. Personally, I wish I could deal only in truth and provide only open and transparent facts. Unfortunately, I might need either to compromise or to find myself severely disadvantaged in certain arenas, e.g. politics.
    • Moreover, despite my best intentions, self-interest bias means I’m likely to present things to others in ways which favor myself.
  • Accept that you probably need a story about yourself. If you like, keep the story separate from yourself so that you might let yourself be more. Cf. Keep your identity small.

Comments

This seems like a good explanation of the dynamic underlying “skin in the game” considerations. If you care about literally achieving the stated goal, you should strongly prefer stories from contexts where a story’s prominence has more to do with its entanglement with reality than with marketability.

What is a story?

It seems like it's a sort of compression optimized for human brains. Some elements of a story:

  • Components tend to be agents with intelligible social motives, or stereotyped roles; we have a bunch of specialized capacity for social modeling, which means that we can store information more efficiently if it fits that paradigm.
  • Components are "causally" linked. Because of causal linkage, stories tend to unfold unidirectionally over time. Stories are not lists of things that exist simultaneously, though they do require object permanence to understand; elements in a story tend to get reused. Once you understand a story, you can infer fuzzy or forgotten parts from the parts you know. Chekhov's Gun inference works in both directions.

It's all stories. There probably _is_ an underlying physical reality, but no humans experience it directly enough to have goals about it. I don't think your dichotomy is about reality vs stories. From your examples and descriptions, it seems to be about long-term vs short-term stories, or perhaps deep vs shallow stories.

The manager/CEO/board/investor acceptance of stories only lasts for a few years. Eventually customers won't agree, and it collapses anyway. Conversely, there are plenty of examples of objectively worse products that did better in the marketplace, because the story is the only thing that matters.

Paper currency is a good example of a story with the weight of reality for a good chunk of humanity.

I don’t think you can escape stories entirely. I would claim that as soon as you summarize your facts or data, the mere selection of which facts to present or summarize is the crafting of a story. Even dumping all your data and every observation is likely to be biased by which data you collected and what you paid attention to, i.e. what you thought were the relevant things to report to another person.

I think we can say something stronger than this: we can't escape stories at all, because stories seem to be another way of talking about ontology (maps), and we literally can't talk about anything without framing it within some ontology.

It's tempting to want to claim direct knowledge of things, even if you are an empiricist, because it would provide grounding of your observations in facts, but the reality seems to be that everything is mediated by sensory experience at the least (not to mention other ways in which experience is mediated in things as complex as humans), so we are always stuck with at least the stories that our sensory organs enable (for example, your experience of pressure waves in the air as sound). I'd say it goes even deeper than that, being a fundamental consequence of information transfer via the intentional relationship between subject and object, but we probably don't need to move beyond a pragmatic level in the current discussion.

This is also why I worry, in the context of AI alignment, that Goodharting cannot be eliminated (though maybe we can mitigate it enough to not matter): representationalism (indirect realism) creates the seed of all misalignment between reality and the measurement of it, so we will always be in active effort to work against a gradient that seeks to pull us down towards divergence.

I probably didn't emphasize this enough in the main post, but the idea I'm really going for is that there is a difference in optimizing for stories vs. optimizing for reality. There's a difference in goal and intention. Even if it's the case that humans never see "rock-bottom reality" itself and everything is mediated through experience, there is still a big difference between a) someone attempting to change an aspect of the underlying reality such that actually different things happen in the world, and b) someone attempting to change the judgments of another person by inputting the right series of bits into them.

Optimizing stories is really about a mono-focus on optimizing the specific corners of reality which exist inside human heads.

Of course, truly effective tea plus a well-conveyed story about its great properties will generate more sales than effective tea or a good story alone.

Sometimes not, I think. It's almost like a measure of the efficiency / effectiveness of the given market. If the market is really good at recognizing reality, then you don't need to tell a story. (Basic software libraries are like that: do they compute the right thing? If yes, then it's good.) If the market is not good at recognizing reality, then creating stories is often way cheaper than doing the real thing. (And also transfers better across domains.)

I don't like to work with software developers who believe that a software package is good just because it computes the right thing.

I care about many attributes of a package, from its documentation to its API and its likely future maintenance.

I think it might be interesting to discuss how story analysis differs from signalling analysis, since I expect most people on Less Wrong to be extremely familiar with the latter. One difference is that people are happy to be given a story about you even if it is imperfect, so that they can slot you into a box. Another is that signalling analysis focuses on whether something makes you look good or bad, while story analysis focuses on how engaging a narrative is. It also focuses more on how cultural tropes shape perspectives, e.g. the romanticisation of bank robbers.

I feel you ignore that stories are central for personal motivation. If you keep the story of why you are doing what you are doing in life separate from yourself you are likely going to suffer from a lot of akrasia.

I don’t understand how this post makes that mistake - what’s an example of something Ruby said that’s inconsistent with this?

Personal motivation isn't in the list of "optimizing for stories", and it assumes that you can optimize for success at your stated goal without storytelling.

The word akrasia doesn't even appear in the post, even though it's vitally important: many people in this community suffer from akrasia as a result of lacking a good story to tell themselves, one that involves them engaging in the behavior they consider desirable based on factual analysis.