A distinction that I get a lot of value out of is the difference between private beliefs and public beliefs.

Public vs. Private

A public belief is a proposition that someone thinks is true, and justifies on the basis of legible information and reasoning. If X is a public belief, then the implicit claim is "not only do I think that X is true, I think that any right thinking person who examines the evidence should come to conclude X."

If someone disagrees with you about a public belief, it is prosocial and epistemically virtuous to defend your claim and to debate the matter on a public forum. Public beliefs, if consensus is reached about them, can be added to the sum total of human knowledge for others to take for granted and then build on.

A private belief is a proposition that someone thinks is true based on their own private or illegible information and reasoning. In this case, the implicit claim is "given my own read of the evidence, I happen to think X. But I don't think that the arguments that I've offered are necessarily sufficient to convince a third party. I'm not claiming that you should believe this; I'm merely providing you the true information that I believe it."

If someone disagrees with you about a private belief, there might or might not be a fruitful discussion to be had about it, but it is also important to be able to "agree to disagree."

An example

I think that Circling meaningfully develops real skills of introspection, subtle interpersonal sensitivity, and clarity of map-territory distinctions. I further think that Circling is relevant to the Art of Rationality. 

There have been some write-ups that describe why I think this is the case, but I don't know that any of them are persuasive. If a person is curious about why I might be interested in Circling, I think this post is a decent overview. But crucially, I don't think that the evidence presented should be sufficient to convince a skeptic. 

I would say that I have a private belief that Circling is useful. It is actually my calibrated view, based on my personal experiences and my reasoning about those experiences. But by stating that belief, I am not at all making a social bid that others believe it too. 

A special case: cloaks

There's a special case of having a private belief that, at CFAR, we used to refer to as "having a cloak".

If you are pursuing some ambitious project or personal development goals, it can be damaging to tell people about, or to justify, your ambitions. Many ambitious goals are butterfly ideas that need to be handled gently. Your own sense of what is possible might be fragile, and you need to nurture it.

And, for many people, sharing their ambitions puts them in a mindset of asking themselves to justify whether they're cool enough to succeed, or of needing to fend off a kind of social pressure from casual pessimism. All of which is wasted motion.

So it's useful to have a "cloak": an understanding that your plans and hopes can be private beliefs that are no one's business but yours.

Sometimes that cloak can be keeping what you're working on a secret (as Paul Graham suggests in What You'll Wish You'd Known). 

Alternatively, it's useful to have a true(!), but incomplete, description of what you're aiming to do, that gives others a bucket for conceptualizing your actions, while you also have private, more ambitious plans. (Paul Graham also recommends this sort of cloak, in the "tactics" section of Frighteningly Ambitious Startup Ideas.)

A motivating example is Amazon.com. I bet that back in 1999, Jeff Bezos had at least a glimmer of the long-term future of Amazon. But if Jeff Bezos had outright declared, "Amazon's plan is to build an online bookstore, eventually conquer almost the whole online retail economy (which, by the way, is going to be a double-digit percentage of all retail by 2020), and become one of the top 10 most valuable companies in the world", he would have gotten incredulous reactions. Lots of people would have scoffed at Bezos's delusions of grandeur, and many would have mocked him outright. Even if this was the actual plan and the actual goal, declaring his ambitions for Amazon outright would not have helped them succeed.

None of those people needed to believe that that kind of growth was possible in order for Amazon to succeed. The only people who needed to believe it were Bezos and the core team at Amazon.

So instead, Amazon in 1999 had the cloak of just being an online bookstore, interesting but unobjectionable, while internally they were working towards something much bigger than that.

In general, it's helpful to be able to believe things about yourself, and your abilities, that you don't have to justify to anyone else.

Why does this matter?

I think failing to make a distinction between public and private beliefs can hamper both interpersonal communication and, more importantly, people's internal ability to think.

Personally, being able to say "this is a thing that I think is true, but I definitely don't think that I've made the case strongly enough here for you to be convinced" gives me space to express more of my ideas, without skirting close to conflict or affront.

Further, I think lots of folks implicitly feel like they "aren't allowed" to have an opinion about something unless it is a defensible public belief for which they are prepared to advocate in the public forum. Accordingly, they have a very high bar for letting themselves believe something, or at least for saying it out loud.

I suspect that this hobbles their thinking, in much the same way that knowing someone will read your diary entries causes your diary to be less reflective of your true thoughts. If you have a feeling that you have to justify all of your conclusions, there are lines of thought that you won't follow, because you can simulate your friends frowning at you for being a bad rationalist. 

Personally (a private belief!), I think that rigor is extremely valuable, but it is even more important to be honest with myself about what I actually think is true, separately from what I think is socially defensible.

Wait, isn't it bad to let people have beliefs that they don't need to defend?

I imagine that some readers might object to giving social permission for people to have beliefs that they don't need to justify. 

"Isn't part of what's wonderful about rationality that we try to be explicit enough that we can reason about anything? Isn't a major cause of the world's problems that beliefs are rarely held to any standard of evidence, and therefore people believe all kinds of random stuff? This post kind of sounds like you recommend that we stop holding people to standards of evidence."

The key thing for me is that private beliefs are your own personal model of the world, and you should never expect, or insist, that other people act on them. 

It is always out of bounds to expect or demand that other people adopt your beliefs without offering justification. 

If you are making claims that you want to influence other people's actions, it is incumbent upon you to justify them. 

Everyone has an inalienable right to hold the belief that, for instance, polyamory is bad for people's psychology, on whatever basis they find compelling, including intuitive or illegible reasoning. You are by all means allowed to decide whether or not to be polyamorous yourself, for those reasons. It is quite bad if a person feels pressured into being poly despite their intuitive sense that it's harmful for them or for others.

But, by my proposed social norms, if you want to go further and suggest that other people should be prevented from being polyamorous, or that it should be discouraged in your community, it is on you to justify that, to put forward a public position with reasons that can be critiqued and debated.

And of course, a person could always declare some of their private beliefs, not mainly in the hopes of convincing those who disagree, but rather to find and filter for the other people who share their view, so that they can together form spaces where that view can be assumed and built on. E.g. "I can't (yet) articulate with full rigor why I think that self-honesty is so crucial to the world-saving project, but if you also have that intuition, maybe we can work together on building and refining a culture that promotes self-honesty."

Comments

I feel like the terms for public/private beliefs are gonna crash with the fairly established terminology for independent impressions and all-things-considered beliefs (I've seen these referred to as "public" and "private" beliefs before, but I can't remember the source). The idea is that sometimes you want to report your independent impressions rather than your Aumann-updated model of the world, because if everyone does the latter it can lead to double-counting of evidence and information cascades.

Information cascades develop consistently in a laboratory situation in which other incentives to go along with the crowd are minimized. Some decision sequences result in reverse cascades, where initial misrepresentative signals start a chain of incorrect [but individually rational] decisions that is not broken by more representative signals received later. - (Anderson & Holt, 1998)

I don't want people to conflate the above socioepistemological ideas with the importantly different concepts in this post, so I prefer flagging my beliefs as "legible" or "illegible" to give a sense of how productive/educational I expect talking to me about them will be.

Bonus point: The failure mode of not admitting your own illegible/private beliefs can lead to myopic empiricism, whereby you stunt your epistemic growth by refusing to update on a large class of evidence. Severe cases often exhibit an unnatural tendency to consume academic papers over blog posts.

I guess now that I'm thinking in terms of "in what ways would these terms cause problems if adopted as jargon", there's a bit of collision with beliefs that are private/public specifically because they are based off public/private information. (This overlaps with beliefs being legible and persuasive, but it's not a perfect overlap)

My recommendation for a category that is missing:  public beliefs which are harmful to express. Suppose we specifically target this aspect of your public belief definition:

"not only do I think that X is true, I think that any right thinking person who examines the evidence should come to conclude X."

What if "right thinking person" is a fraction of a fraction of the population?  What do we do when the belief violates some "sacred value" held by the general populace?  In these cases, expressing even the most solidly backed belief publicly can have huge negative consequences.

Sure, it might be statistically better in the long run if these beliefs were expressed, but in the short term, you can lose your livelihood (or worse) for expressing them.

Came here to make approximately this comment.

"Not only do I think that X is true, I think that any right thinking person who examines the evidence should come to conclude X" is a superset that itself contains both public and private beliefs.

It's hard to do contemporary examples, for obvious reasons, but an easy past example might be something like "we should not treat every member of Group X as if they are in no meaningful ways different from the median member of Group X."

(against "isms")

I think there's an important distinction to be made between legible and illegible (but still probably at least somewhat valid) reasons for one's beliefs, but I think "public" and "private" is the wrong set of labels to use for that distinction, specifically because it will engender confusion about concrete-and-legible-but-not-speakable-aloud beliefs.

Or, to put it another way, I don't want people to think that any belief that is not spoken aloud is not spoken aloud because it is illegible.

Or to put it yet another way: sometimes, illegibility-in-practice is because I have not yet done the work to put my reasoning into clear terms.  But sometimes, illegibility-in-practice is the fault of the listener or the society, for being unable (sometimes violently) to handle what are, in fact, perfectly clear and reasonable arguments.

I think something bad will happen if we crystallize the distinction above under the terms "public" and "private."  I think that the distinction above is real and good and deserves handles, though.

My paraphrase: 

Taking the term "private" literally, there are at least two reasons why a belief might be private. 

The first is for the reasons that this post outlines: because one's reasoning is illegible, because one doesn't think that one has justified, or can justify, that reasoning by objective standards.

But another reason why a belief might be private is that, even though you think the arguments are solid and defensible, you expect that other people will have a strong negative reaction to your stating the belief, for non-epistemic reasons. There are always, in all times, things you can't say.

And it is pretty bad if it becomes a default assumption that, if a person won't share a belief, it's because their arguments are implicit or their reasons not fully justified. That's just not true.

...

Overall, that seems right to me, and I'm happy to find a different word for "private beliefs" that doesn't have that problem. Any suggestions?

Maybe "personal"?

Overall, I'm not trying to crystallize terms here so much as point at distinctions. Rather than say "this is a personal/private belief," I currently think people should just spell out "this is a thing that I believe, but I don't think that I've made the arguments well enough to think that you should buy it yet." But I agree that if this becomes commonplace, it will become jargon, and we want to pick the jargon well.


Your paraphrase passes my ITT.

I don't have any good suggestions for terms to replace the ones you used; I do reiterate that the distinction you're pointing out in the OP is a real and useful one.

I'm not claiming that you should believe this, I'm merely providing you the true information that I believe it.

Something feels off to me about this notion of "a belief about the object level that other people aren't expected to share" from an Aumann's Agreement Theorem point of view - the beliefs of other rational agents are, in fact, enormous updates about the world! Of course Aumannian conversations happen exceedingly rarely outside of environments with tight verifiable feedback loops about correctness, so in the real world maybe something like these norms is needed, but the part where "agreeing to disagree" gets flagged as extremely not what ideal agents would be doing seems important to me.

Curated. I initially thought this was basically a repeat of Vaniver's "Public Positions and Private Guts". But I found that Eli had interwoven this post with a lot of related thoughts that helped me connect the concept into different frameworks (i.e. the connection to hero licensing, and ambition).

I think having the distinction between public and private beliefs is useful for group epistemics and coordinated strategy. 

I think this is negative, for the reasons stated in my original reply to Eli.

I think the distinction Eli is drawing is great, and apt, and useful, and I definitely like having it highlighted and laid-out as in this post.  I do not want to erase this distinction; I very much want good words to track it.

But I think if we solidify "public beliefs" and "private beliefs" as jargon, in accordance with the definitions given in the post, we're going to erode our ability (both as a community and as individuals) to see that these are not the two buckets.  It's enforcing/strengthening a bucket error, that beliefs not spoken aloud are private-in-the-way-Eli-describes (i.e. illegible), and will make people more likely to assume that anything not expressible in public is also not explicit or rigorous in the thinker's own head.

I did not have an alternate proposal on terminology, but this seems like a place where human nature and common word usage are going to bite us in a pretty predictable fashion; it's not going to be easy to get people to use the terms precisely in the way we want, which would head off the confusion.

Hoping I'm wrong; happy to make small bets.

EDIT: just went through the comments and I can see this confusion already popping up in multiple threads.

Mmm. I think when I curated this I had not been thinking in terms of the jargon here being optimal, and more been thinking about "the concepts here seem like they're getting at something important." (I didn't have a strong opinion on the jargon one way or another at the time)

I think some of the comments here have updated me towards "this post is more of an intermediate stage of grappling with the concepts it's gesturing at, than the final form I'd like them to crystallize as". I think I still endorse curating it (things don't need to be perfect to get curated).

(Maybe also worth noting re: the curation decision: it so happens I'm somewhat subsidizing rationality content when curating this-particular-month, because there's been a huge upswelling of AI content on LessWrong that feels like it's drowning out everything else and it seemed important to signal boost rationality discussion. I think the post was good enough to curate without that, but it factored into the decision to do so yesterday)

I think that we should seize this opportunity to try to get our jargon right on the fly!

I think all three of Eli, Ray, and Duncan, in particular, have a shared interest in thoughtfully shaping culture in positive directions, and we all agree that this particular jargon has some problems. Let's do better than just letting this be, and exert some attention to see if we can get everything that we want here.

Brainstorming alternatives to "private belief":

  • Idiosyncratic belief?
  • Personal belief?
  • Illegible belief 
    • I don't like this because it's also not quite getting at what I want in this distinction. I might have legible argumentation, but I don't expect it to be understandable without a bunch of careful explanation and backtracking to prerequisites, and I don't want to make the claim that others should believe this, right now.
    • Actually, this sort of draws out that the distinction that I want to make is less "kinds of beliefs" and more like "stance that one can take towards a belief". I believe what I believe, but additionally there's some social meta-data of whether I'm claiming that others should agree with me on the basis of public info.
    • Given that...
  • Argument stance vs. impression stance? 
  • Metadata: claim of public accessibility vs. metadata: non claim of public accessibility 

Obviously, these ideas are bad. Anyone want to help me generate better ones?

Certainly no answer on the terminology question, but I am trying to understand how these "private beliefs" might be any different from a person's opinion. Perhaps the distinction is in the degree of considered observation and thought, but I'm not sure that is a sound basis. Most opinions are not just random thoughts and views lacking any reasonable basis, at least in the view of the person holding the opinion.

Perhaps the characterization of premises/evidence could be understood as more anecdotal, non-random sample observations so potentially skewed to a special case/false conclusion? 

The other point you make about "understandable without a bunch of careful explanations" points to 1) a level of complexity that makes shared knowledge problematic and 2) a belief that is related to a highly specialized area, so perhaps not fully able to fit into any type of public, widely shared data/knowledge set.

I might have legible argumentation, but I don’t expect it to be understandable without a bunch of careful explanation and backtracking to prerequisites

That fits great with my definition of illegibility. This case sounds like you've clarified it enough to make it legible to yourself but not yet enough to cross inferential gaps, thus it remains illegible to other people.

this also describes math. like, the more complicated math that has some prerequisites, which a person who didn't take the courses in college (or some analog) will not understand.

math, by my understanding of "legibility", is VERY legible. same with programming, physics, and a whole bunch of explicitly lawful but complicated things.

what is your understanding of that sort of thing?


I think I was unfair. I concede it's possible to have legible argumentation that people won't understand in a short time, even if it's perfectly clarified in your head. But in my experience interrogating my own beliefs, I think it's common that they are actually not clear (you just think they are) until you can explain them to someone else, so the term "illegible belief" may help some people properly debug themselves.

Regarding your question about math and the like... The point of having the concept of epistemic legibility is that we want to be able to "debug" articles we read, and the articles should accommodate us doing that. If we cannot debug them, they're not legible.

If your math is correct but poorly explained, I suppose I'd have to call it legible (as long as the explanations don't lead the reader astray). I won't want to grace it with that adjective, as I'm sure you understand, but that's more a matter of signaling.

By contrast, it's fine by me if you assume background knowledge, though keep in mind it's easy to assume too much (Explainers Shoot High, Aim Low).

it sometimes happens in conversations that people talk past each other, don't notice that they both use the word X and mean two different things, and behave as if they agree on what X is but disagree on where to draw the boundary.

from my point of view, you said some things that make it clear you mean a very different thing than me by "illegible". a proof of a theorem can't be illegible to SOMEONE. illegibility is a property of the explanation, not of the explanation plus the person. i encountered papers and posts that were above my knowledge in math and computer science. i didn't understand them despite them being legible.
you also have a different approach to concepts in general. i don't hold a concept because it makes it easier for people to debug. i try to find concepts that reflect the territory most precisely. that is the point of concepts TO ME.

i'm not sure it's worth it to go all the way back, and i have no intention of going over your post and adding "to you" in all the places where it should be added, to make it clearer that goals are something people have, not a property of the territory. but if you want to do half of the work of that, we can continue this discussion.

FWIW, I agree with you after reading the above.

Copying over some thoughts from a text conversation I had about this post, since that’s easier than writing them up properly. Adding section headers for readability; utterances not marked “[friend]: “ are mine.
-------------------

0. beginning

[friend]: I like this!  In particular I like the concept that it's reasonable to have beliefs that you can't prove on request, because the internet often assumes it's not

[friend]: (but also yeah, it's very important to note that if you have those then you shouldn't expect other people to take them on faith)

yeahhh

I sort of think the second thing is more important in discourses I'm in

or like

1. arguments which don’t acknowledge they’re about private beliefs

…I think in some disagreements there's a lack of acknowledgement that the kinds of arguments being made are fundamentally not a kind of thing that can convince people in the absence of direct personal experience replicating those arguments?

that these are private-belief kinds of justifications masquerading as public-belief ones

and that it doesn't make sense to have an argument about it where you think people disagreeing with you are doing something wrong

[friend]: yeah, absolutely

[friend]: I think which issue you run into more depends on specific bubbles

and it makes more sense to mutually acknowledge this

2. terminology request

...this makes me want terms for "private-belief kinds of justifications" vs. "public-belief kinds of justifications"

3. how common are truly public beliefs

also it kind of makes me wonder how common truly public beliefs really are

I kind of think it's very common for beliefs to rest in part on personal experience that's not super replicable or transferable by argument

even if the "personal experience" is like, reading papers in which X kind of thing repeatedly turns out to be true, or like, a doctor or nurse seeing a lot of patients who present a certain way and learning to have doomy feelings about some combinations of symptoms

(thinking in part here about some things [nurse friend] has said about her experience, re: the second thing)

4. when to rely on others’ private beliefs; converting private beliefs to ~public ones by demonstrating calibration

which is also now making me think of emergency situations and when you should act on someone else's private belief they can't fully justify to you

I guess if you have reason to think they're in general well calibrated then that's justified

though it's still much iffier than coming to agree with a legible argument

also possibly people can convert some kinds of private beliefs into ~public ones by demonstrating being well calibrated?

5. how common are truly public beliefs, part 2

[friend]: I think there's a fair amount of truly public beliefs, where I know something mostly because I looked it up

[friend]: or I guess even more cases where I kind-of-knew something because of illegible cultural osmosis but when I wanted to tell someone about it I looked it up and it turned out to be an easy wikipedia-findable fact

[friend]: also one can hope that if doctors/nurses see a lot of patients who present a certain way and this turns out to be a consistent sign of a specific problem, at some point someone will write this up and make it legible-ish

6. how useful are public vs. private beliefs

[friend]: ... although there's also the problem where you can make something a kind of public-belief by e.g. pointing to a paper about it, but then it turns out that people who know more about it than you have private-beliefs that actually most of the papers in that field are wrong, and this can get really complicated

[friend]: because we have a vague feeling that public-beliefs are more correct, but they aren't always

I think you need another category of "implicit beliefs". These are things revealed by choices that don't necessarily match one's expressed or introspected beliefs. Relatedly, I suspect you have a sharper boundary between your private beliefs and your public (or semi-public) beliefs than many. A whole lot of people can't hide their distaste for something they believe is harmful, even if they don't try to justify that belief publicly.

A whole lot of people can't hide their distaste for something they believe is harmful, even if they don't try to justify that belief publicly.

That a belief is private, as I'm using the word here, doesn't mean that you're trying to hide what you believe. My expressing distaste for something doesn't mean that I'm making a bid that others feel the same way that I do.

to debate the matter on a public forum

For what it's worth, that's exactly the etymology of the word forensic. Unfortunately, its modern usage seems to be exclusive to crime and its English meaning has always meant courts, not other forums, so I'm not sure it would be useful to adopt this term.

Nitpick: Wasn't "Online retail is going to become HUUUUGE" one of the things that justified the dot-com bubble? I think Amazon.com always intended to expand beyond books once their book business became stable...

I'd dispute the claim that everyone has a right to false beliefs, or at least I'd suggest it's not as simple as you suggest. There's a famous philosophical example about a ship's captain who chooses to believe that his ship can definitely make the voyage, even though it's more than a bit shaky, and then it ends up sinking and causing a bunch of people to drown. You might say that the wrong action was making the voyage, not believing that his ship could make the voyage, but once you have that belief, in many circumstances you'll be pretty much committed to the voyage (say, if the captain owes a lot of money and really needs to make the voyage so he can pay it back).

But of course, claiming that people don't have a right to false beliefs isn't the same as claiming that people have to be ready at any moment to stand and defend their beliefs, regardless of how tired they feel, how pressing their other tasks are, how much engagement they've already made on that issue, or how useful they deem the engagement to be.

I don't agree that everyone has the right to false beliefs. But I think that if you have false beliefs that you're not imposing on me, directly or indirectly, that's between you and your own commitment to rationality, i.e. not my business.

I have no right to assert that you believe any particular thing, you are fundamentally sovereign over your beliefs, even though no one can exempt you from rationality's laws.

"Imposing on me, directly or indirectly"

Seems like a pretty wide scope? Like if you're voting according to the beliefs then arguably you're imposing those beliefs indirectly on other people? I guess this could be excluded based on being less than a millionth of a decision, but curious where you draw the line.

Like if you're voting according to the beliefs then arguably you're imposing those beliefs indirectly on other people?

Any belief that motivates your vote for public policy is a public belief. 

There are some votes that are closer to aggregations of preferences (like "should our group house have quiet hallway norms or loud hallway norms?") where you might be basically expressing a private belief.

But in general, if you're voting on it, it's a matter of public interest, and it is impolite to rely on illegible argumentation that isn't made available for critique (though perhaps we could construct situations in which I agree that it is unavoidable). 

Now, that's interesting. Because I am somewhat tempted by the view that if you are voting on it, then it's a matter of public interest, so you have a responsibility to try to believe things that are true, but I would disagree with "it is impolite to rely on illegible argumentation that isn't made available for critique".

I’m reading the book Exclusion and Embrace by Miroslav Volf and I believe his insights are applicable to this discussion. If we exclude those who have a worldview we don’t accept then the typical responses are expulsion, assimilation (you must assimilate to me), indifference or abandonment. Embracing others with opposing or different views is rare today.