I have some very mixed feelings about this post. On the one hand, I get exactly where you're coming from. On the other, I think there are genuinely important second-order effects to consider.
Basically: if a public intellectual consistently tries to tell people their true-but-difficult-or-unpopular opinions, one (IMO likely) outcome is that over time, they lose (much of) their audience, and cease to be a public intellectual. But if they never tell the truth about their opinions, then their status as a public intellectual isn't doing anyone any good.
The other side of this, from my POV, is that successfully becoming a public intellectual involves building habits of thought and communication that can make it difficult to notice when the moment comes to really lay your cards on the table and tell the honest but hard truth. I don't follow Ball closely enough to judge, but Kelsey Piper and Will MacAskill have, in my opinion, done amazingly well on this front overall.
I think the SSC post on Kolmogorov Complicity and the Scott Aaronson post it builds on capture versions of a similar problem, where putting yourself in a position to help when the critical moment comes relies on otherwise going along with a sometimes-unfortunate epistemic context.
Edit to add that the first Kelsey Piper quote is about feeling bad over waiting three weeks to go public with an unpopular judgment call. Meanwhile, I think we can all readily point to false things the mainstream authorities stuck with for years - and some they still haven't acknowledged.
It seems super costly for many public intellectuals to say all of their beliefs, and for reasons that other commenters have pointed out, giving an epistemic status might not help. What's wrong with a blanket disclaimer like "Assume that all of my claims without an epistemic status are optimized to improve discourse on the margin, rather than to convey a complete picture of my all-things-considered beliefs"?
"Assume that all of my claims without an epistemic status are optimized to improve discourse on the margin, rather than to convey a complete picture of my all-things-considered beliefs"?
This seems like it'd usually be a false sentence. I think that's less than half of why people don't say their beliefs publicly; it also implies a calculated decision, as opposed to following gradients of what's easy / non-risky.
Sure, then just add that to the disclaimer: "I may omit claims that are risky, unpopular, easily misinterpreted, or that require lots of words to justify, etc., but I aim not to outright lie for such reasons."
I fear you are leaning on a piece of public infrastructure, the "personal list of disclaimers", that doesn't exist.
If you feel like I am falling short of your ideals, I'd be interested to hear it. I'm trying to live up to them, basically. It's not my top priority and I'm not trying hard, but it matters to me.
Public figures are judged on outlier statements rather than average statements. They also lack control over the context in which the public learns about them. If 80% of statements from Alice on Team A look good in hindsight, and 20% look bad (see the toy sketch after this list):
1. Team B will distribute an attention-span-length list of bad things said by Alice.
2. Team A will distribute an attention-span-length list of good things said by Alice.
3. Viewers will form bad priors based on this biased sampling, and they're not going to listen to Alice enough to overcome them.
4. Team A must expend some energy to fix these priors or reduce association with Alice.
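Here's that toy sketch as a bit of Python, with made-up numbers (nothing here comes from the post itself): even a viewer who updates sensibly on what they see ends up with a wildly skewed estimate of Alice's track record, because the sample was curated.

```python
# Alice's hypothetical track record: 80% good statements, 20% bad.
statements = ["good"] * 80 + ["bad"] * 20

LIST_LEN = 10  # an "attention-span-length" list

# Each team circulates a one-sided highlight reel.
team_b_list = [s for s in statements if s == "bad"][:LIST_LEN]   # all bad
team_a_list = [s for s in statements if s == "good"][:LIST_LEN]  # all good

def bad_rate(sample):
    """Naive estimate of Alice's bad-statement rate from a sample."""
    return sample.count("bad") / len(sample)

print(f"true rate:              {bad_rate(statements):.0%}")   # 20%
print(f"seen via Team B's list: {bad_rate(team_b_list):.0%}")  # 100%
print(f"seen via Team A's list: {bad_rate(team_a_list):.0%}")  # 0%
```

The viewer's problem in step 3 isn't irrationality; it's that neither list is a representative sample, and (per step 4) correcting for that takes effort most viewers won't spend.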
Still, I agree that there is something like a moral responsibility to self-immolate in a blinding flash of honesty. This selects for amoral people in public positions, but we do often see people with resources to spare spend some of those resources on speaking their mind.
Okay. I get where you're coming from but, man, this seems like a naive take!
You've cited an interesting-to-me set of people: people who have all been extraordinarily successful at getting an impactful message out to new audiences, in particular audiences who wouldn't have been drawn in by the existing LW sphere on their own. Piper, MacAskill, and Ball are all great communicators. A big part of that communication skill is matching your message to its audience. My hypothesis is that the behavior you're critiquing is driven by their skill at figuring out what people need to hear, and when, to excite them and move them in the direction they need to go in order to make the world a better place. Maybe sometimes they're not perfect at it, but they're a hell of a lot better than you and me.
All you have to do is tell your audience what you’re doing!
Why do you think this will work? In the political sphere especially, motivations are extraordinarily scrutinized. Maybe Piper and MacAskill could get away with this, but I suspect Ball could not; maybe he can sneak a few things in now that he's proven himself, but his ability to say anything about his motivations now speaks to the extraordinary communication skill that got him where he is. (Also, you shouldn't assume Ball was spinning before and is now telling the truth: he knows that "I'm a straight shooter" is what 80k's audience wants to hear; he may still be forced to spin things for the folks he works with, etc.)
I don't disagree that people should lean further in the direction of truthfulness about motivations that you're pushing for. I also want people to lean that way, and I try to hold myself and my teams to a higher standard of truthfulness in comms every day. But it's very much a spectrum, not black and white, and it depends on the audience. In my speaking and writing I've ~always had to adapt my message to the forum, and I think the best communicators are the ones who know this intimately and craft a message that people both need to hear and want to hear.
Upvoted, but disagree (well, kind of - I'd like to live in that world, but I don't think it's good advice today).
Once someone is well-known, and especially if they make a living and/or invest their self-identity in their public acceptance, they are well-advised to moderate themselves, and to stay within the Overton window of their readers (even while pushing it in good directions for some readers). Even if they are convinced enough to tell their family something, they may want to water it down a bit so the public doesn't turn on them.
Speech is an action, and public speech has consequences beyond conveying information.
I congratulate you on realizing that there is no fully-trustworthy writer or authority. We're all humans, and we all have some motives that are not purely prosocial.
Part of a comment I just made on Substack that seems worth bringing over:
Quite possibly I *was* anchoring too hard on relatively big names with >=stable careers (afaict) and so am not being fair to those for whom there's a legit strong tradeoff between e.g. financial security and candidness. This is very fair. I guess I would emphasize the "virtue" part and say like "well how close are you really to the Pareto frontier of candidness and other things you value?"
I've decided not to tag anyone I use as an example in this post because I have a sense that might create a felt burden to reply and, again:
To be clear, this post isn’t intended as a moral condemnation of past behavior because, again, my sense is that media figures and intellectuals - ~certainly those I reference in this piece - genuinely believe themselves to be doing right by their readers and the world.
But feel free to let me know if this seems like the wrong call.
Intro
This Twitter thread from Kelsey Piper has been reverberating around my psyche since its inception, almost six years ago now.
You should read the whole thing for more context, but here are the important tweets:
I really like Kelsey Piper. She’s “based” as the kids (and I) say. I think she was trying to do her best by her own lights during this whole episode, and she deserves major props for basically broadcasting her mistakes in clear language to her tens of thousands of followers so people like me can write posts like this. And I deeply respect and admire her and her work.
But:
Easier said than done, hindsight is 20/20, etc., but I basically agree that she fucked up.[1]
The reason I’m writing this post now is that it’s become increasingly apparent to me that this kind of isn’t a one-off, and it’s not even an n-off for some modest n. It’s not out of the norm.
Claim
Rather, public intellectuals, including those I respect and admire, regularly communicate information to their audiences and the public that is fundamentally different from their true beliefs, and they should stop doing that.
I haven’t interviewed anyone for this post so take this with a large grain of salt, but my impression and suspicion is that, among public intellectuals broadly, this isn’t even considered a bad thing; rather, it’s the relatively above-board and affirmatively endorsed modus operandi.
Indeed, public intellectuals have reasonable and plausible (if implicit) reasons for thinking that being less than candid about their genuine beliefs is a good, just, and important part of the job.
The problem is that they’re wrong.
To be clear, this post isn’t intended as a moral condemnation of past behavior because, again, my sense is that media figures and intellectuals - ~certainly those I reference in this piece - genuinely believe themselves to be doing right by their readers and the world.
A few more examples
Will MacAskill
Jump forward to 2022 and Will MacAskill, whom I also greatly respect and admire, is on the 80,000 Hours podcast for the fourth time. During the episode, MacAskill notes that his first book Doing Good Better was significantly different from what “the most accurate book…fully representing my and colleagues’ EA thought” would have looked like, in part thanks to demands from the publisher (bolding mine):
Wait what? It was a throwaway line, a minor anecdote, but if I’m remembering correctly I physically stopped walking when I heard this section.
The striking thing (to me, at least) wasn’t that a published book might be slightly out of date with respect to the author’s thinking - the publishing process is long and arduous - or that the publisher forced out consideration of animal welfare.
It was that, to the best of my knowledge (!), Will never made a significant effort to tell the public about all this until the topic came up eight years after publication. See the following footnote for more: [2]
It’s not like he didn’t have a platform or the ability to write or thought that nobody was reading the book.
Doing Good Better has almost 8,000 reviews on Goodreads and another 1,300 or so on Amazon. The top three LLMs estimate 75k, 135k, and 185k sales respectively. Between when Doing Good Better was published and when that podcast interview came out, Will published something like 33 EA Forum Posts and 29 Google Scholar-recognized publications. Bro is a machine.
And Will is steeped deeply in the culture of his own founding - EA emphasizes candidness, honesty, and clarity; people put “epistemic status: [whatever]” at the top of blog posts. I don’t personally know Will (sad) but my strong overall impression is that he’s a totally earnest and honest guy.
Unfortunately I’m not really advancing an explanation of what I’m critiquing in this post. As mentioned before, I haven’t interviewed[3] anyone and can’t see inside Will’s brain or anyone else’s.
But I can speculate, and my speculation is that clarifying Doing Good Better post-publication (i.e. by writing publicly somewhere that it was a bit out of date with respect to his thinking and that the publisher made him cut important, substantive material on animal welfare) never even registered as the kind of thing he might owe his audience.
Dean Ball
To beat a dead horse, I really like and respect Piper and MacAskill.
I just don’t know Ball’s work nearly as well, and the little that I do know suggests that we have substantial and fundamental disagreements about AI policy, at the very least.
But he was recently on the 80,000 Hours Podcast (for 3 hours) and I came away basically thinking “this guy is not insane” and (to quote my own tweet) “probably way above replacement level for ‘Trump admin-approved intellectual’”[4]
All this is to say that I don’t have it out for the guy, just as I don’t have it out for Piper or MacAskill.
But part of the interview drove me insane, to the point of recording a 12-minute voice memo rant that is the proximate reason for me writing this post.

Here’s the first bit (bolding mine):
Ahhhhhh! Hard to get a clearer example than this.
Ball is, in a purely neutral and descriptive sense, candidly reporting that he did not merely write in a way or style his audience could understand - he substantively modified his core claims to be different from the ones that were the true causes of his beliefs and policy positions.
Not about lying
I actually want to pick out the segment set off by dashes - “and all that stuff is also true” - because it’s an important feature of both the underlying phenomenon I’m pointing at and my argument about it.
As far as I can tell, Ball never lied - just as Piper and MacAskill never lied.
At one point I was using the term “lie by omission” for all this stuff, but I’ve since decided that’s not really right either. The point here is just that literally endorsing every claim you write doesn’t alone imply epistemic candidness (although it might be ~necessary for it).
Ball, pt 2
Ok, let’s bring in the second of Ball’s noteworthy sections. This time Rob does identify Ball’s takes as at least potentially wrong in some sense.
Sorry for the long quote but nothing really felt right to cut (again, bolding mine):
Not sure we need much analysis or explanation here; Ball is straightforwardly saying that he neglects to tell his audience about substantive, important, relevant beliefs he has because of…some notion of confidence or justifiability. Needless to say I don’t find his explanation/justification very compelling here.
Of course he has reasons for doing this, but I think those reasons are bad and wrong. So without further ado…
The case against the status quo
It’s misleading and that’s bad
This isn’t an especially clever or interesting point, but it’s the most basic and fundamental reason that “sub-candidness,” as we might call it, is bad.
No one else knows they’re playing a game
To a first approximation, podcasts and Substack articles and Vox pieces are just normal-ass person-to-person communication, author to reader.
As a Substacker or journalist or podcast host or think tank guy or whatever, you can decide to play any game you want - putting on a persona, playing devil’s advocate, playing the role of authority who doesn’t make any claims until virtually certain.
All this is fine, but only if you tell your audience what you’re doing.
Every instance of sub-candidness I go through above could have been avoided by simply publicly stating somewhere the “game” the author chose to play.
I think Piper should have told the public her genuine thoughts motivating personal behavior vis a vis Covid, but I wouldn’t be objecting on the grounds I am in this post if she had said something like “In this article I am making claims that I find to be robustly objectively defensible and withholding information that I believe because it doesn’t meet that standard.”
Part of the point of this kind of disclaimer is that it might encourage readers to go “wait but what do you actually think” and then you, Mr(s). Public Intellectual might decide to tell them.
What would a reasonable reader infer?
Merely saying propositions that you literally endorse, taken in isolation, is ~necessary but not at all sufficient for conveying information faithfully and accurately.
The relevant question public intellectuals need to ask is: “What would a reasonable reader infer or believe both (a) about the world and (b) about my beliefs after consuming this media?”
Of course sometimes there are going to be edge cases and legitimate disagreements about the answer, but I think in general things are clear enough to be action-guiding in the right way.
I think some (partial) answers to that question, in our example cases, are:
In each case, I claim, the reader would have been mistaken, and foreseeably so. And “foreseeably causing a reader to have false beliefs” seems like a pretty good definition of “misleading.”
Public intellectuals are often domain experts relative to their audience
Again, I can’t look inside anyone’s brain, but I suspect that public intellectuals often err by incorrectly modeling their audience.
If you’re Matt Yglesias, some of your readers are going to be fellow policy-wonk polymaths with a lot of context on the subject matter you’re tackling today, but the strong majority are going to have way less knowledge and context on whatever you’re writing about.
This is true in general; by the time I am writing a blog post about something, even if I had no expertise to start with, I am something of an expert in a relative sense now. The same is true of generalist journalists who are covering a specific story, or podcast hosts who have spent a week preparing for an interview.
This seems trivial when made as an explicit claim, and I don’t expect anyone to really disagree with it, but really grokking this asymmetry entails a few relevant points that don’t, in fact, seem to be reflected in the current state of play:
1) Your audience doesn’t know what they don’t know
So your decision to not even mention/cover some aspect of the thing isn’t generally going to come across as “wink wink, look this one up yourself” - it’s just a total blind spot, never even up for consideration. MacAskill’s readers didn’t even know what was on the table to begin with; if you don’t bring up longtermist ideas and instead mainly talk about global poverty, readers are going to reasonably, implicitly assume that you don’t think the long-term future is extremely important. That proposition never even crossed their minds for them to evaluate.
2) Your expertise about some topic gives you genuinely good epistemic reason to share your true beliefs in earnest
“Epistemic humility” is a positively-valenced term, but too much in a given circumstance is just straightforwardly bad and wrong. Kelsey Piper shouldn’t have deferred to the consensus vibe because she’s the kind of person who’s supposed to decide the consensus vibe.
3) Like it or not, people trust you to have your own takes - that’s why they’re reading your thing
It is substantively relevant that the current media environment (at least in the anglosphere) is ridiculously rich and saturated. There are a lot of sources of information. People can choose to listen/read/vibe with a million other things, and often a thousand about the same topic. They chose to read you because for whatever reason they want to know what you think.
In other words, your take being yours - and not an amalgam of yours + the consensus + the high-status thing - is already built into the implicit relationship.
You’re (probably) all they’ve got
A partially-overlapping-with-the-above point I want to drive home is that, in general, you (public intellectual) are the means by which your audience can pick up on radical but plausible ideas, or subtle vibes, or whatever collection of vague evidence is informing your intuition, or anything else.
Insofar as you think other people in some sense “should” believe what you believe (ideally at least, or if they had more information and time and energy, or something like that) - or at least hear the case for your views - this is it.
Maybe you’re part of the EA- or rationalist- sphere and have a bunch of uncommon beliefs about the relatively near future (perhaps along the lines of “P[most humans die or US GDP increases >10x by 2030] >= 50%”) and you’re talking to a “normie” about AI (perhaps a very important normie like a member of Congress).
You can try to use arguments you don’t really find convincing or important, or moderate your opinions to seem more credible, or whatever else - but to what end?
So that they can leave the interaction without even the in-principle opportunity of coming closer to sharing your actual beliefs - beliefs you want them to have?
And the theory is that somehow this helps get them on the road to having correct takes, somehow, by your own lights?
Because maybe someone in the future will do the thing you’re avoiding - that is, sharing one’s actual reasons for holding actual views?[5]
All of the above mostly holds regardless of who you are, but if you’re a public intellectual, that is your role in society and the job you have chosen or been cast into, for better or worse.
Is your theory of change dependent on someone else being just like you but with more chutzpah? If so, is there a good reason you shouldn’t be the one to have the chutzpah?
This is it!
You can have your cake and eat it too
At some level what I’m calling for involves intellectual courage and selflessness, but in a more substantive and boring sense it’s not especially demanding.
That’s because you don’t have to choose between candidness and other things you find valuable in communication like “conveying the public health consensus” or “using concepts and arguments my readers will be familiar with” or “not presenting low-confidence intuitions as well-considered theses.”
All you have to do is tell your audience what you’re doing!
You can explicitly label claims or anecdotes or vibes as based on intuition or speculation or nothing at all. You can present arguments you don’t endorse but want the reader to hear or consider for whatever reason, or report claims from officials you low-key suspect aren’t really true.
You can even use explicit numerical probabilities to convey degrees of certainty! One frustrating element of our example cases is that Piper, MacAskill, and Ball are all exceptionally bright and numerate and comfortable with probabilities - asking them to use these tools to clarify and highlight the epistemic stance they’re coming from doesn’t seem extremely burdensome.
And more generally, setting aside numerical probabilities for a moment:
It will increase the word count of your thing by 2%, fine. That’s a very small price to pay.
Not a trivial ask
Let me go back to the “at some level what I’m calling for involves intellectual courage and selflessness” bit from the previous section.
The universe makes no guarantees that earnestness will be met with productive, kind, and generous engagement. As a public intellectual, you might find yourself in the position of “actually believing” things that are going to sound bad in one way or another.
To be honest, I don’t have a totally worked-out theory or principle that seems appropriately action-guiding in almost all situations.
In some stylized Cultural Revolution scenario with a literal mob at your door waiting to literally torture and kill you if you say the wrong thing, I think you should generally just lie and save your ass - and then, if the situation ever improves, report that that’s what you did.
But I guess the things I do stand by are that:
Appendix: transcript of original rant
It has a bit of exasperated energy that the above post lacks, so here is a mildly cleaned up version (fewer filler words basically) of my original rant. Enjoy:
Here’s a relevant post of hers by the way. To be clear, far better than what other journalists were doing at the time!
Ok I still stand by this claim but “show that something wasn’t said” is a hard thing to do. To be clear, I’m quite sure that I didn’t - and still don’t - know of any public statement by MacAskill basically conveying the points that either:
Of course that doesn’t imply such a statement doesn’t exist.
I ran Claude Opus 4.5 thinking research, GPT-5.2-Thinking-Deep Research, and Gemini-3-Pro-Preview-Research on the topic and initially got some confusing contradictory vibes, but things seem to ground out as “no we can’t find anything with Will making either of the two points listed above either”.
Can’t share Claude Convo directly…
…but in the interest of completeness, here’s a Google Doc with a bunch of screenshots. My takeaway after pushing Opus on points of confusion is basically:
Although by all means if you are mentioned in this post or otherwise have special insight and want to talk, please feel free to email me! aaronb50 [at] gmail.com
I was vaguely suspicious he might have been basically bending his views and tone to the host and the audience on 80k, but I just ran the transcript and his 30 most popular Substack posts through Gemini-3-Pro, and at least according to that, the guy is pretty consistent. The conclusion of that LLM message is:
And in rare circumstances they might reason themselves to the right answer, but this is the epistemic equivalent of planning your retirement around winning the lottery at 65.
As an author, MacAskill has good reason not to upset or antagonize the publisher, but something along the lines of a “here’s how my thinking has evolved over the course of writing this book” or “bits I didn’t have space for” article on his website, or a “go to this url to read more” note at the end of the book (like Yudkowsky and Soares did recently with If Anyone Builds It, Everyone Dies and ifanyonebuildsit.com/resources), seems like it probably would have been fine to ship (I admit I’m less sure about this case).