Even if it were 1/10, it might be the most important 1/10. Something like that is in fact plausible: if someone were optimally trying to mostly look factual while pushing a political agenda, they would probably sort statements by ratio of [political benefit of lying] / [expected cost of being caught lying], pick a threshold, and lie whenever that ratio exceeds the threshold; and political benefit, as evaluated by this hypothetical journalist-hack, likely correlates with importance to the reader.
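To make the hypothetical hack's selection rule concrete, here is a minimal sketch; the `Statement` fields, the numbers, and the threshold are all invented for illustration, not taken from anywhere:

```python
from dataclasses import dataclass

@dataclass
class Statement:
    text: str
    political_benefit: float   # benefit (to the agenda) of lying about this
    expected_cost: float       # roughly: chance of being caught times reputational damage

def choose_lies(statements, threshold):
    """Return the statements the hypothetical hack would lie about:
    those whose benefit/cost ratio exceeds the threshold."""
    return [s for s in statements
            if s.political_benefit / s.expected_cost > threshold]

# Toy example: the agenda-relevant (i.e. reader-important) item clears the bar,
# so the ~1/10 that are lies tend to be the 1/10 that matter.
claims = [
    Statement("minor local detail", political_benefit=0.1, expected_cost=1.0),
    Statement("key causal claim",   political_benefit=5.0, expected_cost=2.0),
]
print(choose_lies(claims, threshold=1.0))
```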
I would put it this way: being vulnerable is a probably-unfortunate side-effect of a means to an end, not an end in itself, and it's usually worth tracking just what end you have in mind. (And, yes, if you had a cost-free alternative means that achieved the same result but didn't make you vulnerable, then that would be an improvement.) For example: "telling someone a secret that would enable them to shame you for it", or "letting yourself rely on someone else to take care of a thing for you [such that if they fell through, it would hurt you]", or "letting yourself care about someone else's judgment of you". There are situations where each of these is an unavoidable part of a plan with positive expected value, and situations where they create needless risks with no benefit.
Let's see if I can capture the good parts of potential counter-stances:
I think the conclusion stands: in all circumstances, actual vulnerability is something you'd like to minimize. Sometimes it's correct to do things that look like seeking out vulnerability, but on closer examination you're always seeking out something else that happens to be correlated (or to look correlated) with actual vulnerability; the vulnerability itself is something you tolerate if, and only if, there aren't better choices.
Suggestion for how to pose the first problem: “Imagine that someone places a large number of mirrors around the Earth so that the same sunlight hits the Earth as before, but now it lands evenly spread around the Earth’s surface instead of hitting only the daytime half.” And, probably: “The mirrors don’t reflect any of Earth’s radiation back onto the Earth.”
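I don't know exactly what the first problem asks, but the mirror framing looks designed to isolate how the T⁴ radiation law interacts with how the incoming flux is distributed, so here is a toy calculation under that reading. It assumes a bare blackbody with zero albedo and no heat transport or storage; the contrast case (a non-rotating day side in local equilibrium) is my own choice of opposite extreme, not part of the original:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # solar constant at Earth, W m^-2

# Mirror case: same total power, spread evenly, so every point receives S/4.
T_uniform = (S / 4 / SIGMA) ** 0.25

# Opposite extreme: each day-side point sits in local equilibrium with S*cos(zenith),
# the night side receives nothing; average the local temperature over the whole sphere.
# The area-weighted average of cos^(1/4) over the day hemisphere is 4/5, halved for the dark side.
T_patchy_mean = 0.5 * (4 / 5) * (S / SIGMA) ** 0.25

print(f"evenly spread illumination: {T_uniform:.0f} K")            # ~279 K
print(f"day-side-only, sphere-averaged: {T_patchy_mean:.0f} K")     # ~157 K
```

Same total absorbed and emitted power in both cases, but the mean surface temperature differs a lot, because emission scales as T⁴.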
Protip: replace the “x” in the URL with “xcancel”. Currently works well.
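Assuming the tip refers to x.com links (my reading), a bulk rewrite is a one-line string replacement, with xcancel.com as the rewritten host:

```python
def to_xcancel(url: str) -> str:
    """Rewrite an x.com link to its xcancel mirror; other URLs pass through unchanged."""
    return url.replace("://x.com/", "://xcancel.com/", 1)

print(to_xcancel("https://x.com/someuser/status/123"))  # https://xcancel.com/someuser/status/123
```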
It seems like the move for height would be to make men taller and women shorter? Or to increase the sexually dimorphic difference, and separately pick the optimal average.
An argument in favor of it is that "free association" is inherently a fuzzy human thing, where the process is just thinking for a bit, seeing what you come up with, and at some point declaring victory; there is nothing in it that could possibly guarantee correctness. Arguably, anyone who encounters the term should be conscious of this, and therefore notice that it's an inappropriate step in a logical argument that purports to establish high certainty. Perhaps they would even notice that the term itself is paradoxical: in a logical context, "exhaustion" must be a rigorous process, but "free association" is inherently unrigorous.
I'm not sure if I buy the argument. The author of "The Design of Everyday Things" warns against being too clever with names and assuming that normal people will get the reference you intend. But... I dunno.
Asking Claude for examples
There are languages out there in which epistemic status is a mandatory part of a sentence ("I've seen", "so I have been told", "I think", etc.).
Is this true? Examples?
Yes, this is absolutely true! These grammatical features are called evidentials or evidentiality markers, and they're mandatory in many languages around the world.
Turkish is a classic example. It has two past tenses that encode evidentiality: the -di past for events the speaker directly witnessed, and the -miş past for events known only by hearsay or inference.
So "geldi" means "he came" (and I witnessed it), while "gelmiş" means "he came" (but I didn't see it—I heard about it or saw evidence of it).
Quechua (Andean languages) has several evidential markers, distinguishing direct knowledge, hearsay, and conjecture.
Tibetan has a rich evidential system distinguishing personal knowledge, inference, and hearsay.
Japanese has evidential markers too, though they're sometimes considered less grammaticalized.
Bulgarian and other Balkan languages have "renarrative" mood marking information learned from others rather than directly experienced.
The key point is that in these languages, you often cannot make a statement about a past event without indicating your epistemic basis for the claim. Leaving it out would be grammatically incomplete, like forgetting to conjugate a verb in English. This forces speakers to constantly track and communicate their source of knowledge.
Interesting.
Hmmph. If he wants to push people to do more research so that they can make statements without any such qualifiers—or to shut up when they haven't done enough research to have anything useful to say—then I may sympathize. If he wants them to make themselves sound more certain than they are, then I oppose.
Rescue the girl and plan to explain to the wealthy people what happened. Possibly try to bring her with him, for purposes including lending credence to his story.
This gets subtle. I can think of several cases where journalists sat on what would have been delicious scandals, the kind that should be good for a career, for what look like political reasons. That said, if one looks closer, it's plausible that, in each case, they reasoned (perhaps correctly) that publishing would not actually have been good for their career: they would have faced backlash (for political/tribal reasons), and possibly their editors (if applicable) would have refused to allow it. I imagine there is partial but incomplete equivalence between this kind of "externally imposed political motivation" and "internalized political motivation", and it may be worth tracking the difference.
That's for omitting stories. For lying... On priors, that difference of external vs. internal political motivation would be important: the latter would encourage a journalist to come up with new lies and use them, while the former would mostly just make them go along with lies that the rest of their tribe is already telling. I do see plenty of "going along with lies" and not much innovative mendacity. I'll note that the "lies" I refer to are usually "not technically false, but cherry-picked and/or misleadingly phrased, such that a normal person will hear it and predictably come away believing something false; and such that a journalist who felt a strong duty to tell the truth as best they could would not say it absent stronger external pressure". (See Zvi on bounded distrust.)