Zack_M_Davis

Comments

[Book Review] "The Bell Curve" by Charles Murray

I'm not advocating lying.

I understand that. I cited a Sequences post that has the word "lies" in the title, but I'm claiming that the mechanism described in the cited post—that distortions on one topic can spread both to adjacent topics and to people's understanding of what reasoning looks like—can apply more generally to distortions that aren't direct lies.

Omitting information can be a distortion when the information would otherwise be relevant. In "A Rational Argument", Yudkowsky gives the example of an election campaign manager publishing survey responses from their candidate, but omitting one question which would make their candidate look bad, which Yudkowsky describes as "cross[ing] the line between rationality and rationalization" (!). This is a very high standard—but what made the Sequences so valuable is that they taught people the counterintuitive idea that this standard exists. I think there's a lot of value in aspiring to hold one's public reasoning to that standard.

Not infinite value, of course! If I knew for a fact that Godzilla will destroy the world if I cite a book that I would otherwise have cited as genuinely relevant, then fine, for the sake of the world, I can not cite the book.

Maybe we just quantitatively disagree on how tough Godzilla is and how large the costs of distortions are? Maybe you're happy to throw Sargon of Akkad under the bus, but when Steve Hsu is getting thrown under the bus, I think that's a serious problem for the future of humanity. I think this is actually worth a fight.

With my own resources and my own name (and a pen name), I'm fighting. If someone else doesn't want to fight with their name and their resources, I'm happy to listen to suggestions for how people with different risk tolerances can cooperate to not step on each other's toes! In the case of the shared resource of this website, if the Frontpage/Personal distinction isn't strong enough, then sure, "This is on our Banned Topics list; take it to /r/TheMotte, you guys" could be another point on the compromise curve. What I would hope for from the people playing the sneaky consequentialist image-management strategy is that you guys would at least acknowledge that there is a conflict and that you've chosen a side.

might fill their opinion vacuum with false claims from elsewhere, or with true claims

For more on why I think not-making-false-claims is vastly too low of a standard to aim for, see "Firming Up Not-Lying Around Its Edge-Cases Is Less Broadly Useful Than One Might Initially Think" and "Heads I Win, Tails?—Never Heard of Her".

[Book Review] "The Bell Curve" by Charles Murray

I agree that offense-takers are calibrated against Society-in-general, not particular targets.

As a less-political problem with similar structure, consider ransomware attacks. If an attacker encrypts your business's files and will sell you the decryption key for 10 Bitcoins, do you pay (in order to get your files back, as common sense and causal decision theory agree), or do you not-pay (as a galaxy-brained updateless-decision-theory play to timelessly make writing ransomware less profitable, even though that doesn't help the copy of you in this timeline)?

It's a tough call! If your business's files are sufficiently important, then I can definitely see why you'd want to pay! But if someone were to try to portray the act of paying as pro-social, that would be pretty weird. If your Society knew how, law-abiding citizens would prefer to coordinate not to pay attackers, which is why the U.S. Treasury Department is cracking down on facilitating ransomware payments. But if that's not an option ...
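To make the tension concrete, here's a toy payoff sketch of the pay/not-pay dilemma. All the numbers (file value, ransom, attack rates) are hypothetical, chosen only to illustrate how the in-timeline and policy-level answers can come apart:

```python
# Toy payoff model of the ransomware dilemma (illustrative numbers only).

FILES_VALUE = 100.0  # value of having your files back (hypothetical)
RANSOM = 10.0        # price of the decryption key (hypothetical)

# Causal decision theory: the attack already happened, so compare the
# payoffs of each action within this timeline.
cdt_pay = FILES_VALUE - RANSOM  # pay the ransom, recover the files
cdt_refuse = 0.0                # refuse, files stay encrypted

# Updateless-style reasoning: evaluate the *policy*, counting its effect
# on how often attacks happen at all (rates are hypothetical).
attack_rate_if_payers = 0.9     # paying makes ransomware profitable, so common
attack_rate_if_refusers = 0.05  # refusing makes it unprofitable, so rare

ev_pay_policy = (attack_rate_if_payers * cdt_pay
                 + (1 - attack_rate_if_payers) * FILES_VALUE)
ev_refuse_policy = (attack_rate_if_refusers * cdt_refuse
                    + (1 - attack_rate_if_refusers) * FILES_VALUE)

print(cdt_pay, cdt_refuse)              # 90.0 0.0 -> in-timeline, paying wins
print(ev_pay_policy, ev_refuse_policy)  # 91.0 95.0 -> timelessly, refusing wins
```

Under these made-up numbers, paying dominates once you've been attacked, but the refuse-always policy does better in expectation across timelines—which is the whole disagreement in miniature.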

our behavior [...] punishment against us [...] some other entity that we shouldn't care much about

If coordinating to resist extortion isn't an option, that makes me very interested in trying to minimize the extent to which there is a collective "us". "We" should be emphasizing that rationality is a subject matter that anyone can study, rather than trying to get people to join our robot cult and be subject to the commands and PR concerns of our leaders. Hopefully that way, people playing a sneaky consequentialist image-management strategy and people playing a Just Get The Goddamned Right Answer strategy can at least avoid being at each other's throats fighting over who owns the "rationalist" brand name.

[Book Review] "The Bell Curve" by Charles Murray

But there are systemic reasons why Society gets told that hypotheses about genetically-mediated group differences are offensive, and mostly doesn't (yet?) get told that technological forecasting is offensive. (If some research says Ethnicity E has higher levels of negatively-perceived Trait T, then Ethnicity E people have an incentive to discredit the research independently of its truth value—and people who perceive themselves as being in a zero-sum conflict with Ethnicity E have an incentive to promote the research independently of its truth value.)

Steven and his coalition are betting that it's feasible to "hold the line" on only censoring the hypotheses that are closely tied to political incentives like this, without doing much damage to our collective ability to think about other aspects of the world. I don't think it works as well in practice as they think it does, due to the mechanisms described in "Entangled Truths, Contagious Lies" and "Dark Side Epistemology"—you make a seemingly harmless concession one day, and five years later, you end up claiming with perfect sincerity that dolphins are fish—but I don't think it's right to dismiss the strategy as fantasy.

[Book Review] "The Bell Curve" by Charles Murray

The relevant actors aren't consciously being strategic about it, but I think their emotions are sensitive to whether the threat of being offended seems to be working. That's what the emotions are for, evolutionarily speaking. People are innately very good at this! When I babysit a friend's unruly 6-year-old child who doesn't want to put on her shoes, or talk to my mother who wishes I would call more often, or introspect on my own rage at the abject cowardice of so-called "rationalists", the functionality of emotions as a negotiating tactic is very clear to me, even if I don't have the same kind of deliberative control over my feelings as over my speech (and the child and my mother don't even think of themselves as doing game theory at all).

(This in itself doesn't automatically negate your concerns, of course, but I think it's an important modeling consideration: animals like Godzilla may be less incentivizable than Homo economicus, but they're more like Homo economicus than a tornado or an avalanche.)

[Book Review] "The Bell Curve" by Charles Murray

What's with the neglect of Richard J. Herrnstein?! His name actually comes first on the cover!

My experience at and around MIRI and CFAR (inspired by Zoe Curzi's writeup of experiences at Leverage)

Personally, I lean laissez faire on moderation: I consider banning a non-spam user from the whole website to be quite serious, and I think the karma system makes a decently large (but definitely not infinite!) ratio of useless-to-useful comments acceptable. Separately from that, I admit that applying different rules to celebrities would be pretty unprincipled, but I fear that my gut feeling actually is leaning that way.

My experience at and around MIRI and CFAR (inspired by Zoe Curzi's writeup of experiences at Leverage)

were he to be banned, he would not much be missed

False!—I would miss him. I agree that comments like the grandparent are not great, but Ilya is a bona fide subject matter expert (his Ph.D. advisor was Judea Pearl), so when he contributes references or explanations, that's really valuable. Why escalate to banning a user when individual bad comments can be safely downvoted to invisibility?

Feature idea: Notification when a parent comment is modified

changes their mind and updates the original comment like this: "[EDIT: Actually, B makes a good argument against]". This feature would show this information in person B's inbox, without A having to write a separate reply to their comment.

I think separate replies are usually preferable for this. (A comment is a specific thing someone said at a particular time; you shouldn't feel obligated to go back and edit something you already said in the past, just because some of the replies were good.)