UnexpectedValues

I'm a 2nd year PhD student at Columbia. My academic interests lie in mechanism design and algorithms related to the acquisition of knowledge. I write a blog on stuff I'm interested in (such as math, philosophy, puzzles, statistics, and elections): https://ericneyman.wordpress.com/

Sequences

Pseudorandomness Contest

Comments

Can group identity be a force for good?

I think we're disagreeing on semantics. But I'd endorse both the statements "violence is bad" and "violence is sometimes good".

Can group identity be a force for good?

I'm not sure. The strongest claim I make in that direction is that "Many in the rationalist sphere look down on tribalism and group identity." I think this is true -- I bet each of the people I named would endorse the statement "The world would be better off with a lot less tribalism."

Can group identity be a force for good?

To be clear, I'm agreeing with Eliezer; I say so in the second paragraph. But for the most part my post doesn't directly address Eliezer's essay except in passing. Instead I point out: "Yeah, the 'bad for reasoning about tribe-affiliated subjects' point is a drawback, but here's a benefit, at least for me."

Can group identity be a force for good?

It's true that I didn't draw a distinction between tribalism and group identity. My reason for not doing so was that I thought both terms applied to my three examples. I thought a bit about the distinction between the two but didn't get very far. So I'm not sure whether the pattern I pointed out in my post is true of tribalism, or of group identity, or both. But since you pressed me, let me try to draw a distinction.

(This is an exercise for me in figuring out what I mean by these two notions; I'm not imposing these definitions on anyone.)

The word "tribalism" has a negative connotation. Why? I'd say because it draws out tendencies of tribe members to lose subjectivity and defend their tribe. (I was going to call this "irrational" behavior, but I'm not sure that's right; it's probably epistemically irrational but not necessarily instrumentally irrational.) So, maybe tribalism can be defined as a mindset of membership in a group that causes the member to react defensively to external challenges, rather than treating those challenges objectively.

(I know that I feel tribalism toward the rationalist community because of how I felt on the day that Scott Alexander took down Slate Star Codex, and when the New York Times article was published. I expect to feel similarly about EA, but haven't had anything trigger that emotional state in me about it yet. I feel a smaller amount of tribalism toward neoliberalism.)

(Note that I'm avoiding defining tribes, just tribalism, because what's relevant to my post is how I feel about the groups I mentioned, not any property of the groups themselves. If you wanted to, you could define a tribe as a group where the average member feels tribalism toward the group, or something.)

Identity is probably easier to define -- I identify with a group if I consider myself a member of it. I'm not sure which of these two notions is more relevant to the sort of pattern I point out, though.

Can group identity be a force for good?

Good point! You might be interested in how I closed an earlier draft of this post (which makes some points I didn't make above, but which I think ended up with too high a rhetoric-to-insight ratio):

 

"I don’t endorse tribalism in general, or think it’s a net positive. Tribalism strikes me as a symmetric weapon, equally wieldable by good and evil. This alone would make tribalism net neutral, but in fact tribalism corrupts, turning scouts into soldiers, making people defend their side irrespective of who’s right. And the more tribal a group becomes, the more fiercely they fight. Tribalism is a soldier of Moloch, the god of defecting in prisoner’s dilemmas.

This is somewhat in tension with my earlier claim that my tribalism is a net positive. If I claim that my tribalism is net positive, but tribalism as a whole is net negative, then I’m saying that I’m special. But everyone feels special from the inside, so you’d be right to call me out for claiming that most people who feel that their tribalism is good are wrong, but I happen to be right. I would respond by saying that among people who think carefully about tribalism, many probably have a good relationship with it. I totally understand if you don’t buy that — or if you think that I haven’t thought carefully enough about my tribalism.

But the other thing is, tribalism’s relationship with Moloch isn’t so straightforward. While on the inter-group level it breeds discord, within a tribe it fosters trust and cooperation. An American identity, and a British identity, and a Soviet identity helped fight the Nazis — just as my EA identity helps fight malaria.

So my advice on tribalism might be summarized thus: first, think carefully and critically about who the good guys are. And once you’ve done that — once you’ve joined them — a little tribalism can go a long way. Not a gallon of tribalism — beyond a certain point, sacrificing clear thinking for social cohesion becomes negative even if you’re on the good side — but a teaspoon."

Social behavior curves, equilibria, and radicalism

Thanks for mentioning Asch's conformity experiment -- it's a great example of this sort of thing! I might come back and revise the post a bit to mention the experiment.

(Though here, interestingly, a participant's action isn't exactly based on the percentage of people giving the wrong answer. It sounds like having even one other person give the right answer was enough to make participants give the right answer, almost regardless of how many people gave the wrong answer. Nevertheless, it illustrates the point that other people's behavior really does influence most people's behavior to a large degree, even in pretty unexpected settings.)

Social behavior curves, equilibria, and radicalism

Yeah -- to clarify, in the last section I meant "select how radical you'll be for that issue at random." In the previous section I used "radical" to refer to a kind of person (observing that some people do have a more radical disposition than others), but yeah, I agree that there's nothing wrong with choosing your level of radicalism independently for different issues!

And yeah, there are many ways this model is incomplete. Status quo bias is one. Another is that some decisions have more than two outcomes. A third is that really this should be modeled as a network, where people are influenced by their neighbors (and I'm assuming that the network is a giant complete graph). A simple answer to your question might be "draw a separate curve for 'keep camera on if default state is on' and 'turn camera on if default state is off'", but there's more to say here for sure.
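To make that concrete, here's a minimal sketch of finding the equilibria of a single behavior curve, i.e. the points where the fraction of people doing the thing reproduces itself. Nothing here is from the post: the S-shaped curve, the grid search, and the stability test are all just illustrative choices.

```python
import numpy as np

def equilibria(curve, grid=10_000):
    """Find approximate equilibria x = curve(x), where curve(x) is the fraction
    of people who would do the thing if they saw a fraction x of everyone else
    doing it."""
    xs = np.linspace(0.0, 1.0, grid)
    diffs = curve(xs) - xs
    crossings = np.where(np.diff(np.sign(diffs)) != 0)[0]  # where the curve crosses the diagonal
    fixed = [round(float(xs[i]), 3) for i in crossings]
    # An equilibrium is stable when the curve crosses the diagonal from above to below.
    stable = [round(float(xs[i]), 3) for i in crossings if diffs[i] > 0 > diffs[i + 1]]
    return fixed, stable

# Hypothetical S-shaped curve: almost nobody acts alone, almost everyone joins in
# once most others have. One could draw separate curves for "keep your camera on
# if it starts on" vs. "turn it on if it starts off".
curve = lambda x: 1.0 / (1.0 + np.exp(-12.0 * (x - 0.5)))
print(equilibria(curve))  # equilibria near 0, at 0.5, and near 1; the middle one is unstable
```

A network version would replace the single global fraction x with the fraction among each person's neighbors; the sketch above corresponds to the complete-graph case.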

An elegant proof of Laplace’s rule of succession

I'm not conditioning on any configuration of points. I agree it's false for a given configuration of points, but that's not relevant here. Instead, I'm saying: number the intervals clockwise from 1 to n + 2, starting with the interval clockwise of Z. Since the n + 2 points were chosen uniformly at random, the interval numbered k1 is just as likely to have the new point as the interval numbered k2, for any k1 and k2. This is a probability over the entire space of outcomes, not for any fixed configuration.

(Or, as you put it, the average probability over all configurations of the probability of landing in a given interval is the same for all intervals. But that's needlessly complicated.)
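As a sanity check on that symmetry claim (this is only an illustrative simulation, not part of the proof), one can sample the whole process and confirm that, averaging over configurations, the new point is equally likely to land in each numbered interval:

```python
import random
from collections import Counter

def interval_of_new_point(n, rng=random):
    """Drop n + 2 points uniformly on a circle (parametrized as [0, 1) with
    wraparound), take the first of them to be Z, number the n + 2 arcs they
    create clockwise starting with the arc just clockwise of Z, then drop one
    more uniform point and return the number of the arc it lands in."""
    points = [rng.random() for _ in range(n + 2)]
    z = points[0]
    boundaries = sorted((p - z) % 1.0 for p in points[1:])  # clockwise distances from Z
    new_point = (rng.random() - z) % 1.0
    for k, b in enumerate(boundaries, start=1):
        if new_point < b:
            return k
    return n + 2  # past the last boundary, i.e. the arc ending back at Z

n, trials = 5, 200_000
counts = Counter(interval_of_new_point(n) for _ in range(trials))
for k in range(1, n + 3):
    print(f"interval {k}: {counts[k] / trials:.4f}   (1/(n+2) = {1 / (n + 2):.4f})")
```

Each interval's empirical frequency comes out near 1/(n+2), even though in any single configuration the intervals have very different lengths.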

Pseudorandomness contest: prizes, results, and analysis

For what it's worth, the top three finishers were three of the four most calibrated contestants! With this many strings, I think being intentionally overconfident is a bad strategy. (I agree it would make sense if there were more like 10 or 20 strings.)
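Here's a rough illustration of that intuition. Everything in it is hypothetical rather than taken from the contest: it assumes a log scoring rule, made-up "true beliefs" shared by both contestants, and overconfidence modeled as exaggerating log-odds. Under those assumptions, an intentionally overconfident contestant has a real chance of beating an otherwise identical calibrated one when there are only ~15 strings, but that chance shrinks quickly as the number of strings grows, because the expected-score penalty accumulates linearly while the helpful variance only grows like the square root.

```python
import numpy as np

rng = np.random.default_rng(0)

def overconfident_win_rate(n_strings, n_sims=20_000, exaggeration=2.0):
    """Estimate how often an intentionally overconfident contestant beats an
    otherwise identical calibrated contestant on total log score."""
    wins = 0
    for _ in range(n_sims):
        p = rng.uniform(0.55, 0.95, size=n_strings)  # hypothetical true chance each call is right
        outcomes = rng.random(n_strings) < p         # what actually happens
        odds = np.log(p / (1 - p)) * exaggeration    # push every probability toward 0 or 1
        q = 1.0 / (1.0 + np.exp(-odds))
        score = lambda r: np.sum(np.where(outcomes, np.log(r), np.log(1 - r)))
        wins += score(q) > score(p)
    return wins / n_sims

for n in (15, 150):
    print(f"{n} strings: overconfident wins {overconfident_win_rate(n):.1%} of the time")
```

In a large field of contestants, a high-variance strategy with a real chance of jumping ahead can be worth the hit to expected score, which is why the 10-or-20-string case looks different.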
