DPiepgrass

Worried that typical commenters at LW care way less than I expected about good epistemic practice. Hoping I'm wrong.

Software developer and EA with interests including programming language design, international auxiliary languages, rationalism, climate science and the psychology of its denial.

Looking for someone similar to myself to be my new best friend:

❖ Close friendship, preferably sharing a house
❖ Rationalist-appreciating epistemology; a love of accuracy and precision to the extent it is useful or important (but not excessively pedantic)
❖ Geeky, curious, and interested in improving the world
❖ Liberal/humanist values, such as a dislike of extreme inequality based on minor or irrelevant differences in starting points, and a like for ideas that may lead to solving such inequality. (OTOH, minor inequalities are certainly necessary and acceptable, and a high floor is clearly better than a low ceiling: an "equality" in which all are impoverished would be very bad)
❖ A love of freedom
❖ Utilitarian/consequentialist-leaning; preferably negative utilitarian
❖ High openness to experience: tolerance of ambiguity, low dogmatism, unconventionality, and again, intellectual curiosity
❖ I'm a nudist and would like someone who can participate at least sometimes
❖ Agnostic, atheist, or at least feeling doubts

Comments

Speaking for myself: I don't prefer to be alone or tend to hide information about myself. Quite the opposite; I like to have company but rare is the company that likes to have me, and I like sharing, though it's rare that someone cares to hear it. It's true that I "try to be independent" and "form my own opinions", but I think that part of your paragraph is easy to overlook because it doesn't sound like what the word "avoidant" ought to mean. (And my philosophy is that people with good epistemics tend to reach similar conclusions, so our independence doesn't necessarily imply a tendency to end up alone in our own school of thought, let alone prefer it that way.)

Now if I were in Scott's position? I find social media enemies terrifying and would want to hide as much as possible from them. And Scott's desire for his name not to be broadcast? He's explained it as related to his profession, and I don't see why I should disbelieve that. Yet Scott also schedules regular meetups where strangers can come, which doesn't sound "avoidant". More broadly, labeling famous-ish people who talk frequently online as "avoidant" doesn't sound right.

Also, "schizoid" as in schizophrenia? By reputation, rationalists are more likely to be autistic, which tends not to co-occur with schizophrenia, and the ACX survey is correlated with this reputation. (Could say more but I think this suffices.)

Scott tried hard to avoid getting into the race/IQ controversy. Like, in the private email LGS shared, Scott states "I will appreciate if you NEVER TELL ANYONE I SAID THIS". Isn't this the opposite of "it's self-evidently good for the truth to be known"? And yes, there's an SSC/ACX community too (not necessarily "rationalist"), but Metz wasn't talking about the community there.

My opinion as a rationalist is that I'd like the whole race/IQ issue to f**k off so we don't have to talk or think about it. But certain people like to misrepresent Scott and make unreasonable claims, which ticks me off, so I counterargue, just as I once pushed a video by Shaun when I thought somebody on ACX sounded a bit racist on the race/IQ topic.

Scott and I are consequentialists. As such, it's not self-evidently good for the truth to be known. I think some taboos should be broached, but not "self-evidently" and often not by us. But if people start making BS arguments against people I like? I will call BS on that, even if doing so involves some discussion of the taboo topic. But I didn't wake up this morning having any interest in doing that.

Huh? Who defines racism as cognitive bias? I've never seen that before, so expecting Scott in particular to define it as such seems like special pleading.

What would your definition be, and why would it be better?

Scott endorses this definition:

Definition By Motives: An irrational feeling of hatred toward some race that causes someone to want to hurt or discriminate against them.

Setting aside that it says "irrational feeling" instead of "cognitive bias", how does this "tr[y] to define racism out of existence"?

I think about it differently. When Scott does not support an idea, but discusses or allows discussion of it, it's not "making space for ideas" as much as "making space for reasonable people who have ideas, even when they are wrong". And I think making space for people to be wrong sometimes is good, important and necessary. According to his official (but confusing IMO) rules, saying untrue things is a strike against you, but insufficient for a ban.

Also, strong upvote because I can't imagine why this question should score negatively.

Scott had every opportunity to say "actually, I disagree with Murray about..." but he didn't, because he agrees with Murray

[citation needed] for those last four words. In the paragraph before the one frankybegs quoted, Scott said:

Some people wrote me to complain that I handled this in a cowardly way - I showed that the specific thing the journalist quoted wasn’t a reference to The Bell Curve, but I never answered the broader question of what I thought of the book. They demanded I come out and give my opinion openly. Well, the most direct answer is that I've never read it.

Having never read The Bell Curve, it would be uncharacteristic of him to say "I disagree with Murray about [things in The Bell Curve]", don't you think?

Strong disagree, based on the "evidence" you posted for this elsewhere in this thread. Half of it is some dude on Twitter asserting that "Scott is a racist eugenics supporter" and retweeting other people's inflammatory rewordings of Scott; the other half is a private email from Scott saying things like

HBD is probably partially correct or at least very non-provably not-correct

It seems gratuitous for you to argue the point with such biased commentary. And what Scott actually says sounds like his judgement of ... I'm not quite sure what, since HBD is left without a definition, but it sounds a lot like the evidence he mentioned years later from 

(yes, I found the links I couldn't find earlier thanks to a quote by frankybegs from this post which―I was mistaken!―does mention Murray and The Bell Curve because he is responding to Cade Metz and other critics).

This sounds like his usual "learn to love scientific consensus" stance, but it appears you refuse to acknowledge a difference between Scott privately deferring to expert opinion, on one hand, and having "Charles Murray posters on his bedroom wall", on the other.

Almost the sum total of my knowledge of Murray's book comes from Shaun's rebuttal of it, which sounded quite reasonable to me. But Shaun argues that specific people are biased and incorrect, such as Richard Lynn and (duh) Charles Murray. Not only does Scott never cite these people, but what he said about The Bell Curve was "I never read it". And why should he? Murray isn't even a geneticist!

So it seems the secret evidence matches the public evidence: it does not show that "Scott thinks very highly of Murray", doesn't show that he ever did, doesn't show that he is "aligned" with Murray, etc. How can Scott be a Murray fanboy without even reading Murray?

You saw this before:

I can't find any expert surveys giving the expected result that they all agree this is dumb and definitely 100% environment and we can move on (I'd be very relieved if anybody could find those, or if they could explain why the ones I found were fake studies or fake experts or a biased sample, or explain how I'm misreading them or that they otherwise shouldn't be trusted. If you have thoughts on this, please send me an email). I've vacillated back and forth on how to think about this question so many times, and right now my personal probability estimate is "I am still freaking out about this, go away go away go away". And I understand I have at least two potentially irresolveable biases on this question: one, I'm a white person in a country with a long history of promoting white supremacy; and two, if I lean in favor then everyone will hate me, and use it as a bludgeon against anyone I have ever associated with, and I will die alone in a ditch and maybe deserve it.

You may just assume Scott is lying (or as you put it, "giving a maximally positive spin on his own beliefs"), but again I think you are conflating two different things. Supposing that experts in a field have expertise in that field isn't merely different from "aligning oneself" with a divisive conservative political scientist whose book one has never read ― it's really obviously different; how are you not getting this??

he definitely thinks this

He definitely thinks what, exactly?

Anyway, the situation is like this: X is writing a summary about author Y, who has written 100 books, but pretty much ignores all those books in favor of digging up some dirt on what Y thinks about a political topic Z that Y almost never discusses (and then, instead of actually mentioning any of that dirt, X says Y "aligned himself" with a famously controversial author on Z).

It's really weird to go HOW DARE YOU when someone says something you know is true about you, and I was always unnerved by this reaction from Scott's defenders

It's not true, though. Perhaps what he believes is similar to what Murray believes, but he did not "align himself" with Murray on race/IQ. Like, if an author in Alabama reads the scientific literature and quietly comes to the conclusion that humans cause global warming, it's wrong for the Alabama News to describe this as "author has a popular blog, and he has aligned himself with Al Gore and Greta Thunberg!" (which would tend to encourage Alabama folks to get out their pitchforks 😉). (Edit: to be clear, I've read SSC/ACX for years and the one and only time I saw Scott discuss race+IQ, he linked to two scientific papers, didn't mention Murray/Bell Curve, and I don't think it was the main focus of the post―which makes it hard to find it again.)

I agree, except for the last statement. I've found that talking about epistemic concepts to certain people with bad epistemology will, instead of teaching them the concepts, teach them a rhetorical trick that (soon afterward) they will try to use against you as a "gotcha" (related)... as a result of them having a soldier mindset and knowing you have a different political opinion.

While I expect most of them won't ever mimic rationalists well, (i) mimicry per se doesn't seem important, and (ii) I think there is a small fraction of people (though not Metz) who do end up developing a "rationalist skin" ― they talk like rationalists, but seem to be in it mostly for gotchas, snipes and sophistry.

I'm thinking it's not Metz's job to critique Scott, nor did his article admit to being a critique. But that's also a strawman: Metz didn't publish the name "in order to" critique Scott's ideas. He probably published it because he doesn't like the guy.

Why doesn't he like Scott? I wonder if Metz would've answered that question if asked. I doubt it: he wrote "[Alexander] aligned himself with Charles Murray, who proposed a link between race and I.Q." even though Scott did not align himself with Murray about race/IQ, nor is Murray a friend of his, nor does Alexander promote Murray, nor is race/IQ even 0.1% of what Scott/SSC/rationalism is about―yet Metz defends his misleading statement and won't acknowledge it's misleading. If he had defensible reasons to dislike Scott that he was willing to say out loud, why did he instead resort to tactics like that?

(Edit: I don't read/follow Metz at all, so I'll point to Gwern's comment for more insight)

there are certain realities about what happens when you talk about politics.

Says the guy who often wades into politics and often takes politically-charged stances on LW/EAF. You seem to be correct; it's just sad that the topic you are correct about is the LessWrong community.

Part of what the sequences are about is to care about reality and you prefer to be in denial of it

How charitable of you. I was misinformed: I thought rationalists were (generally) not mind-killed. And like any good rationalist, I've updated on this surprising new evidence. (I still think many are not, but navigating such diversity is very challenging.)

Then you are wrong.

Almost every interaction I've ever had with you has been unpleasant. I've had plenty of pleasant interactions, so I'm confident about which one of us this is a property of, and you can imagine how much I believe you. Besides which, it's implausible that you remember your thought processes in each of the hundred-ish comments you've made in the last year. For me to be wrong, you would have to have recalled the thought process that went into a one-sentence snipe, as in "oh yeah, I remember that comment, that's the one where I did think about what he was trying to communicate and how he could have done better, but I was busy that day and had to leave a one-sentence snipe instead."

Talking about it does not trigger people's tribal senses the same way as talking about contemporary political conflicts. 

Odd but true. Good point.

there are also plenty of things that happened in the 20th century that were driven by bad epistemics

No doubt, and there might even be many that are clear-cut and no longer political for most people. But there are no such events I am knowledgeable about.

You don't want people to focus on big consequences

Yes, I do. I want people to sense the big consequences, deeply and viscerally, in order to generate motivation. Still, a more academic reformulation may also be valuable.
