Evan_Gaensbauer

Comments

Zoe Curzi's Experience with Leverage Research

I previously wasn't aware that this is a pattern, and that so many people have experienced responses like this to criticism of Geoff and Leverage in the past.

Zoe Curzi's Experience with Leverage Research

Yeah, at this point, everyone coming together to sort this out, building a virtuous spiral that makes speaking up feel safe enough that it doesn't even need to be a courageous thing to do, is what I was getting at, and I think your comment represents that kind of thing too.

Zoe Curzi's Experience with Leverage Research

For what it's worth, my opinion is that you sharing your perspective is the opposite of making a mistake.

Zoe Curzi's Experience with Leverage Research

In the past, I've found it difficult and costly to talk about Leverage and the dynamics around it, or about organizations that are or have been affiliated with effective altruism, though when I have spoken up I've done more than most. I would have done it more, but the costs were that some of my friends in effective altruism interacted with me less, seemed to take me less seriously in general, and discouraged me from speaking up again, sometimes with what amounted to nothing more than peer pressure.

That was a few years ago. For lots of reasons, speaking up is now easier, less costly, and less risky for me, and I feel less fear about it. I don't know yet what I'll say regarding any or all of this related to Leverage, because I don't have any sense of how I might be prompted or provoked to respond. Yet I expect I'll have more to say, though I don't yet have any particular feelings about what I might share as relevant. I'm sensitive to how my statements might impact others, but for myself personally I feel almost indifferent.

Zoe Curzi's Experience with Leverage Research

Those asking others to come forward with facts in the interest of a long(er)-term common good could establish norms that serve as assurance, or insurance, that someone will be protected against potential retaliation against their reputation. I can't claim to know much about setting up effective norms for protecting whistleblowers, though.

Zoe Curzi's Experience with Leverage Research

I dipped my toe into openly commenting last week, and immediately received an email that made it more difficult to maintain anonymity - I was told "Geoff has previously speculated to me that you are 'throwaway', the author of the 2018 basic facts post".

Leverage Research hosted a virtual open house and AMA a couple of weeks ago for their relaunch as a new kind of organization, a relaunch that has been percolating for the last couple of years. I attended. One subject Geoff and I talked about was the debacle over the article in The New York Times (NYT) on Scott Alexander from several months ago. I expressed my opinion that:

  1. Scott Alexander could have managed his online presence much better than he did, on and off, for a number of years.
  2. Scott Alexander and the rationality community in general could have handled the situation much better than they did.
  3. Too few in the rationality community have been willing to face or acknowledge those parts of the affair, or to discuss what can be learned from the mistakes made.
  4. Nonetheless, NYT was the instigating party in whatever part of the situation constituted a conflict between NYT on one side and Scott Alexander and his supporters on the other, and NYT is the party that should be held more accountable, and is more blameworthy, if anyone wants to make it about blame.

Geoff nodded, mostly in agreement, and shared his own perspective on the matter, which I won't share. Yet if Geoff considers NYT to have done one or more things wrong in that case, including its intent to publish Scott Alexander's real name, it would seem inconsistent for him not to apply the same standard to de-anonymizing pseudonymous critics of Leverage.

You yourself, Ryan, never made the mistake of posting your comments online in a way that might make it easier for someone else to de-anonymize you. If you made any mistake, it's that you didn't anticipate how adeptly Geoff would apparently infer or discern your identity. I expect it wouldn't have been so hard for Geoff to figure out it was you, because you shared information about internal activities at Leverage Research that only a small number of people would have had access to.

Yet that's not something you should have had to anticipate. A presumption of good faith in a community or organization entails a common assumption that nobody would do that to their peers. Whatever Geoff himself has been thinking about you as the author of those posts, he understands perfectly well that to de-anonymize you, or anyone else, would be considered a serious violation of a commonly respected norm.

Based on how you wrote your comment, it seems the email you received may have come across as intimidating. Obviously I don't expect you to disclose anything else about it, and I would respect and understand if you don't, but it seems the email may have been meant as a well-intentioned warning. If so, there is also a chance Geoff discerned that you were the account-holder for 'throwaway' (at least at the time of the posts in question) but hasn't considered de-anonymizing you in anything more than a private setting. Either way, though, Geoff has begun responding in a way that, were he to act on it further, would only become more disrespectful to you, your privacy and your anonymity.

Of course, in case it's not already obvious, I'm not someone who has an impersonal relationship with Leverage Research as an organization either. I'm writing this comment anticipating that Geoff may read it himself and may not be comfortable with what I've disclosed above. Yet what I've shared was not from a particularly private conversation; it was from an AMA Leverage Research hosted that was open to the public. I've also explained above that I could have disclosed more in this comment, like what Geoff himself personally said, but I haven't. I mention that to show that I'm trying to come at this in good faith toward Geoff as well.

During the Leverage AMA, I also asked a question that Geoff called the kind of 'hard-hitting journalistic' question he wanted more people to have asked. If that's something he respected during the AMA, I expect he would be willing to accept this comment being public as well.

Trust and The Small World Fallacy

Regarding the example given of problems related to pseudoscientific quacks and cranks: at this point it seems obvious we need to take for granted that there will be causal factors that, absent effective interventions, will induce large sections of society to embrace pseudoscientific conspiracy theories. In other words, we should assume that if there is another pandemic in a decade or two, there will be more conspiracy theories.

At that point in time, people will be wary of science again, because they'll recall the conspiracy theories they believed during the pandemic of 2019-2022 and how their misgivings about the science back then were never resolved. Just as there is lingering skepticism of human-caused climate change now, in a world a couple of decades after QAnon, don't be shocked if there are conspiracy theories about how catastrophic natural disasters are caused by weather control carried out by the same governments that tried convincing everyone decades earlier that climate change was real.

At present, we live in a world where the state of conspiracy theories in society has evolved to the point that it's insufficient to think about them the way they were thought about even a decade ago. Conspiracy theories like the moon landing being faked, or even 9/11 being a false flag attack, don't seem to have the weight and staying power of conspiracy theories today. A decade from now, I expect COVID-19 conspiracy theories won't be the butt of jokes the way those other conspiracy theories are; those older theories didn't cause thousands of people's lives to be so needlessly shortened. I'm aware that in the last few years there has been increased investment in academia in researching the nature of conspiracy theories as a way to combat them.

It also doesn't help that we live in a time when some of the worst modern conspiracies, and otherwise clandestine activities by governments, are being confirmed. From early-pandemic lies about the scientific consensus on the effectiveness of masks to the blanket denial of any evidence that COVID-19 could have originated in a lab, there are examples from that one case alone. From declassified documents in recent years proving CIA conspiracies from decades ago to stories breaking every couple of years about the lengths governments have gone to in covering up their illegal and clandestine activities, it's becoming harder in general to blame anyone for believing conspiracy theories.

Given such a low rate of crankery among scientists, yet how that alone has proven sufficient to lend a veneer of scientific credibility to the most extreme COVID-19 conspiracy theories, it seems the main chokepoint won't be neutralizing the spread of the message at its original source, that small percentage of cranks among experts. (By 'neutralize' I don't mean anything like stopping their capacity to speak freely, but countering them with other free speech: a strategy of communication tactics deploying the most convincing knock-down arguments as soon as any given crank is on the brink of becoming popular.) It's also self-evident that it's insufficient to undo the spread of a conspiracy theory once it's hit critical mass.

Based on the few articles I've read on the research done on this subject in the last few years, the chokepoint in the conspiracy theory pipeline where focus would have the greatest impact may be neutralizing their viral spread as they first begin growing in popularity on social media. Again, as with the cranks at the beginning of that pipeline, stopping the spread of so many conspiracy theories at their points of origin may prove too difficult. The best bet may not be to eliminate them in the first place but to minimize how much they spread once that spread becomes apparent.

This entails anticipating different kinds of conspiracy theories before they happen, perhaps years in advance. In other words, for the most damaging kinds of conspiracy theories one can most easily imagine taking root among the populace in the years to come, the time to begin mitigating the impact they will have is now. 

Regarding the potential of prediction markets to combat this kind of problem, we could suggest that the prediction markets already related to the rationality community in some way begin facilitating predictions of future conspiracy theories, and new developments in existing ones, starting now.

Trust and The Small World Fallacy

The fact that many scientists are awful communicators who are lousy at telling stories is not a point against them. It means that they were more interested in figuring out the truth than in figuring out how to win popularity contests.

This implies to me that there is a market for science communicators whose careers specialize in winning popularity contests, but who do so to spread the message of scientific consensus in a way optimized to combat the most dangerous pseudoscience and misinformation/disinformation. It seemed like the Skeptics movement was trying to do the spreading-the-consensus part, if not the part about winning popularity contests, at some point over a decade ago, but it's been sidetracked by lots of other things since.

For some science communicators to go about their craft in a way meant to win popularity contests may raise red flags about how it could backfire, and those are potential problems worth thinking about. Yet I expect the case for doing so, in terms of cost-benefit analysis, is sufficient to justify considering the option.

Trust and The Small World Fallacy

First, don't trust any source that consistently sides with one political party or one political ideology, because Politics is the Mind Killer.

One challenge with this is that it's become harder to tell what the ideology in question is. If anti-vaxxers are pulled from among the populations of wingnuts on both the left and the right, I'm inclined to take lots of people whose views consistently side with one political party much more seriously, not only on vaccines but on many other issues as well.

Trust and The Small World Fallacy

It's quantitatively difficult to meet one million people, e.g., in terms of the amount of time it takes to accomplish that feat, and how qualitatively hard it is makes it seem almost undoable. To me, though, it's more imaginable: I've worked in customer service and sales jobs in multiple industries.

I never kept close enough count to know whether I ever met one hundred people in one day, but it could easily have been several dozen people every day. I wouldn't be surprised if someone working the till at a McDonald's in Manhattan met over one hundred people on some days. Most people won't work a job like that for 27 years straight, but enough do. I expect I could recall hundreds of people I interacted with only once, though it would take a lot of effort to track all of that, and they would still be a minority.
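As a rough sketch of the arithmetic behind that 27-year figure (the rate of one hundred new people per day is my assumption for illustration, not a measured number):

```python
# Back-of-envelope arithmetic: how long would it take to meet one
# million people at an assumed rate of 100 new people per day?

people_per_day = 100          # assumed new people met per working day
days_per_year = 365           # most generous case: no days off

people_per_year = people_per_day * days_per_year    # 36,500
years_to_a_million = 1_000_000 / people_per_year    # ~27.4

print(f"{people_per_year:,} people/year -> {years_to_a_million:.1f} years")
```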

Nonetheless, I thought it notable that meeting one million people in one's own lifetime is common enough that it wouldn't surprise me if at least a few million people in the United States have met over one million other individuals.
