DirectedEvolution

Pandemic Prediction Checklist: H5N1

Pandemic Prediction Checklist: Monkeypox

 

Correlation may imply some sort of causal link.

For guessing its direction, simple models help you think.

Controlled experiments, if they are well beyond the brink

Of .05 significance, will make your unknowns shrink.

 

Replications show there's something new under the sun.

Did one cause the other? Did the other cause the one?

Are they both controlled by what has already begun?

Or was it their coincidence that caused it to be done?

Shortform

AllAmericanBreakfast's Shortform · 5 points · 5y · 392 comments

Comments
But Have They Engaged With The Arguments? [Linkpost]
DirectedEvolution · 3d

The majority of those who best know the arguments for and against thinking that a given social movement is the world's most important cause... are presumably members of that social movement.

Knowing the arguments for and against X being the World's Most Important Cause (WMIC) is fully compatible with concluding X is not the WMIC, even a priori. And deeply engaging with arguments about any X being the WMIC is an unusual activity, characteristic of Effective Altruism. If you do that activity a lot, then it's likely you know the arguments for and against many causes, making it unlikely you're a member of all causes for which you know the arguments for and against.

If they decide to hear out a first round of arguments but don't find them compelling enough, they drop out of the process.

The simple hurdle model presented by OP implies that there is tremendous leverage in coming up with just one more true argument against a flawed position: presented with it, a substantial fraction of the few remaining true believers in the flawed position will accept it and change their minds. My perception is that this is not at all what we typically assume when arguing with a true believer in some minority position; we expect that they are especially resistant to changing their mind.

I think a commonsense point of view is that true believers in flawed positions got there under the influence of systematic biases that dramatically increased the likelihood that they would adopt a flawed view. Beliefs in a range of conspiracy theories and pseudoscientific views appear to be correlated both within social groups and within individuals, which supports the hypothesis that systematic biases account for the existence of minority groups holding a common flawed belief. Possibly, their numbers are increased by a few unlucky reasoners who are relatively unbiased but made a series of unfortunate reasoning mistakes, and who will hopefully see the light when presented with the next accurate argument.
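To make the contrast concrete, here is a toy sketch in Python. Both models and all numbers are invented for illustration; they are not taken from the OP:

```python
# Toy comparison of two models of who still believes a flawed position
# after k rounds of counterarguments. All numbers are illustrative.

def hurdle_model(population: float, p_convert: float, rounds: int) -> list[float]:
    """Hurdle-style model: every argument converts the same fraction of
    whoever is left, so argument k+1 is as effective as argument 1."""
    remaining = [population]
    for _ in range(rounds):
        remaining.append(remaining[-1] * (1 - p_convert))
    return remaining

def biased_model(population: float, p_convert: float, decay: float, rounds: int) -> list[float]:
    """Biased-believer model: each round filters out the persuadable, so
    the conversion rate shrinks by `decay` each round and the survivors
    are increasingly the systematically biased."""
    remaining = [population]
    p = p_convert
    for _ in range(rounds):
        remaining.append(remaining[-1] * (1 - p))
        p *= decay  # holdouts are harder to persuade than those who already left
    return remaining

if __name__ == "__main__":
    h = hurdle_model(1000, p_convert=0.5, rounds=6)
    b = biased_model(1000, p_convert=0.5, decay=0.5, rounds=6)
    for k, (x, y) in enumerate(zip(h, b)):
        print(f"after {k} arguments: hurdle={x:7.1f}  biased={y:7.1f}")
```

Under the hurdle model a seventh argument still converts half of the holdouts; under the biased-believer model it converts almost none of them, which matches the intuition that the last true believers are the hardest to move.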

Futility Illusions
DirectedEvolution · 13d

This easily leads to the impression that “retention is bad everywhere”, because all people hear from other group organizers are complaints about low retention. But this not only involves some reporting bias – groups with better retention rates usually just don’t talk about it much, as it’s not a problem for them.

The implied narrative is that we don't hear about successful groups, which is obviously false. Alternative model: most groups, products, etc. just don't have much demand or face too much competition. Group founders don't want to achieve just "growth"; they want a very specific kind of growth that fits their vision for the group they set out to found. What makes you think there's typically a way to keep the failing group the same on the important traits while improving retention? And if such strategies exist in theory, why do you think that any given group founder should expect they can put them into practice?

Futility Illusions
DirectedEvolution · 13d

This can particularly make sense in cases where we have already invested a lot of effort into something. But if we haven’t – as is the case to varying degrees in these examples – then it would, typically, be really surprising if we just ended up close to the optimum by default.

Who is "we?" You, personally? All society? Your ancestral lineage going back to LUCA? Selection effects, cultural transmission of knowledge, and instinct all provide ways activities can be optimized without conscious personal effort. In many domains, assuming approximate optimality by default should absolutely be your baseline assumption. And then there's the metalevel to consider, on which your default assumptions about approximate optimality for any domain you might consider are also optimized by default. Perhaps your prior should be that your optimality assumptions are roughly optimal, then reason from that starting point! If not, why not?

Futility Illusions
DirectedEvolution · 13d

Immutability: Either, the given property is truly entirely fixed, and cannot be changed at all.

There's a big difference between "cannot be changed at all" and "the distribution is fixed, but with day-to-day variation."

Banning Said Achmiz (and broader thoughts on moderation)
DirectedEvolution · 14d

What's your motivation to spend a lot of effort to write up your arguments? If you're right, both the post and your efforts to debunk it are quickly forgotten, but if you're wrong, then the post remains standing/popular/upvoted and your embarrassing comment is left for everyone to see

If you didn't have the motivation to write your arguments, why did you waste your time reading the post? If you debunk the author's post, they're unlikely to forget it. If you debunk numerous posts, then you may acquire a reputation. If you debunk a popular post, then many people see the debunking. You've also spared yourself the labor of debunking future posts based on the initial flawed idea. The reward for delivering valid and empathetic criticism is cultivating a community of truth seekers in which you and others may be willing and able to participate. Do you lack that vision? Do you have that outlet elsewhere? Do you not care about developing community? Do you simply have better things to do and want to freeload on the community that others build?

Writing up a quick "clarifying" question makes more sense from a status/strategic perspective, but I rarely do even that nowadays because I have so little to gain from it, and a lot to lose including my time (including expected time to handle any back and forth)

You took the time to read the post, but you won't write a "quick" clarifying question because you're worried about wasting your time, and you think you have little to gain by understanding the content, so you're depending on commenters like Said to do the job? If you have the time for just the initial question but not the back and forth, write the first question and read the response. It takes little more time to put a brief friendly signal at the top of a comment than to leave it out. One may also practice writing in a non-contemptuous manner until it comes naturally, and learn to skim posts, reading only those clearly likely to be worth responding to. It is possible to deliver low-effort criticism without being a flagrant asshole about it.

If you get rid of people like Said or otherwise discourage low-effort criticism, you'll just get less criticism not better criticism.

How do you know? Have you gathered data on this topic? Have you moderated a community? Have you observed the course of a substantial number of comparable moderation decisions in the past? What exactly is your model of the overall community reaction to such moderation decisions that leads you to this conclusion?

Low-effort and even "unproductive" criticism is an important signal

A signal of what? Important to whom? Are you really interested in what a low-effort troll would have to say in response to what you happen to write and post online?

For example I think any posts by Eliezer will always attract plenty of criticisms due to the status rewards available if someone pointed out a real flaw.

If posts worth criticizing, due to their intellectual quality and the community's interest in them, will receive their due criticism, then why can't weak and uninteresting posts be ignored, or engaged with by a charitable volunteer the way a teacher might respond to a student in order to develop their capabilities? Targeting weak and forgettable posts for unwarranted criticism increases their prominence in a quite mechanistic fashion: high-variance upvotes, the intrigue of seeing why a comment was strongly downvoted, the fact that the LessWrong homepage boosts new and highly upvoted comments, and the author feeling attacked and responding in an endless comment chain. There are selection effects on who stays in the community under these conditions. Solve for the equilibrium; a toy sketch of the mechanism follows.
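Here is a minimal sketch of that mechanism in Python. The scoring formula is invented for illustration and is not LessWrong's actual ranking algorithm; it only assumes that visibility rewards recent activity and total engagement:

```python
# Toy illustration: if a ranking rule rewards recent activity and absolute
# engagement, a downvote-heavy criticism thread can still boost a weak
# post's visibility. The formula below is hypothetical, not LessWrong's.
import math

def rank_score(karma: int, n_comments: int, hours_since_activity: float) -> float:
    """Hypothetical visibility score: comments count in absolute terms,
    and any new comment resets the recency clock."""
    engagement = karma + 2 * n_comments             # critical comments still count
    recency = math.exp(-hours_since_activity / 24)  # fresh activity keeps it visible
    return max(engagement, 1) * recency

# A weak post that everyone ignores for two days...
ignored = rank_score(karma=3, n_comments=0, hours_since_activity=48)

# ...versus the same post after a strongly downvoted criticism spawns a
# fifteen-comment back-and-forth minutes ago.
criticized = rank_score(karma=3, n_comments=15, hours_since_activity=0.1)

print(f"ignored:    {ignored:6.2f}")    # low score, post fades
print(f"criticized: {criticized:6.2f}") # much higher: criticism bought prominence
```

Under any rule with this qualitative shape, low-effort criticism of weak posts mechanically buys them prominence, which is the equilibrium to solve for.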

Church Planting: When Venture Capital Finds Jesus
DirectedEvolution · 19d
Generalized Coming Out Of The Closet
DirectedEvolution · 23d

This was an interesting read. Thank you for sharing your information. I personally wouldn't have been worried about why you have all this information, as it simply reads to me like an essay by somebody who's done research on an important topic.

The way your comment is written, the underlying narrative is that the only risk to consider with coming out online is turning a reader into a stalker due to that post in isolation, and that this risk is insubstantial. I think your argument is plausible for that specific risk. However, I am considering a wider variety of risks, including:

  • A person with a larger body of online writings who comes out.
  • A person who publishes their post to an audience or on a platform that's more likely to generate unwanted attention.
  • A person who is being investigated and/or stalked online as a result of real-world activities, such as applying for a job or pursuing political office.
  • A person for whom coming out will create friction in pursuing real-world activities in the future that outweighs the benefits they gain from coming out online.
  • A potential sense of paranoia about having disclosed potentially embarrassing information online.
  • The reputational and perception risks to a controversial community when its high-status members advocate that its low-status members post embarrassing information online.

The LessWrong and rationalist community already has a controversial reputation and is often accused of being a cult. It is an online platform, which is capable of spawning hate-readers like r/sneerclub. It is also a real-world community with a notorious history of deeply exploitative behavior and what seems to be a higher-than-average fraction of self-identified participants or ex-participants with deep mental health issues. It has a history of scandal and has generated a substantial amount of negative media coverage, considering its small size.

Considering all of this, I believe that the LessWrong and rationalist community is not well-positioned to reduce the risks of coming out beyond the normal level available in the wider culture. In other words, if there's an X that you don't feel safe to come out about, then I don't think LessWrong/rationalism in its current form is capable of helping you feel more safe about coming out about X. This is a heuristic, not a general rule, and if other people do feel LessWrong/rationalism helps them come out about their personal characteristics in a way other communities don't, I'm interested to hear it. But for this reason, I think that high-status LessWrong members should not be encouraging others to come out in public about more things with little regard for risks. That seems irresponsible and likely to result in damage both to members and to the community as a whole.

I do think that it would be beneficial if LessWrong/rationalism worked to think through this problem and become the sort of community that is capable of effectively supporting its members in "coming out" in a way that improved the community, its relations with the rest of the world, and the health and wellbeing of its members. Basically, I like the vision of "generalized coming out," but I don't like the strategy John proposes in his OP for getting there for LessWrong/rationalism.

Generalized Coming Out Of The Closet
DirectedEvolution · 23d

My core belief on this topic is that coming out is, in fact, a risky practice in America and worldwide. It's risky to come out about your kinks, your sexual orientation or gender identity, your political beliefs, and your historical affiliations with groups or types of groups that have controversial reputations.

Coming out can be net beneficial under controlled circumstances. Generally, it is better to have a world in which people have the ability to achieve those benefits. That starts by being aware of and working to mitigate those risks. The queer community is an excellent example of a group of people who’ve done that and reaped the rewards.

My central problem with your OP and responses here is that you seem to be rejecting the need for consideration or mitigation of those risks. This flies in the face of the historical experience of queer people, apostates, atheists, political radicals, and other groups who've come out in ways that failed to control those risks and suffered for it. By encouraging people to just come out without considering or taking steps to mitigate risks, you encourage them to make themselves vulnerable in ways that may leave them more dependent on the rationalist community, the very place where you're seeking to enact this attitude toward coming out.

In my view, there is an enormous volume of historical experience from a wide variety of groups that backs up the profound risks of coming out. These risks include ostracism, exclusion from job opportunities, public humiliation, and physical violence. Again, those risks can be mitigated, and the rewards for doing so are great. But flat out denying those risks strikes me as foolish when it's done by an individual, and cult-inducing when it becomes a community norm.

Generalized Coming Out Of The Closet
DirectedEvolution · 24d

Dude, that doesn't make any sense. People making their secrets public is the opposite of what blackmailers want.

This logic is so bad that I think it should be a flag to you that you are clearly motivatedly reasoning real hard here.

I want to flag that you are unilaterally escalating this conversation rhetorically, and making accusations about my psychological state that I really do not appreciate. Let's keep it civil and focused on the object-level topic.

Literally, on a surface level, you are correct that publicly posting dirty secrets online pre-empts some blackmail threats, although it still exposes the poster to the risk that those posts will be distorted or amplified in ways they didn't intend.

Functionally, my argument that the outcomes of blackmail and of publicly posting secrets can be very similar is plausible and logically sound. Both practices can result in an individual being trapped in the group/cult by the perception that their action (handing over blackmail material or posting embarrassing secrets online) has now cut off, or created friction along, pathways to careers and relationships outside the community.

Furthermore, if adopting the practice of encouraging members to post dirty secrets about themselves online gives the group an even stronger perception of being a cult than it already has, then that perception will further taint its reputation and isolate its members. These sorts of ideas can gradually convert a formerly healthy community into a cult.

Generalized Coming Out Of The Closet
DirectedEvolution · 24d

You are not considering how the manner in which you gain fame constrains the options for your ambition. If your personal sexual details are online, that may constrain your options for political office, executive and academic leadership, for example.

Being hated is rarely helpful for ambition. It's a consequence of pursuing ambition: making some people unhappy to make others happy. If you do stuff that garners ambition-undermining hate without giving you a greater base of ambition-serving support, then it's not helpful for this goal. For the vast majority of people and ambitions, I continue to see no case that posting sexual kinks and other taboo information online will advance those ambitions.

Posts

Elaborative reading · 19 points · 9d · 0 comments
A brief perspective from an IMO coordinator · 36 points · 1mo · 7 comments
Estimating the benefits of a new flu drug (BXM) · 41 points · 8mo · 2 comments
Contra Contra the Social Model of Disability · 21 points · 2y · 22 comments
Compression of morbidity · 12 points · 2y · 0 comments
Aging and the geroscience hypothesis · 56 points · 2y · 14 comments
Popularizing vibes vs. models · 19 points · 2y · 0 comments
Commentless downvoting is not a good way to fight infohazards · 6 points · 2y · 9 comments
Request for feedback - infohazards in testing LLMs for causal reasoning? [Question] · 16 points · 2y · 0 comments
Is the 10% Giving What We Can Pledge Core to EA's Reputation? · 10 points · 2y · 1 comment