I was recently thinking about the possibility that someone with a lot of influence might at some point try to damage LessWrong and the SIAI, and about what preemptive measures one could take to counter such an attack.
If you believe that the SIAI does the most important work in the universe, and that LessWrong serves the purpose of educating people to become more rational and subsequently understand the importance of mitigating risks from AI, then you should care about public relations. You should try to communicate your honesty and well-intentioned motives as effectively as possible.
Public relations matter because a good reputation is necessary for the following:
- Making people read the Sequences.
- Raising money for the SIAI.
- Convincing people to take risks from AI seriously.
- Allowing the SIAI to influence other AGI researchers.
- Mitigating future opposition by politicians and other interest groups.
- Presenting no easy target for criticism.
An attack scenario
First one has to identify characteristics that could potentially be used to cast a damaging light on this community. Here the most obvious possibility seems to be to portray the SIAI, together with LessWrong, as a cult.
After some superficial examination, an outsider might conclude the following about this community:
- Believing in heaven and hell in the form of a positive or negative Singularity.
- Discouraging skepticism while portraying their own standpoint as clear-cut.
- Encouraging members to take ideas seriously.
- Encouraging and signaling strong cooperation and conformity.
- Evangelizing by scaring people and telling them to donate money.
- Exerting social pressure by employing a reputation system with positive and negative incentives.
- Removing themselves from empirical criticism by framing everything as a prediction.
- Discrediting mainstream experts while placing themselves a level above them.
- Discouraging transparency and openness by referring to the dangers of AI research.
- Using scope insensitivity and high stakes to justify action, outweigh low probabilities, and disregard opposing evidence.
Most of this might sound wrong to the well-read LessWrong reader. But how would those points be received by mediocre rationalists who don't know what you know, especially if eloquently summarized by a famous and respected person?
How one might counter such conclusions:
- Create an introductory guide to LessWrong.
- Explain why the context of the Sequences is important.
- Explain why LessWrong differs from mainstream skepticism.
- Enable and encourage outsiders to challenge and question the community before turning against it.
- Discourage the downvoting of people who have not yet read the Sequences.
- Don't expect people to read hundreds of posts without supporting evidence that it is worth it.
- Avoid jargon when talking to outsiders.
- Detach LessWrong from the SIAI by creating an additional platform to talk about related issues.
- Ask or pay independent experts to peer-review the SIAI's work.
- Make the finances of the SIAI easily accessible.
- Openly explain why and for what the SIAI currently needs more money.
So what do you think needs improvement and what would you do about it?