Needing Better PR

by beriukay · 3 min read · 18th Aug 2011 · 19 comments


Personal Blog

I've been having a bit of a back-and-forth with a friend about what appears to be a charisma problem with the SIAI, and was hoping you lovely folks had thoughts on the matter. My friend was going through the Eliezer Q&A videos, specifically Question #7, "What's your advice for Less Wrong readers who want to help save the human race?" He typed up a transcript for Eliezer's answer, and went on to say:

Now, I freely admit that he is talking extemporaneously.  That he
maybe is giving point-by-point details, interpreting the question as a
laundry-list request of job opportunities, and that criticizing with a
call for brevity is easy as a response rather than a first try, but

All that aside, here's how a, ahem, human, might respond quickly and
clearly to the question:

Doing what you like and are most efficient at (for money) is the best
way to get resources to us if you support our cause.  Make money at
those things and send it to us if you think we're worth it.


He went on to mention that he really likes Eliezer's writings, and that his issue rests with the verbal skills of SIAI's leadership, not with the quality of their works.

I replied:

On the one hand, it would be extremely beneficial for them to get some kind of propaganda minister. On the other, I think that would signal to nerds like us that they are corrupt money-whores. If that is the case, they are stuck being stilted nerds if they want to attract brains, and if they want money, they're stuck watering down their fan base with Dan Brown readers or something.

I also suggested a couple possible (though rather outlandish) ways to make an organization wildly popular. Specifically, to hire a marketing researcher like Frank Luntz to figure out what talking points would win the hearts and minds of the greatest number of people, or alternately to get major brand loyalties by having a cult figure like Steve Jobs representing the SIAI. Of course, I am stating this much more eloquently than I did in the email.

His reply deserves full posting here (with his permission, of course):

I disagree with your proposition that getting a competent marketing
firm involved would suddenly create a contradistinction with the
organization proper.

From everything I've seen/read, these people are nothing if not fully
aware of the compartmentalized world we live in.  That this enterprise
requires a particular something upon another something with these
other bits running in the background.  Hell, in a grossly simplistic
interpretation, accomplishing this nested complexity is the whole of
their aim.

What I would say, however, is that the idea that these people are even
aware of any sort of nerd brand loyalty is entirely off-base.  I don't
think these folks operate with that realization in mind.  I think if
you even brought up the concept, they'd look at you askew in the same
way as telling [mutual acquaintance] he was a geek.  "But...I'm...what?  I'm things."

No, I think the fact that they haven't invested in marketing may be
mainly due to money woes, but more likely revealing a fatal flaw in
their infrastructure, in that their intricate understanding of what
they need to do ultimately fails to absorb themselves in the mix.
Failing to see their own operation as needing the societal locomotive
powers to get the final job done.

If that fear is true, we're in an awful spot indeed.  Needing to be
rescued by people ignorant of how to rescue themselves.

Let's hope it's the money woes, then.  Or...hmm...maybe a vacuum to be
met by someone who believes in the cause and also possesses mild
wordcraft?  What fancy!

The question is now open. Does SIAI have a PR problem? If so, is it due to finances, lack of talent, or something else? Is there an Eternal September issue with watering down the brand (would you support the SIAI if they started investing heavily in advertising campaigns, or would you get a bit suspicious?)? Should they pay Frank Luntz to figure out what transhumanism terms work best with your average family? My friend and I are dying to know.



The Less Wrong Q&A is the most horribly designed... thing I've ever seen.

  • I don't understand why these are videos. They don't offer diagrams or anything useful to see. As a visual it's just an utterly boring face speaking to us. Eliezer's voice isn't particularly charismatic. So why can't he just type his responses, which would be both easier for him to edit and for us to browse through?

  • Even if the video format served some purpose, the fact that it's broken up into multiple videos means that I can't even have the sound playing in the background while I'm eating or browsing articles or doing something else -- since it stops every couple of minutes, and I then have to go back and click the next one. (The actual process is: go back to the page of videos, click the link indicating the question, read the question, go back to the page of videos, click the next video to start.) Repeat the above 30 times. It's broken up into 30 videos. Thirty! What the hell.

  • Worse yet, the questions aren't even above the videos. They are just linked to. And the links don't even go to a page where all the questions are grouped together; they go to a different page for each question. Which means that if you're looking for the video that answers a particular question, you need to click 30 links, check the video that interests you, then go through all the links again to find the next interesting one.

In short: screw that. I got bored after the third video or so, and quit the whole deal. What process of rationality determined all the above design choices? I'm asking that seriously. Whoever designed the thing, please justify these damned choices to us.

I believe that was more of an Eliezer AMA that used the words "Less Wrong" in the title, rather than an LW "FAQ" or something.

since it stops every couple minutes, and I then have to go back to click the next one

Here's a playlist of the videos*. Start with one and the rest will auto-play. It's been a while since I watched them, but I think he repeats or at least summarizes the question at the beginning of each answer.

* Except for question 5/30, the one that was too big for YouTube.

I plan on transcribing all those video answers soon (within the next few days).

[This comment is no longer endorsed by its author]

I'm going to go ahead and generalize from one example. While technically a 'success', since I am aware of Less Wrong, I would say I'm right on the border between the group of people who read LW and those who wouldn't. I share an interest in rationality (and somewhat less so in AI), but the rest of the LW 'community knowledge' is very foreign to me. I never really read or watched science fiction (beyond maybe Sliders and the X-Files) and am only barely aware that fan-fiction exists. I've never had trouble socially and rarely feel awkward. The jargon is, at times, distancing, but I think ultimately worth it. (The inherent trade-off between efficiency/precision and accessibility.) I imagine (with some prejudice) that LW is populated with people who refer to themselves as 'gamers' or who go LARPing or play Dungeons and Dragons (<- I think this is the first time I've ever typed the word 'Dungeon') or enjoy Monty Python. Things which, in my mind, are classified as 'nerd/geek/dork' things. (I've never quite understood the differences between these appellations.)

This, for me, is a very strange and alien world. Most (all?) groups have a canon of knowledge/shared culture that you pretty much need to know to fit in. W.r.t. the mathematics and statistics, I'm right on board. Even a lot of the science I am up to speed on (or can become so). But I don't know all the other stuff, and my only incentive to learn it is to navigate the LW world. I don't mean for any of this to be insulting, but I think that if LW is to move forward, we are probably going to need to attract people like me, who also might have these stereotypes and prejudices. I don't know what the solution might be, but that's my perspective.

I'm definitely in the cluster of people you're describing (though my sci-fi background is pretty lacking), but I haven't noticed any particular references to them in most posts. I hope I'm not just totally blind...

Where are you seeing the cultural references? Comments? Do you mean the transhumanisty stuff?

That's a good question. Off the top of my head, I can't point to any specific post or comment. I could go through and look for examples, but that feels like clever arguing.

Maybe a better approach could be to just continue reading and make a note of whenever I see an example?

Like most stereotypes, this is something that just sorta "feels" true. But it also "feels" like it comes out more in the comments than in the posts. This raises a couple of issues. First, is it actually true? Second, if it isn't true, why does it "feel" true? Third, should we and could we do anything about it?

I doubt that a site that expects to entertain readers with college-level math comprehension is ever going to ditch that image completely, but it should definitely be a goal.

Interesting. As an interested person for whom Less Wrong is a highly interesting, challenging, and entertaining site, but who isn't exactly an 'insider', a few points:

First: 'Let's hope it's the money woes, then. Or...hmm...maybe a vacuum to be met by someone who believes in the cause and also possesses mild wordcraft? What fancy!'

This is probably just lighthearted but it's worth noting because Less Wrong clearly does have that. Several people here write well, and Eliezer writes very engagingly indeed, if in a slightly unpolished way. So if there's a PR problem it's not lack of talent with words.

Second: there is a bit of a (conscious? proud?) nerd bias. However, I think this is probably for a complicated set of reasons and can't be switched off. 1) Group identity. Most sites like this build identity by idolising people they like or (more often) constantly mocking those they don't. This one does it through a bit of self-reference, which is probably better. 2) Condition of order: the fact that this blog exists and doesn't get political (or topical at all in a controversial way) is incredible: the entropic tendency of the net is towards flame wars, and the active intellectualised culture here may be needed to prevent that. 3) 'Feeling like home': this is a bit like (1) but is particularly interesting on this site. I've seen informal polls/anecdotes suggesting a lot of Asperger's here, and a lot of generalised lack of social confidence. As such, this site might be one of the best 'these people get me!' social places for some members, which means they're likely to emphasise their (perceived) common attributes.

As a 'being honest even though it makes me sound like a dick' aside which may be relevant to the PR of this group: I've seen various discussions on here of how to think/learn your way past social anxiety or lack of social skills in a systematic deliberate way. My intellectual mind thinks 'what a great idea, good for them'. But I am INTENSELY aware of my instinctive reaction of 'weirdos! you can't treat your social life like that! I wouldn't want to be trapped in a lift with one of these people! AWKWARD!' This despite the fact that I've been to a couple of meetups and found people very interesting and engaging.

Finally, there's an issue around whether the different parts of LW and SIAI can be peeled apart. As this interesting recent discussion thread notes, there are a lot of claims that newbies are presented with.

Critically, these are not only weird, but some of them have very obvious explanations from the external view. In particular, the core issues of AI and cryogenics immediately suggest a God-replacement/millenarian attitude and a rationalisation to escape fear of death respectively.

In particular, the core issues of AI and cryogenics immediately suggest a God-replacement/millenarian attitude and a rationalisation to escape fear of death respectively.

Perhaps higher profile refutations for these suspicions are in order.

The problem is that the suspicions don't necessarily need to be refuted... only explained. A super-intelligent AI is a bit of a god to human eyes, or at least a demi-god. I've said before that half the point of SIAI is to make sure that we create a god we like, and I wasn't really joking (I'm pretty sure I was quoting someone else as well). Likewise, I'm signed up for cryonics specifically because I don't like death, and would prefer to escape it if possible.

So I couldn't honestly refute either accusation, only admit to them and then either brush it off with "it's my crazy thing, we all have our pet crazy thing, right?" if I don't believe getting into the topic will be fruitful with that particular person, or to explain how this is different from superstition and try to reduce the inferential gap.

Only if those high-profile refutations are a) quick, b) non-reliant upon specialist knowledge, and c) honest-seeming.

I don't know a huge amount about either issue (which is revealing as an interested lurker and occasional participant here): but I think combining these is tough.

  • You could try to make them seem honest, but you need certain technical knowledge to really get them, and it's contentious technical knowledge, in that most relevant scientists don't buy Less Wrong's take on either issue. So I might feel an argument seems convincing, but then remember that I can find pro- or anti-global-warming arguments convincing if the person advancing them is far more informed on the scientific issues than I am. So this would fail totally on (a) and (b): I'd have to feel I could rely on my own knowledge above the experts who disagree with LW and SIAI, and I have other things to do with my time.

  • You can go for quick and easy, but the argument I'd expect here is 'so much to lose from evil AI that it counterbalances the low likelihood' or 'so much to gain from immortality that it counterbalances the low likelihood'. And both of those simply feel like cheats to most people: it's too much like Pascal's Wager and feels like a trick that you can play by raising the stakes.

  • Finally, you can address the root of the suspicions by convincing people that you don't have the tendencies to be attracted by the idea of a greater mind, a father substitute that can solve the world's problems, that you don't look ahead to a golden future age or that you're intensely relaxed about your own mortality. But I don't know how you could do that. The last is particularly unbelievable for me.

Even before "we need more/better PR" comes the question "what kind of PR do we need?" That's essentially what you are struggling with, and it's a question that I don't think LW or SIAI has answered yet.

[anonymous]

I think it boils down to "not seeming weird." For SIAI, I think it's pretty simple: we need to sound plausible so that we can get more donations! For LessWrong it's slightly more complicated, but it comes down to this: if people aren't deterred by the weirdness of the local beliefs and tropes, maybe they'll stick around and learn something useful.

Anissimov is the current PR "minister".

Forgive me if this is a stupid question (I am new here), but is there a community-endorsed primer on the Singularity that I can read somewhere? The reason I ask is that, currently, I am not convinced that the Singularity is likely to happen, and I'm not even convinced that it could happen at all. Thus, even if the leadership of SIAI were silver-tongued marketing geniuses on par with Steve Jobs, I would still be unlikely to join them.

Don't get me wrong, I like the idea of the Singularity. I would love to live in a post-Singularity, friendly-AI world. Therefore, I am compelled to examine the idea as critically as I can, to compensate for my cognitive bias.

[anonymous]

Not exactly what you asked for, but some extremely interesting arguments about the feasibility of an intelligence explosion occurred in The Hanson-Yudkowsky AI-Foom Debate.

Thanks, it definitely looks interesting, I'll read it as soon as I'm able. Aaaaand there goes all my free time for the week :-)