“Theosophists have guessed at the awesome grandeur of the cosmic cycle wherein our world and human race form transient incidents. They have hinted at strange survival in terms which would freeze the blood if not masked by a bland optimism.”

– H.P. Lovecraft on transhumanism



Just thought I'd write a quick post to sum up the H+ UK conference and subsequent LW meetup attended by myself, ciphergoth, JulianMorrison, Leon and a few other LW lurkers. My thanks to David Wood for organizing the conference, and Anders Sandberg for putting me up/putting up with me the night before.

I made a poster giving a quick introduction to “Cognitive Bias and Futurism”, which I will put up on my website shortly. The LW crowd met up as advertised – we discussed the potential value of spreading the rationality message to the H+ community.1

One idea was for someone (possibly me) to do a talk at UKH+ on “Rationality and Futurism”, and to get the UK transhumanist crowd involved and on board somewhat more. The NYC Less Wrong guys seem to be doing remarkably well with a meetup group, about a billion members, a group house (?) – do you have any advice for us?

The talks were interesting and provocative – of particular note were:

  • Aubrey de Grey's talk, which was far from his usual fare. Tailoring it to the H+ audience, who are already familiar with SENS, he spoke about the pace of recent advances in induced pluripotent stem cells, progress in migrating mitochondrial DNA from the mitochondria to the cell nucleus, and how the mainstream media can inflate a minor paper with a sexy title to undeserved fame whilst passing over much more important but “technical”-sounding work. I asked what probability he assigned to his SENS program working, but he declined to give an estimate. I think that rationalists could help the life-extension movement by publishing an independent, critical review of the probability of the SENS program succeeding, perhaps in an economics journal.
  • Nick Bostrom added definitions of “capability potential” and “axiological potential” to his usual existential-risk material.
  • The other talks were interesting, but showed little appreciation of rationalist methods. For example, though many talks made predictions about the future, only Bostrom's gave any probabilities. The rest used version 1.0 epistemology – phrases like “I think that X will happen, not Y” rather than “I assign more probability (k%) to X than most people do”. Max More was particularly guilty of this in his talk on “Singularity Skepticism” – though only because he was tackling a much harder question than most of the other speakers (who covered topics such as transhumanist art and architectural style).

 



1: Particularly after hearing a man say that he wouldn't sign up for cryonics because it “might not work”. We asked him for his probability estimate that it would work (20%), and then asked what probability he would need to estimate for it to be worth paying for (40%) – a figure he then admitted he had made up on the spot as “an arbitrary number”. Oh, and seeing a poster claiming to have solved the problem of defining an objective morality, which I may or may not upload.
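The break-even logic the cryonics exchange gestures at is just expected value: signing up is worth it when your probability estimate exceeds cost divided by payoff. A minimal sketch, with entirely made-up numbers (the man's actual costs and valuations never came up in the conversation):

```python
def breakeven_probability(cost, value):
    """Minimum success probability at which a gamble with the given
    upfront cost and payoff value has non-negative expected value."""
    return cost / value

# Hypothetical figures for illustration only: if signing up costs
# $50,000 and you value revival at $500,000, then any success
# estimate above 10% makes it worth paying for.
print(breakeven_probability(50_000, 500_000))  # 0.1
```

On these (invented) numbers, the man's own 20% estimate would already clear the bar – which is why demanding that he derive his threshold, rather than name one, exposed the inconsistency.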


I was the main organizer for the NYC LW/OB group until I moved out to the Bay Area a few weeks ago. From my experience, if you want to encourage people to get together with some frequency you need to make doing so require as little effort and coordination as possible. The way I did it was as follows:

We started a google group that everyone interested in the meetups signed up for, so that we could contact each other easily.

I picked an easily accessible location and two times per month (second Saturdays at 11am and fourth Tuesdays at 6pm) on which meetups would always occur. I promised to show up at both times every month for at least two hours, regardless of whether or not anyone else showed up. I figured the worst that could happen was that I'd have two hours of peace and quiet to read or get some work done, and that if at least one person showed up we'd almost certainly have a great time.

We've been doing that for about 9 months and I've never been left alone. In fact, we found that twice a month wasn't enough and started meeting every week a few months ago.

At the moment, only one meetup per month is announced to the "public" through the meetup.com group (so that we don't have to explain all of the basics to new people every meeting), one is for general unfocused discussion and two are rationality-themed game nights (such as poker training).

You should probably set up the google/meetup.com group first, poll people on what times work best for them and what kinds of activities they are most interested in, and then take it from there.

I wish you the best of luck, and I'd be happy to answer any other questions you might have.

Roko:

Thanks, Jasen. What is the turnout at these meetups? How did you find it grew?

Max makes many of the same points I make here:

http://alife.co.uk/essays/the_singularity_is_nonsense/

We do have some disagreements, though. For example, Max expresses scepticism about an intelligence explosion in part 3 – whereas I take that for granted.

Thanks for this writeup!

One problem was that we ended up having to switch pubs at the last minute. If you're thinking of coming to a LessWrong meetup I'm organising, it would be great if you would email me with your mobile number in advance, so I can let you know about any changes. My email address is paul at ciphergoth dot org.

We ended up going to a Wetherspoon's pub called The Shakespeare's Head, which was perfectly pleasant, and since six of us now know where it is, it's a front-runner to be the venue for the 2010-06-06 meetup.