As of an hour ago, I had not yet heard of the Centre for the Study of Existential Risk.
Luke announced it to Less Wrong back in April, as the University of Cambridge announced it to the world:
CSER at Cambridge University joins the others.
Good people involved so far, but the expected output depends hugely on who they pick to run the thing.
CSER is scheduled to launch next year.
Here is a small selection of CSER press coverage from the last two days:
Here's an excerpt from one quite typical story appearing in tech-tabloid theregister.co.uk today:
Boffins at Cambridge University want to set up a new centre to determine what humankind will do when ultra-intelligent machines like the Terminator or HAL pose "extinction-level" risks to our species.
A philosopher, a scientist and a software engineer are proposing the creation of a Centre for the Study of Existential Risk (CSER) to analyse the ultimate risks to the future of mankind - including bio- and nanotech, extreme climate change, nuclear war and artificial intelligence.
Apart from the frequent portrayal of evil - or just misguidedly deadly - AI in science fiction, actual real scientists have also theorised that super-intelligent machines could be a danger to the human race.
Jaan Tallinn, the former software engineer who was one of the founders of Skype, has campaigned for serious discussion of the ethical and safety aspects of artificial general intelligence (AGI).
Tallinn has said that he sometimes feels he is more likely to die from an AI accident than from cancer or heart disease, CSER co-founder and philosopher Huw Price said.
The source for these stories appears to be a press release from the University of Cambridge:
In 1965, Irving John ‘Jack’ Good sat down and wrote a paper for New Scientist called Speculations concerning the first ultra-intelligent machine. Good, a Cambridge-trained mathematician, Bletchley Park cryptographer, pioneering computer scientist and friend of Alan Turing, wrote that in the near future an ultra-intelligent machine would be built. [...]
Four quick observations:
1: That's a lot of Terminator 2 photos.
2: FHI at Oxford and the Singularity Institute do not often get this kind of attention.
3: CSER doesn't appear to have published anything yet.
4: The number of people who have heard the term "existential risk" must have doubled a few times today.