The Centre for the Study of Existential Risk (CSER) recently held its first public lecture, which can be found here:

 

Existential Risk: Surviving the 21st Century

 

The talk's blurb:

"In the coming century, the greatest threats to human survival may come from our own technological developments. However, if we can safely navigate the pitfalls, the benefits that technology promises are enormous. A philosopher, an astronomer, and an entrepreneur have come together to form the Centre for the Study of Existential Risk. The goal: to bring a fraction of humanity’s talents to bear on the task of ensuring our long-term survival. In this lecture, Huw Price, Martin Rees and Jaan Tallinn will outline humanity’s greatest challenge: surviving the 21st century."

From CSER's about page:

"An existential risk is one that threatens the existence of our entire species.  The Cambridge Centre for the Study of Existential Risk (CSER) — a joint initiative between a philosopher, a scientist, and a software entrepreneur — was founded on the conviction that these risks require a great deal more scientific investigation than they presently receive.  CSER is a multidisciplinary research centre dedicated to the study and mitigation of risks that could lead to human extinction.

Our goal is to steer a small fraction of Cambridge’s great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future."

The philosopher, scientist, and entrepreneur in question are Huw Price, Martin Rees, and Jaan Tallinn respectively.

 

In case you are looking for the talk that Jaan Tallinn referred to, I think it is this.

Comments:

We've also redesigned and relaunched our website, with more information on our areas of interest and planned research: http://cser.org/

gsgs:

So, what are the risks? Is it a secret?