The Future of Humanity Institute* recently secured funding for a new Research Collaboration with Amlin Insurance focusing on systemic risks associated with risk modelling. We're looking for someone with an academic background or academic interests, combined with management and organisational abilities, to coordinate and develop this project and area of research.

Who we need

This is a unique opportunity to build a world-leading research programme. We're looking for someone who can not only manage this project but also has the drive and initiative to find new sources of funding, network with leading experts, and design future plans for the project. We're also looking for someone who understands and is motivated by the aims of the FHI; the post-holder will have the opportunity to contribute across the board to FHI projects, and may be a crucial part of the FHI's success going forward.

It's a two-year position, with the possibility of extension depending on the success of the project and the acquisition of further funding. We can sponsor a visa. All the details can be found here.

Why you can make a big difference in this role

I've spoken to 80,000 Hours in the past about the impact a talented person can have in academic project management; this Nature article also discusses the importance of this area of work. MIRI's recent successes are also due in part to the work of some excellent people with the right mix of research understanding and organisation-running ability.

While this is not a research post, your work will increase the success and impact of research done by each member of a team of top-tier academics, and will bring yet more high-quality researchers into the most important fields. This makes the position a way to achieve a huge amount of cumulative good. With a successful funding, development and media strategy, you can contribute to shaping the fields in which the FHI is leading the world.

More on the project

Systemic risks concern the stability of an entire market, and are of great importance to managing large-scale risk. The very methods used to model these phenomena can themselves be a source of systemic risk, especially when they embody hidden assumptions that may not remain reliable in a fast-changing world. The project will focus on gaining a better understanding of systemic risks, particularly as they apply to catastrophic risk modelling, and on ways to avoid or mitigate such risks. Subtopics of interest are likely to include:

  • Ways in which individually rational agents can misbehave when they become part of a larger network.
  • Decision making under uncertainty.
  • Cognitive biases that emerge when dealing with large risks, and in the information pipelines used to model catastrophic events.
  • How to model potential existential risks.

The current project research team comprises Anders Sandberg, Stuart Armstrong, and Nick Beckstead.

Closing date is 19th July.

For questions about the position, please email me at sean.oheigeartaigh@philosophy.ox.ac.uk; Stuart Armstrong should also be able to answer questions. I'll try to answer as many queries as I can, but I apologise in advance if I don't get to everyone; my workload at present is very heavy. We'd be very grateful for any help in spreading the word to good people who might be interested. Thank you!


*The Future of Humanity Institute

The Future of Humanity Institute (Oxford) is a world-leading research centre looking at big-picture questions for human civilization.  With the tools of mathematics, philosophy, and science, we explore the risks and opportunities that will arise from technological change, weigh ethical dilemmas, and evaluate global priorities.  Our goal is to clarify the choices that will shape humanity’s long-term future.
