I'm writing a series of discussion posts on how to purchase AI risk reduction (through donations to the Singularity Institute, anyway; other x-risk organizations will have to speak for themselves about their plans).
Each post outlines a concrete proposal, with cost estimates:
(For a quick primer on AI risk, see Facing the Singularity.)
Your link to Facing the Singularity and the link embedded in the picture both redirect to this page.
Both links work fine for me.
I fixed them shortly after Dorikka posted.
What I don't see people talking enough about is the obvious need for this:
large government funding (e.g., in the US).
Ours is an incredibly large and difficult mission -- to smoothly integrate humans, their qualia, and their values into the coming AI.
The government funding, of course, should not be directed by bureaucrats deciding on their own, but by, e.g., the Singularity Institute and other Friendly AI, human-integration proponents.
I.e., government funding should be directed by a formidable Singularity-preparation Political Action Committee (PAC).
I've recently thrown together this site-in-progress: http://singularity-pac.com/ for that purpose.
However, it would be better to leverage existing, developed organizations such as the Singularity Institute, University, Hub, etc.
For my part, I'd like to raise awareness, and am leading development on singularity games. With the proceeds, I'd like to fund a singularity PAC.
It might also be good to just directly ask eg the Gates Foundation or similar for the PAC money and get things rolling already.
What are your thoughts on this?
Two hesitations in re lots of cash:
1) What would SI do with the money? My sense is that the current management structure would be hard-pressed to absorb more than, say, twice what they currently have.
2) Government money sometimes comes with onerous obligations in terms of disclosure, transparency, etc. It may not be cost-effective. I don't know about foundation money, but I'm not sure how hands-off the Gates people are.