[link] FLI's recommended project grants for AI safety research announced

by Kaj_Sotala · 1 min read · 1st Jul 2015 · 20 comments

http://futureoflife.org/misc/2015awardees

You may recognize several familiar names there, such as Paul Christiano, Benja Fallenstein, Katja Grace, Nick Bostrom, Anna Salamon, Jacob Steinhardt, Stuart Russell... and me. (The $20,000 for my project was the smallest grant they gave out, but hey, I'm definitely not complaining. ^^)
Bostrom thought of FAI before Eliezer.

Do you have the link for that, or at least the keywords? I assume Bostrom called it something else.

See this 1998 discussion between Eliezer and Nick. Some relevant quotes from the thread:

Nick: For example, if it is morally preferred that the people who are currently alive get the chance to survive into the postsingularity world, then we would have to take this desideratum into account when deciding when and how hard to push for the singularity.

Eliezer: Not at all! If that is really and truly and objectively the moral thing to do, then we can rely on the Post-Singularity Entities to be bound by the same reasoning. If the reasoning is wrong, the PSEs won't ...