9 comments

If you link to the comic's page instead of the raw image, it'll save everyone from googling it to read the mouseover text.

Or, put the mouseover text in quotes after the http bit of the link, like I did above.
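A minimal sketch of that syntax (the URL and title text here are placeholders; standard Markdown renders the quoted title as the mouseover text):

```markdown
[relevant xkcd](https://xkcd.com/123/ "Mouseover text appears here on hover")
```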

Thanks - I did not have the original link.

The xkcd website has a search function (scroll down just past the comic).

I think today's xkcd is more relevant. Does anyone have good figures on what's spent on things like cryonics research, rational attempts to improve humanity's rationality, or maybe existential risk reduction (efforts to save the future of the next several billion years and 100 million galaxies) to compare with the amounts on this chart?

Sorry, maybe I'm dense. How is this relevant to FAI?

"It takes longer to develop value preserving AI technologies than to develop stuff that's cool but dangerous ("more fun than survival")"

Via the notion of a Great Filter, and through existential risk generally. It's a bit of a stretch, for sure, but the link is there.

The Great Filter aspect is explicit. But that seems extremely tenuous. Rationalists should worry about the Great Filter whether or not it has anything to do with FAI.

Agreed. I was just saying that there is a link, and it's even reasonably salient within the context of this site. I make no claim that it is the most appropriate link to draw - and, indeed, would have recommended a different title.