HLAI 2018 Field Report

A great way to visualize the risks of unaligned AGI is the alien life form in the film Life.

It starts as a seed entity, quickly adapts, learns new tricks, and grows bigger and stronger, ruthlessly oblivious to the human value system. Watch it, and instead of the alien, imagine a child AGI.


Hi Gordon!

Thanks for writing this. I am glad you enjoyed HLAI 2018.

I agree that many AI/AGI researchers partially or completely ignore AI/AGI safety. But I have noticed a trend in recent years: it is possible to "turn" these people and get them to take safety more seriously.

Usually the reason for their "safety ignorance" is simply insufficient insight; they have not spent enough time on the topic. Once they learn more, they quickly see how things can go wrong. Of course, not everyone does.

Hope this helped.