Do we want too much from a potentially godlike AGI?