Evolution is a bad analogy for AGI: inner alignment