QNR prospects are important for AI alignment research — LessWrong