Small language models hallucinate while knowing something's off