Why was the AI Alignment community so unprepared for this moment?
> Our epistemic rationality has probably gotten way ahead of our instrumental rationality
>
> -Scott Alexander, "A LessWrong Crypto Autopsy"

This is a question post: Why was the AI Alignment community so unprepared for engaging with the wider world when the moment finally came?

EDIT Based on...
Excellent post, thank you for sharing. My comment is a bit of a hijack, but the related post linked at the top that led to this one doesn't seem to have a way to comment, so I thought I'd ask here.
In that post you outline your problems grappling with the editorial decisions of the Penguin Great Ideas series (in addition to the misogyny itself).
Is there a reason you chose the Penguin series instead of the Great Books of the Western World curriculum?
My impression is that that list is much less editorialized than the Penguin list, and it may at least solve your problem of "why did this 21st century white guy decide to cherry-pick so much misogynistic content out of this vast corpus".
At least with the Britannica list, it was a team of 20th century white guys applying a semi-objective inclusion matrix, and they didn't (to my knowledge) trim or edit any of the individual works.