It's more of a backdrop than a key focus, but the Culture series by Iain M. Banks features a civilisation where AI Minds can monitor everything on their spaceships and habitats to near perfection. The only thing they (usually) choose not to monitor, despite being able to, is the thoughts of biological lifeforms.
"While building dams decreases the frequency of floods, damage per flood is afterward so much greater that average yearly damage increases."
This is fascinating. Should we not be building dams? Could we say the same thing about fighting bushfires, since fighting them increases the amount of fuel they have available for next time?
Regarding the Spock probability reference, I've always imagined that TV shows and movies either take place in the parallel universe where very specific events happen to take place (e.g. the universe where the 'bad guys' miss the 'good guys' with all of their bullets despite being trained soldiers), or, in the case of the Enterprise, the camera follows the adventures of the one ship that is super lucky. Perhaps the probability of survival really is 2.234%, and the Enterprise is just the one-in-a-thousand ship that keeps surviving (because who wants the camera to follow those other ships?).
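This survivorship-bias intuition is easy to make concrete with a bit of arithmetic. As a minimal sketch (the 2.234% figure is from the comment above; the fleet sizes are made up for illustration): if each ship independently has that probability of surviving its whole run of adventures, the chance that *some* ship in a fleet survives, and so becomes the one the camera follows, grows quickly with fleet size.

```python
# Survivorship-bias sketch. Assumes each ship independently survives its
# entire series of adventures with probability p_survive (the 2.234% figure
# from the comment); fleet sizes are hypothetical.

p_survive = 0.02234  # per-ship probability of surviving the whole series


def p_at_least_one_survivor(fleet_size: int) -> float:
    """Probability that at least one of `fleet_size` independent ships survives."""
    return 1 - (1 - p_survive) ** fleet_size


for n in (1, 10, 100, 1000):
    print(f"fleet of {n:>4}: P(at least one survivor) = {p_at_least_one_survivor(n):.3f}")
```

With a fleet of a thousand ships, a survivor is all but guaranteed, even though any *particular* ship's odds stay terrible: exactly the setup where the camera's choice of subject makes the odds look wrong.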
"Why haven't more EAs signed up for a course on global security, or tried to understand how DARPA funds projects, or learned about third-world health?"
A very interesting point, and you've inspired me to take such a course. Does anyone have any recommendations for a good (and preferably reputable, given our credential-addicted world) course relating to global security and health?