In an early draft of Intelligence Explosion: Evidence and Import, Luke Muehlhauser and Anna Salamon wrote:

Norman Rasmussen's (1975) analysis of the safety of nuclear power plants, written before any nuclear accidents had occurred, correctly predicted several details of the Three Mile Island incident that previous experts had not (McGrayne 2011, 180).

I investigated this further as part of the project "Can we know what to do about AI?". The analysis that Muehlhauser and Salamon reference is WASH-1400, 'The Reactor Safety Study', which was produced for the Nuclear Regulatory Commission.

Upon investigating, I formed the impression that this case study has little relevance to the question of whether we can know what to do about AI:

  • The issue was one that a lot of people were already concerned about.
  • The issue was highly domain specific.
  • Rather than there being a few salient predictions, there were a huge number of small predictions.

One way in which the situation is relevant is that the risk of nuclear power plant accidents was not adequately addressed, but that's only one of many examples of people not adequately addressing risks.

A few details below.

  • The report predicted one core damage accident per 20,000 years of nuclear power plant operation.

    As a point of comparison, at the time of the Three Mile Island incident, only about 500 reactor-years of nuclear power plant operation had occurred, so the incident could have been a fluke. The Chernobyl accident occurred only six years later; if the causes were the same, that would strongly suggest that Three Mile Island was not a fluke.

    However, the report was based on reactor designs that didn't include the Three Mile Island type.
  • The report did discuss tidal waves as a potential cause of nuclear disaster, anticipating the recent disaster in Japan. But the report is 21 volumes long, so this (weak) prediction could have been cherry-picked retrospectively.
  • The report is considered to be obsolete, and its main lasting value seems to have been pioneering use of probabilistic risk assessment.
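The first bullet's "fluke" question can be made quantitative. Here is a minimal sketch, assuming accidents follow a Poisson process at the report's predicted rate; the rate and the 500 reactor-years figure come from the bullets above, while the Poisson model itself is my assumption, not something the report or the post specifies:

```python
import math

# WASH-1400's predicted rate: one core damage accident per 20,000 reactor-years
rate = 1 / 20_000
# Approximate reactor-years accumulated by the time of Three Mile Island
reactor_years = 500

# Under a Poisson model, the probability of at least one accident in that window
p_at_least_one = 1 - math.exp(-rate * reactor_years)
print(f"P(>=1 accident in 500 reactor-years) = {p_at_least_one:.1%}")  # ~2.5%
```

On the report's own numbers, an accident that early had only about a 2.5% chance of occurring, so Three Mile Island was some evidence that the report understated the risk, though a single event is weak evidence either way.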

A summary of the situation seems to be: "People were concerned about nuclear power plants being dangerous. The Nuclear Regulatory Commission commissioned a report analyzing the risks. The report was really long, and didn't address key relevant factors. Other scientists thought that the report understated the risks. The report was quickly recognized to have major flaws and became obsolete."
12 comments

Note that the quoted paragraph from an early draft of Intelligence Explosion: Evidence and Import didn't appear in the final version in part because we didn't have time to check it like Jonah has now done.

The Chernobyl accident occurred only 6 years later. If the cause was the same, that strongly suggests that the Three Mile Island incident was not a fluke.

The causes were totally different. Chernobyl was Soviet safety engineering, and then they were running an experiment to see how it performed in emergency conditions!

"Tovarish Boss, that word 'experiment', you keep using that word. I do not think it means what you think it means."

Why not? They ran the experiment, and now we know how it performs in emergency conditions. We learned something valuable that day!

In three ways you learn wisdom. By reflection, which is the noblest. By imitation, which is the easiest. And by experience, which is the bitterest.

  • Confucius, I think.



The Chernobyl accident occurred only 6 years later. If the cause was the same, that strongly suggests that the Three Mile Island incident was not a fluke.

Chernobyl was uniquely fucked up, in an incredibly unsafe reactor - RBMKs didn't have containment buildings. There are definitely lessons to be learned from Chernobyl, but it doesn't demonstrate a common reactor failure mode.

Reactor safety is an interesting topic in itself. There seem to be some fairly plausible claims that reactors could be much safer and cheaper.

But one concern with nuclear materials is always the potential for terrorism, proliferation, and the like, and as a result a certain level of security and secrecy is probably unavoidable. Assuming the enthusiasts are right that nuclear power is far more expensive and less safe than it could be, I wonder how much of the inefficiency is due to the direct and indirect effects of these security concerns: secrecy and security not only cost money in themselves, they are also inevitably used to cover up incompetence and corruption, and they engender mistrust which further reduces efficiency. If so, is there any hope that the inefficiency could ever be dramatically reduced, given that the security concerns are not going to go away?

Since nuclear power does have some notable advantages (like not producing greenhouse gases), I do wish there were some way the enthusiasts could turn out to be right, but the technology has been falling far short of their hopes for a long time now. Somehow things appear to be much harder than the enthusiasts expect.