The human factor
When Chernobyl's Unit 4 exploded in 1986, I was living in Scotland. Soon I was reading about radioactive sheep: lamb was off the menu, and hundreds of upland farms were coping with fallout from the Soviet nuclear-power plant.
Back then, British scientists were predicting that restrictions on the sale and slaughter of sheep would last only a few months. Now, almost two decades later, contamination lingers, and some restrictions have been extended for another decade.
The disaster's half-life is proving longer than anticipated.
But perhaps the greater fallout has been the erosion of trust. Things nuclear have become suspect in the public eye, and so has what the authorities say about them.
The public might not have it all wrong. In the early days after Chernobyl, secrecy, confusion, and misinformation fed people's fears. Some of that was standard Soviet operating procedure. But even in the US, when Pennsylvania's Three Mile Island reactor core partially melted down in 1979, obfuscation ruled. When the dust settled, Metropolitan Edison was indicted on 11 counts and pleaded guilty to one count of violating NRC regulations.
In the thick of a crisis, facts can be scarce, and human error is often a central player. The Chernobyl technicians were running a test when they decided to turn off the reactor's safety systems. The rest is history.
Good nuclear design, suggests one nuclear-materials consultant in our lead story, should boil down to a single philosophy: "You can't trust humans."