When Science Goes Wrong: Twelve Tales from the Dark Side of Discovery by neuroscientist Simon LeVay (Plume, 2008) is fascinating reading for those of us who take an interest in the impact of science on society. LeVay presents 12 stories of disaster drawn from a range of scientific and technological fields, including medicine, engineering, psychology, meteorology, forensic science, and volcanology, and spanning a period from 1928 to the present.
The failures occurred for a variety of reasons, raising a number of philosophical and policy questions. Some of the failures appear to be mostly due to individual recklessness. Geologist Stanley Williams ignored warnings and scorned protective gear while leading an expedition into the crater of an active volcano in 1993, causing the deaths of 9 people. A barrage of recriminations and justifications followed. Was Williams a daring innovator making invaluable contributions to science that could not be obtained in any other way? Or did a “culture of daredevilry” in the field of volcanology lead to bad science and unnecessary risks?
Other stories illustrate design flaws or ignorance of scientific principles that were only understood decades later. In 1928, the St. Francis Dam, built by water engineer William Mulholland to fuel the growth of the fledgling city of Los Angeles, failed catastrophically, leaving a 40-mile swath of death and destruction. A modern analysis attributed the failure to ignorance of the principle of hydrostatic uplift, which causes water seeping into the foundation to exert upward pressure and destabilize the dam.
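To make the principle concrete, here is a minimal back-of-the-envelope sketch of uplift, using standard textbook hydrostatics rather than anything taken from the St. Francis failure analysis itself (the symbols and geometry are generic assumptions for illustration):

```latex
% A generic sketch of hydrostatic uplift, not the detailed St. Francis analysis.
% Water seeping under the dam exerts an upward pressure set by the reservoir head h:
\[
  p_u = \rho_w \, g \, h
\]
% Integrated over the base (length L from heel to toe, width B across the valley),
% with full head at the upstream face tapering to zero at the downstream toe,
% the uplift force is roughly
\[
  U \approx \tfrac{1}{2}\, \rho_w \, g \, h \, B \, L ,
\]
% and it subtracts directly from the weight that holds the dam in place:
\[
  W_{\mathrm{eff}} = W - U .
\]
% A design that ignores U overstates its real margin against sliding and overturning.
```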
Another design flaw described by LeVay occurred in a nuclear reactor. In 1961, three operators died at the National Reactor Testing Station on the Snake River Plain in Idaho when one of them pulled a control rod out beyond its sixteen-inch safety limit, causing a runaway chain reaction. The reason was never established; theories ranged from inadequate training to murder-suicide. But clearly the design of the control rod made the reactor too susceptible to human error.
These particular design flaws have been corrected in modern dams and nuclear reactors. But what principles do we not fully understand today? Which of today's gaps in understanding will lead to disasters that are clearly explained only in retrospect, by future analysts? In addition to unknown design flaws, LeVay’s stories invite us to consider the gamut of possible human errors: shortsightedness, arrogance, incompetence, fatigue, blind ideology, poor training, low morale, greed, ambition, lack of resources, failure to heed warnings, pressure for quick results, out-and-out fraud, and plain bad luck.
LeVay, a scientist himself, is not calling for a halt to all scientific and technological endeavor, or for rejecting science in favor of, say, creationism. But his book does give us cause to consider the magnitude of the stakes for any given scientific activity if something were to go wrong.
This brings me to the discussion of nuclear power begun by my co-blogger Gina in her last post and continued in a comment by red craig with a cogent defense of nuclear energy as a source of clean power. I agree with red craig that coal can hardly be called “safe.” But I believe the comment oversimplifies the question of nuclear waste disposal, and there are also other major policy issues.
Physics Today, in an analysis of Barack Obama’s and John McCain’s energy policy positions, notes: “Nuclear power represents more than 70 percent of our non-carbon generated electricity. It is unlikely that we can meet our aggressive climate goals if we eliminate nuclear power from the table.” But, the blog adds, “there is no future for expanded nuclear without first addressing four key issues: public right-to-know, security of nuclear fuel and waste, waste storage, and proliferation.”
Kmareka.com has a series of posts about a nuclear accident in Rhode Island in 1964 that caused the death of an inadequately trained operator. The posts consider the long-term waste-disposal problem posed by the site and the liability to taxpayers under the Price-Anderson Act for disasters of unimaginable magnitude.
The recent controversy over the potential nuclear waste disposal site at Yucca Mountain, Nevada, also raises a number of questions. An ad released by the Barack Obama campaign criticizes John McCain for supporting the development of the site, while showing video of McCain objecting to the presence of nuclear waste in Arizona. The expense of storing nuclear waste there is also enormous. According to a recent report, the Yucca Mountain project will cost $38.7 billion more than was originally anticipated.
LeVay concludes his book by observing “There, but for the grace of God, go I.” Errors occur constantly in the practice of science – they just usually don’t lead to major catastrophes. LeVay raises the question: Can – and should – anything be done to make science go wrong less often? He argues for an appropriate level of regulation and strict oversight, especially when science is applied to human needs and activities, which is when most of the disastrous consequences are likely to occur.
The current climate in our nation – hostility to regulation, reliance on the free market to solve all problems, and starvation of the national budget through extreme tax reduction and gargantuan military expenses – is hardly conducive to strict oversight of complex and potentially dangerous scientific activities. Hopefully, that climate will change with the next administration. But are we confident that we can trust the administration after that, and after that, and for the next thousand or so years that nuclear waste will be around, to keep us and our descendants safe?