Normal Accidents
Normal Accidents: Living with High-Risk Technologies is a 1984 book by Yale sociologist Charles Perrow, which analyses complex systems from a sociological perspective. Perrow argues that multiple and unexpected failures are built into society's complex and tightly coupled systems, and that accidents are therefore unavoidable and cannot be designed around.[1][2]

System accidents

"Normal" accidents, or system accidents, are so called by Perrow because such accidents are inevitable in extremely complex systems. Given the characteristics of the systems involved, multiple failures that interact with each other will occur despite efforts to avoid them. Perrow said that, while operator error is a very common problem, many failures relate to organizations rather than technology, and major accidents almost always have very small beginnings.[3] Such events appear trivial at first, before unpredictably cascading through the system to create a large event with severe consequences.[1]
Perrow identifies three conditions that make a system likely to be susceptible to normal accidents. These are:

- The system is complex
- The system is tightly coupled
- The system has catastrophic potential
Three Mile Island

The inspiration for Perrow's book was the 1979 Three Mile Island accident, where a nuclear accident resulted from an unanticipated interaction of multiple failures in a complex system.[2] The event was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and unavoidable".[5]
New reactor designs

One disadvantage of any new nuclear reactor technology is that safety risks may be greater initially, as reactor operators have little experience with the new design. Nuclear engineer David Lochbaum has said that almost all serious nuclear accidents have occurred with what was at the time the most recent technology. He argues that "the problem with new reactors and accidents is twofold: scenarios arise that are impossible to plan for in simulations; and humans make mistakes".[6] As Dennis Berry, Director Emeritus of Sandia National Laboratory,[7] put it, "fabrication, construction, operation, and maintenance of new reactors will face a steep learning curve: advanced technologies will have a heightened risk of accidents and mistakes. The technology may be proven, but people are not".[6]

Sometimes, engineering redundancies put in place to help ensure safety may backfire and produce less, not more, reliability. This can happen in three ways. First, redundant safety devices make a system more complex, and therefore more prone to errors and accidents. Second, redundancy may lead workers to shirk responsibility. Third, redundancy may increase production pressures, resulting in a system that operates at higher speeds but less safely.[8]

Readership

Normal Accidents has more than 1,000 citations in the Social Sciences Citation Index and Science Citation Index to 2003.[8] A German translation of the book was published in 1987, with a second edition in 1992.[9]