I have some experience with the design and development process for software and semiconductors, so the layman’s typical surprise at the appearance of bugs in programs and microprocessors always pains me. Software and integrated circuits are essentially machines with thousands to millions of components — their construction and deployment is a mixture of prior art, educated guesses, and crossed fingers. It’s a miracle that anything works at all.
So I was enlightened and horrified to find, after reading Henry Petroski’s To Engineer is Human: The Role of Failure in Successful Design, that this uncertainty is not unique to the newer engineering disciplines. Mr. Petroski recounts and dissects high-profile failures through the ages, ranging from the intriguing (evidence that some of the ancient pyramids were built at too steep an angle), to the expensive (pothole-induced mechanical failure in New York City transit buses), to the life-threatening (cracking coolant pipes in nuclear reactors).
These failures are the exception, not the rule: buildings and bridges rarely collapse around us, and the airline industry as a whole has an exemplary safety record. The author attributes much of the stability we take for granted to conservative design, generous safety margins, and replication: most houses break no new ground in civil engineering, bridges are designed to handle several times the expected loads (just in case), and once a plane has been deemed safe to fly, many duplicates are manufactured.
Safety doesn’t necessarily come at the expense of innovation. The author cites successful high-profile projects such as the Brooklyn Bridge and the much-copied Crystal Palace built for London’s Great Exhibition. The success of these creative designs is due not only to careful analysis, planning, and execution (during construction of the Crystal Palace, every girder was tested on arrival at the site before installation), but also to diligent study of past projects, successful and not.
This element of postmortem analysis gives the book the flavor of a detective mystery. The arduous, belated investigation into the crashes of the early Comet jetliner was hampered by the difficulty of finding the submerged remnants of the disintegrated planes; it was resolved only after thousands of pressurization/depressurization tests on a cabin filled with water demonstrated that the riveted areas around the windows would eventually crack. Recovery of the overturned North Sea oil rig was an engineering feat in itself, and examination of the structure revealed the culprit: a stress fracture in one of the support legs that should never have occurred, was painted over during the initial welding, and grew to critical size after being pounded millions of times by ocean waves.
On a lighter note, Mr. Petroski recounts an engineer’s analysis of the mythical Icarus’s first and final flight. Experimentation and analysis of the kind applied to other aerospace accidents provide an alternative but no less interesting interpretation of that story. The author also recounts his attempt to track down a possible earlier originator of the quotation attributed to Santayana: “Those who cannot remember the past are condemned to repeat it.” Despite Mr. Petroski’s beautifully literate writing throughout the book, this last personal tale is not that interesting, but it does serve to emphasize the book’s main point: successful design is built on past failures.
This book was published shortly after the Challenger explosion and contains an afterword referencing that accident. Santayana’s dictum is especially poignant in the aftermath of the second space shuttle disaster. The question remains — engineering failures leading to such events can be tracked down and fixed, but what about the institutional practices that masked these problems?