Feature

Flying remains an extremely safe mode of transportation: Statistics show that, on average, a person is 65 times more likely to die traveling a given distance by car than by airliner.

When accidents do occur, they are often blamed on human error. However, a book by NASA psychologist Key Dismukes, PhD, and Ben Berman and Loukia Loukopoulos, PhD, of the San Jose State University Research Foundation takes a more nuanced look at this assumption, showing that a complex web of factors likely contributes to such incidents.

The three are uniquely qualified to assess these incidents: Dismukes is chief scientist for aerospace human factors in the Human-System Integration Division at NASA-Ames; Berman, a captain at Continental Airlines, was previously chief of major accident investigations at the National Transportation Safety Board (NTSB); and Loukopoulos is a former Navy aviation psychologist.

Their book, "The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents" (Ashgate, 2007), dissects the 19 major accidents in the United States that the NTSB attributed to crew error between 1991 and 2000. (The NTSB attributed the other 18 accidents that occurred in the same period to mechanical failures.)

Their conclusions? The errors blamed on skilled experts, such as airline pilots, are most often manifestations of inherent weaknesses and limitations in the overall air transport system: factors such as inadequate information, competing organizational goals, time pressure and the limits of human information processing. For example, juggling multiple tasks at the same time can significantly increase human vulnerability to error, the authors write.

To improve the system, the authors say airlines should identify and analyze the threats pilots routinely encounter and provide tools to help manage them. For example, when bad weather forces pilots to decide whether to continue to a destination or divert, airlines could draw on research on plan continuation bias, the tendency of people to continue with plans that are no longer viable.

Further, since some amount of human error is inevitable, airlines should train pilots to catch and manage slipups so they don't escalate into accidents, the authors say. One way to aid this process is to apprise pilots, managers and instructors of how task demands and organizational policies can interact with human cognitive processes to drive pilots' decision-making, Dismukes says.

Finally, airlines, regulatory agencies and the flying public should acknowledge the inherent tension between airlines' safety goals and the public demand to arrive on time at a planned destination. Where and how that tension is balanced should be a part of public policy, Dismukes believes. Psychological researchers can help to elucidate the factors that affect this balance by studying the interaction of skilled performance with real-world task demands and organizational factors, he says.

"You get what you invest in," Dismukes maintains. "If you want to maintain and improve flight safety, you have to invest in human factors research."

--T. DeAngelis