Aviation Human Factors Practitioner

Robert R. Tyler, PhD
Crown Consulting, Inc.

While it is true that I was "doing" human factors long before I knew what it was, and certainly before I was designated a human factors practitioner, let's begin by understanding that I did not plan to have a career in human factors psychology. I was an unsettled physics major during an unsettled time (the mid-sixties) who found himself midway through college learning to become a combat helicopter pilot. It was in a hostile foreign land that I learned first-hand the importance of good user-centered design processes. The aircraft that I flew in Vietnam was new, capable and ostensibly designed to accomplish our combat mission. However, none of the operational procedures addressed how we would actually modify the prescribed flight profiles to get into and out of "hot" landing zones. Survival dictated that we minimize the time spent in slow and predictable flight trajectories.

So we abused the automation and modified the procedures by intentionally selecting in-flight rotor reprogramming positions, essentially creating two 52-foot-diameter speedbrakes. We loved it until the back ends of the helicopters began falling off. The engineering solution: put in a speed sensor switch that prevented programming the rotors to the aft settings above 70 knots. The combat pilot's work-around: put your hand out the window and place it over the speed sensor to fool the system into believing it is below 70 knots. One additional glitch: the designers placed an identical switch, the one that controlled the aircraft's stabilization system, right next to the rotor trim switch. Two clicks to the right of the rotor trim switch reprogrammed the rotors; two clicks either way of the stab-aug switch turned it off and caused the back end to want to swap places with the front end, thus guaranteeing real excitement when you least needed it!

Years later, with an aviation safety degree in hand, I found myself buying and modifying venerable transport aircraft. As I sought to import technology into this 1950s airframe, issues of real estate, integration and user habits/expectations surfaced. Most notable was an airborne navigator's lament over the non-glare surface of his new color radar display. It seems he couldn't erase the grease pencil marks that he put on the screen during the conduct of rendezvous, even though the system was designed with all of the electronic "bells and whistles" to perform that function without the use of a grease pencil.

As a U.S. Marine Corps naval aviator, I was directly concerned with pilot error and aviation safety issues for over 30 years. At times, I was part of the problem (as in the first example), and other times I sought to be part of the solution. As an instructor pilot, simulator instructor and training squadron commander, I was where the "rubber meets the runway" on developing new pilots' skills in situational awareness, cockpit resource management and aeronautical decision making while teaching them the basic stick and rudder skills. In my role as an acquisition executive, it was my goal to acquire training devices, aircraft systems and decision support tools that enhanced the pilots' ability to understand their immediate flight environment.

I have watched the human factors discipline evolve as an applied science within aviation. At first, human factors was about fitting humans into specific cockpits. Later, we began to focus on those "life stressors" that could distract a pilot and ultimately cause a mishap. The infusion of total quality management and the airlines' cockpit resource management principles into military flight operations continued to fuel the evolution towards a user-centered environment. Nonetheless, the issue of pilot error remained. As the Marine Corps aviation safety director, I noted that during Desert Shield/Storm, once again, more aircraft crashed avoiding suspected enemy fire than were shot down. As I reviewed mishap reports and sat on boards determining aviator suitability for continued flight duties, I was increasingly plagued with the conundrum of why highly trained, physically fit, well-disciplined aviators would end up flying their superbly maintained, perfectly functioning, state-of-the-art flying machines into the ground. In search of answers to this question, I found myself enrolled in a terminal degree program in human factors psychology.

I have seen the effects of poor design. I have been exposed to a variety of aircraft and flight domains. I have observed good and bad aviation safety practices, and I was fortunate enough to be in positions where I could introduce procedures and methodologies that created safer flight environments. Clearly it has been an exciting, fulfilling and circuitous route to this point in my career, one that I could not have charted or anticipated. Currently, I am an adjunct professor in Embry-Riddle's extended campus program and a human factors consultant to the Federal Aviation Administration. In both capacities, I continue to explore the challenges associated with infusing consideration of human capabilities and limitations into aviation environments.

 
(Originally published in the September/October 2000 issue of Psychological Science Agenda, the newsletter of the APA Science Directorate.)