Imagine you are stuck in an airport, your flight delayed. The weather is appalling, belting down rain in sheets, with gusts of wind so strong that several aircraft have had to abort their landing. No flights are taking off. Suddenly, your flight is called to the gate. Strangely, it is the only one called. Never mind, you are just happy to be getting out of there. While standing in line for economy check-in you spot the flight crew walking through the priority lane. As they pass, you strain to overhear them.
“… When flying B-52s, I never once diverted for a little weather front, so I’m damned if I’m about to start now.”
“But captain,” the first officer says. “No one else is taking off.”
“Bunch of sniveling babies! We’ll show them how to fly.”
Are you still happy to board that flight?
When you’re about to put your life in the hands of an airline captain, it isn’t just a macho attitude that should concern you. You should be equally worried to overhear him say that he didn’t want to take off in this weather, but Air Traffic Control had assigned him a slot, so he felt he had little choice but to fly into the mess. Or what if the captain had told the first officer not to worry, since things usually turned out fine – after all, most of the surviving passengers had walked away largely unscathed from their previous crash.
You will be pleased to learn that today’s aircrews are unlikely to hold such attitudes. Selection procedures, as well as rules and systems, ensure that such attitudes are rare within professional aviation.
However, there is another reason why such attitudes are rare: the airline industry actively trains aircrew to recognize and avoid them.
This was not always so. In the past, the aviation industry harbored cultures that supported several hazardous attitudes. How did this come about?
Airline pilots were once revered as god-like heroes. This high status led many pilots to develop potentially hazardous attitudes. In addition, a culture developed that made it difficult to challenge such attitudes. However, a spate of air disasters caused the industry to re-evaluate its culture.
One of the key people who led this re-evaluation was Alan Diehl, an aviation psychologist who developed a training program to improve decision-making. It was based upon a series of questions that were designed to reveal a pilot’s vulnerability to various hazardous attitudes.
Pilots were presented with a hypothetical situation, then asked to select the reason they would have ended up in that situation. For example, consider the question below:
You are on a flight to an unfamiliar, rural airport. Flight service states that continued flight is not recommended, since heavy coastal fog is forecast to move into the destination airport about the time you expect to land. You first consider returning to your home base, where visibility is still good but decide instead to continue as planned. Why did you reach this decision?
- You hate to admit that you cannot complete your original flight plan.
- You resent the suggestion by flight service that you should change your mind.
- You feel sure that things will turn out safely, that there is no danger.
- You reasoned that since your actions would make no real difference, you might as well continue.
- You feel the need to decide quickly, so you take the simplest alternative.
Few pilots would readily admit to behaving in the above way. However, by forcing pilots to make a choice, the questionnaire revealed their vulnerability to five hazardous attitudes: anti-authority (“don’t tell me what to do”), impulsivity (“do something, quickly”), invulnerability (“it won’t happen to me”), macho (“I can do it”), and resignation (“what’s the use?”).
These vulnerabilities were then presented to the student pilots, as in the example below:
[Figure: Relative vulnerabilities of a student pilot. This individual is particularly vulnerable to impulsiveness and should guard against situations demanding a quick response.]
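For a software audience, the mechanics of such a questionnaire are easy to make concrete. The sketch below is a minimal, hypothetical illustration (not Diehl’s actual instrument): each answer choice in a scenario maps to one of the five hazardous attitudes, and tallying a pilot’s choices across scenarios yields a relative vulnerability profile like the one in the figure above.

```python
from collections import Counter

# The five hazardous attitudes identified in aviation decision-making training.
ATTITUDES = ["anti-authority", "impulsivity", "invulnerability", "macho", "resignation"]

# Hypothetical answer key for one scenario: each choice (a-e) reveals
# one attitude. The order follows the fog-at-destination example above.
SCENARIO_KEY = {
    "a": "macho",           # hate to admit you cannot complete the plan
    "b": "anti-authority",  # resent the suggestion to change your mind
    "c": "invulnerability", # feel sure things will turn out safely
    "d": "resignation",     # your actions make no real difference
    "e": "impulsivity",     # decide quickly, take the simplest option
}

def vulnerability_profile(answers: list[str]) -> dict[str, float]:
    """Tally the attitudes behind each chosen answer and normalize to fractions."""
    tally = Counter(SCENARIO_KEY[a] for a in answers)
    total = sum(tally.values())
    return {att: tally.get(att, 0) / total for att in ATTITUDES}

# A respondent who repeatedly picks the "quick and simple" option shows up
# as relatively vulnerable to impulsivity.
profile = vulnerability_profile(["e", "e", "c", "e", "a"])
```

In a real instrument the mapping would differ per scenario and the scoring would be calibrated, but the core idea is the same: forced choices expose relative tendencies rather than asking people to self-report flaws directly.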
Diehl used this information to develop a program of antidote approaches. As a result of measures like this, along with other safety programs, commercial aviation is now among the safest forms of transportation.
What software development can learn from aviation safety
If you look at the existing cultures in software development teams today, you will often see one or more of these potentially hazardous attitudes. Many of us can recall working with a macho or impulsive project manager, a developer with anti-authority issues, or a tester who professed feelings of helplessness.
These attitudes are usually unintended. After all, nobody wakes up and decides to behave hazardously at work that day. Nevertheless, many of us end up developing hazardous attitudes. As in the aviation industry, these attitudes are often a reflection of the culture we work in. Therefore, to change hazardous attitudes, an organization needs to change part of its culture.
We have studied the work of Diehl and other aviation psychologists to develop a survey and training program that reveals potentially hazardous attitude cultures in software development. As our next step, we aim to develop a program to address our vulnerability to these attitudes.
The survey takes about 20 minutes to complete, is confidential, and is currently free. The results will tell you your relative vulnerability to five hazardous attitude cultures in software development, and give you some guidance on how to counter your vulnerabilities. If you are interested in taking the survey, follow this link.
This post first appeared on the Expleo blog here.