Improving passenger safety means changing how technology is designed
Many aspects of air travel are completely automated, which is great most of the time, but leaves pilots unprepared when technology fails. We need to change how technology is designed to help pilots cope with unexpected situations.
Technology has revolutionized the aviation industry, and nowhere more profoundly than in the area of safety.
Improvements in propulsion mechanics have helped reduce the rate of in-flight engine failures from 1 case per 5,000 hours in 1970 to just 1 per 100,000 hours today. Structural engineering advances mean that failures caused by airframe corrosion do not necessarily render an aircraft inoperable. And in the event of a crash, fire-resistant materials can keep cabin fires at bay long enough to allow passengers to evacuate the aircraft.
Numbers bear out the benefits of technological progress. In 2014, more than 38 million commercial flights took to the skies, the highest total on record. Yet the global jet accident rate sits at one accident for every 4.4 million flights – the lowest in history.
Technology has also introduced challenges. Safety experts warn that an increasing reliance on technology can degrade basic flying skills. Warren VanderBurgh, a former airline pilot and instructor, famously warned that present-day pilots had turned into “children of the magenta,” a reference to the magenta-colored lines on cockpit displays that guide modern airliners. Our ability to predict how humans will interact with technology has also not kept pace with technological progress itself.
This explains why some systems aimed at improving safety have in fact led to accidents. In 2013, a jetliner crashed during a landing attempt in San Francisco. A subsequent investigation found that the pilot had changed an engine power setting expecting one outcome, without realizing the onboard computer was operating in a mode that assumed another. Three passengers were killed and more than 200 were injured.
System designers have labored to address these issues. But there is another danger, one that is increasingly common yet seldom discussed.
The ‘fly-by-wire’ system is a modern engineering marvel. A computer linking the cockpit to the aircraft’s control surfaces continuously analyzes pilot inputs. If the pilot pulls the control stick back, the computer recognizes that the pilot wants to climb and raises the aircraft’s nose. Such maneuvers are always performed within limits deemed safe by the aircraft manufacturer. In fact, the system is so advanced that it prevents the pilot from executing maneuvers that would jeopardize the aircraft’s safety. It does not, however, stop a pilot from attempting them. That freedom remains, and it is hardly unique to fly-by-wire. More and more, system designers are opting for technological solutions that mask the consequences of human error rather than prevent those errors from occurring. Given record-low accident rates, the general sentiment seems to be ‘if it ain’t broke, don’t fix it.’
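To make the idea concrete, here is a minimal sketch of how an envelope-protection control law of this kind might behave. The limit values, the stick-to-attitude scaling, and the names (EnvelopeLimits, flight_control_law) are illustrative assumptions chosen for this sketch, not the logic of any real aircraft.

```python
# Minimal sketch of fly-by-wire envelope protection (illustrative values only).
# The flight control computer accepts whatever the pilot commands, but clamps
# the resulting attitude targets to limits set by the manufacturer.

from dataclasses import dataclass


@dataclass
class EnvelopeLimits:
    max_pitch_deg: float = 30.0    # nose-up limit (hypothetical value)
    min_pitch_deg: float = -15.0   # nose-down limit (hypothetical value)
    max_bank_deg: float = 67.0     # roll limit (hypothetical value)


def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))


def flight_control_law(stick_pitch: float, stick_roll: float,
                       limits: EnvelopeLimits) -> tuple[float, float]:
    """Translate stick deflection (-1.0 .. 1.0) into attitude targets.

    The pilot is free to pull the stick fully back; the computer simply
    refuses to command an attitude outside the safe envelope.
    """
    pitch_target = clamp(stick_pitch * 45.0,   # raw demand in degrees
                         limits.min_pitch_deg, limits.max_pitch_deg)
    bank_target = clamp(stick_roll * 90.0,
                        -limits.max_bank_deg, limits.max_bank_deg)
    return pitch_target, bank_target


# Full aft stick: the demand is accepted, but the output never exceeds the limit.
print(flight_control_law(1.0, 0.0, EnvelopeLimits()))   # -> (30.0, 0.0)
```

Notice that the clamp only limits the consequence of an excessive command; nothing in this sketch prevents, or even flags, the attempt itself. That is precisely the design trade-off described above.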
Such thinking can be problematic, according to Greg Jamieson, a professor of industrial engineering at the University of Toronto. An expert on how humans and machines interact, Jamieson questions whether designers should be satisfied that their inventions have minimized the impact of human error. Or is there, as he puts it, “an additional responsibility to ensure those errors do not occur in the first place?”
Meeting this responsibility becomes critical when technology fails. And technology does fail. In 2014, a technical glitch in an air traffic control system grounded or delayed hundreds of flights in the United States. A similar incident occurred a year earlier in India, where the radar screens that air traffic controllers rely on to direct airplanes went blank for more than nine minutes. In both cases, human intervention was needed to resolve the situation.
Ensuring such intervention is safe means shifting focus from designing systems that accommodate undesirable human behavior to developing systems that change it. Such an approach has seen success before. The airline industry, for instance, has long struggled with what travel writer Spud Hilton calls “luggage that is more the size of a clown car than a carry-on.” Oversized carry-ons fill limited overhead cabin space, forcing other passengers to check their bags and, in turn, delaying flights. Airlines are responding with penalties designed to dissuade passengers from flouting carry-on rules. United Airlines, for example, sends some passengers back to the ticket counter to check oversized carry-on luggage for a $25 fee. So does Montreal-based Air Canada. These policies are designed, in the words of one airline executive, “to reshape passenger behavior.”
When it comes to designing the next generation of technology, engineers would be wise to follow the airlines’ lead.