Carrying hundreds of people through the sky at hundreds of miles an hour is no small feat of complexity. More than 100,000 commercial flights take off and land each day, but two deadly air crashes in six months have shocked passengers, regulators, and industry alike.
Crashes of Boeing’s 737 Max in Indonesia and Ethiopia offer a window into all that complexity. Boeing and its CEO Dennis Muilenburg want the story to be simple: a software problem that can be fixed with a quick patch. But that doesn’t capture the mistakes made by Boeing and American aviation regulators in certifying the plane to carry passengers.
By now, you may well have heard of MCAS, software that automatically pitches the 737 Max's nose downward to keep it from stalling in mid-air. It exists only because Boeing wanted to upgrade its 737 without changing it fundamentally: rather than starting from scratch, it hung larger engines, mounted farther forward, on the existing airframe, which made the nose more prone to pitching up toward a stall. In the emerging picture of the two accidents, the software repeatedly pushed the nose down because the single angle-of-attack sensor it depended on was feeding it faulty data.
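To see why one faulty sensor matters so much, consider a deliberately simplified sketch, in Python, of the kind of control loop described above. It is purely illustrative: the function names, the 15-degree stall threshold, and the trim increment are assumptions made up for this example, not Boeing's actual logic.

```python
# Illustrative sketch only: a trim loop that trusts a single angle-of-attack (AOA) sensor.
# Threshold and increment values are invented for the example.

STALL_AOA_DEG = 15.0   # assumed angle-of-attack limit for this sketch
TRIM_STEP_DEG = 0.5    # assumed nose-down trim applied per cycle


def read_aoa_sensor() -> float:
    """Stand-in for the single AOA vane the system depends on.

    If this one reading is wrong, everything downstream is wrong too.
    """
    return 22.0  # a stuck, implausibly high value, as from a failed vane


def trim_command(aoa_deg: float) -> float:
    """Return nose-down trim (degrees) when AOA appears too high, else zero."""
    return TRIM_STEP_DEG if aoa_deg > STALL_AOA_DEG else 0.0


total_trim = 0.0
for cycle in range(5):
    aoa = read_aoa_sensor()          # no cross-check against a second sensor
    total_trim += trim_command(aoa)  # keeps pushing the nose down, cycle after cycle
    print(f"cycle {cycle}: AOA={aoa:.1f} deg, cumulative nose-down trim={total_trim:.1f} deg")
```

The point of the sketch is not the numbers but the structure: a single bad input, trusted without question, produces a command that repeats itself until something outside the loop intervenes.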
But all of that pales next to what will likely be the focus of the investigations into the crashes: the training and user experience of the people in the cockpits. Pilots did not receive enough training to understand how MCAS worked, and two vital safety features were sold as optional extras: a display showing what the angle-of-attack sensors were reporting, and a warning light for when the plane's two sensors disagreed.
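Conceptually, that optional disagree warning is a short cross-check between two readings. Here is another hedged sketch; the 10-degree threshold is an assumption for the example, not the certified value. What it shows is how little logic the optional feature actually requires.

```python
# Illustrative sketch only: the kind of cross-check behind a "disagree" alert.
# The 10-degree threshold is an assumption for the example.

DISAGREE_THRESHOLD_DEG = 10.0


def aoa_disagree(left_aoa_deg: float, right_aoa_deg: float) -> bool:
    """True when the two angle-of-attack vanes differ enough to distrust either one."""
    return abs(left_aoa_deg - right_aoa_deg) > DISAGREE_THRESHOLD_DEG


# A failed left vane reading 22 degrees against a healthy right vane at 4 degrees
# would trip the warning and tell the crew not to trust the automation.
print(aoa_disagree(22.0, 4.0))  # True
```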
Minimizing training and cockpit changes was an economic decision: The upgraded plane would be more attractive to potential purchasers if they did not have to spend expensive hours retraining their pilots. The Federal Aviation Administration determined Boeing’s training and safety plans were fine. Now, investigators want to know why. The answers could be costly for Boeing, and for America’s reputation as a leader in the safe deployment of aviation technology.
Software is easy to blame, because for many people, computer science is a mystery. But these crashes emerged from an experience we’re all familiar with: the pressure to deliver on a tight timetable, the temptation to cut corners, and the hope that in a big, complex world, one little kludge won’t mess up the whole program.