In recent news, there have been many stories about the Ethiopian Airlines flight that crashed shortly after takeoff. A lot of people, including President Trump, have pointed the finger at the plane's software and at the increasing automation of planes in general, saying that humans are needed to make split-second emergency decisions. But was this crash really the fault of automation, or is increased automation of planes safer than human action?
The plane in question was a Boeing 737 MAX, a state-of-the-art mid-to-large-sized plane and Boeing's fastest-selling model. It comes equipped with the latest in safety technology and software. What makes the 737 MAX unique, and different from its less-automated counterpart, the 737 NG, is a piece of software called the Maneuvering Characteristics Augmentation System (MCAS). Unlike the MCAS we're familiar with, this program is actually useful. Basically, the 737 MAX's engines are larger and mounted farther forward than on most planes in order to gain better fuel efficiency. This shifts the plane's balance, causing the nose to pitch up during certain maneuvers. What MCAS does is monitor the plane's angle of attack (AOA) and, when it climbs too high, automatically trim the nose back down so that the plane doesn't stall. This works just fine most of the time, and with the Federal Aviation Administration (FAA)'s regulations and safety checks, there was a grand total of one passenger death on U.S. airlines between 2010 and 2019.
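To make that loop concrete, here's a rough Python sketch of the kind of logic described above. To be clear, the real MCAS is proprietary Boeing flight-control code; every name and number below is invented for illustration, showing only the shape of the idea: read an angle-of-attack value, and if it's too high in manual flight, command nose-down trim.

```python
# Illustrative sketch only: the real MCAS is proprietary Boeing
# flight-control code. Every name and threshold here is invented
# to show the shape of the logic, not its actual values.

AOA_LIMIT_DEG = 14.0   # hypothetical angle-of-attack threshold
TRIM_STEP_DEG = 0.6    # hypothetical nose-down trim increment

def mcas_step(aoa_deg: float, flaps_up: bool, autopilot_on: bool) -> float:
    """Return a nose-down stabilizer trim command in degrees (0.0 = no action).

    MCAS only operates in manual flight with the flaps retracted,
    which is when the MAX's engine placement pitches the nose up.
    """
    if autopilot_on or not flaps_up:
        return 0.0
    if aoa_deg > AOA_LIMIT_DEG:
        return TRIM_STEP_DEG  # trim the nose down to head off a stall
    return 0.0
```

Notice that the function trusts whatever AOA value it is handed, which is exactly where the trouble starts.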
That's not to say the system is perfect. Obviously it is a tragedy that a plane crashed and people died, but pointing the finger at computers will not solve anything. Boeing made a terrible mistake in not informing its pilots of the difference between the 737 MAX and the 737 NG; they simply told pilots that "it flies like the NG." In a similar event just six months earlier, the Lion Air crash involving the same model, MCAS acted on a faulty angle-of-attack reading and repeatedly pushed the nose down. The potential for miscalculation or error is well known, not just with something like MCAS, but also with airspeed, turning angles, weather, and so on. So there is a way to disable MCAS and allow for manual control by the pilot, as there is for almost every automated system on a plane: the stabilizer-trim cutout switches. The problem is that Boeing never informed its pilots of MCAS or how to disable it. The 737 NG has no MCAS, so pilots simply assumed they could pull back on the control column (yoke) to bring the nose up, not realizing that on the MAX they had to cut out the trim system entirely. It is too soon to know for sure what happened with the recent crash, but there is a good chance it has the same cause as the last one, which is why so many pilots are angry about it.
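Here's how that failure mode plays out in the same toy model (reusing the mcas_step sketch above, with the same caveat that all values are invented). Note what's missing from the loop: the pilot's yoke. With one stuck sensor, the nose-down commands just keep coming, and only the cutout flag, standing in for the physical stabilizer-trim cutout switches, stops them.

```python
# Same toy model, same caveats: invented numbers, illustrative only.

def run_flight(aoa_readings, cutout_engaged=False):
    total_nose_down_deg = 0.0
    for aoa in aoa_readings:
        if cutout_engaged:  # the one override that actually works here
            break
        total_nose_down_deg += mcas_step(aoa, flaps_up=True, autopilot_on=False)
    return total_nose_down_deg

stuck_sensor = [22.5] * 10  # AOA vane stuck at an absurdly high reading

print(run_flight(stuck_sensor))                       # 6.0 degrees and climbing
print(run_flight(stuck_sensor, cutout_engaged=True))  # 0.0 once cut out
```

A pilot who has never heard of this loop has no reason to reach for the cutout switches, which is the whole point of the next paragraph.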
Since the recovery of the plane's black boxes, it has been revealed that this crash was indeed similar to the Lion Air crash: MCAS malfunctioned and pushed the nose down while the pilots tried to pull up and failed to disable it. The problem here isn't the concept of MCAS, or automation in general, or younger pilots "getting lazy" and relying on automation. The problem is that every situation must be accounted for and prepared for accordingly. The pilots had no idea that MCAS was even in place. If Boeing had simply informed pilots of the system and what to do in case of a malfunction, this entire problem could have been avoided. Human reaction time and the ability to adapt to changing situations IS important, which is why pilots are still necessary even in the most automated of planes, but blaming everything on technology is simply foolish. It is impossible for a system to improve without first testing it and finding the errors.
Errors on a plane, of course, could mean the deaths of hundreds, which is why pilots are there: to disable the systems and take manual control in case anything goes wrong. Humans, like technology, are reliable most of the time, failing only in certain extreme cases. It is simply nonsensical to expect any state-of-the-art, brand-new technology to go off without a single hitch on its first run. However, it is equally nonsensical not to prepare for the dangers, and Boeing went wrong by failing to rely on its pilots, and to inform them of the danger, as much as it relies on its technology. Pilots are good. Technology is good. There is nothing wrong with automation; the problem is failing to prepare for every eventuality. Soon enough, we may have automated systems that guard against the dangers human pilots can cause, such as falling asleep in the cockpit or having some kind of medical emergency. The problems in both crashes started before the planes ever took off, with Boeing failing to inform the pilots of this new technology. True synchronization between man and machine would allow for the most efficient, safest flying available, as is the case with most automated industries.