
An Analysis Of The Boeing 737 MAX Accidents

The Boeing 737 MAX accidents were caused by a combination of overreliance on software-based decision-making and inadequate pilot training: the pilots were not aware that they needed to disable the automated system after the failure of a single sensor. The entire MCAS system relied on one small angle-of-attack sensor whose damage could have, and did have, catastrophic consequences. Pilots who assumed they could rely completely on the computer were doomed. It is an important lesson that software alone cannot be deployed without proper training for both end users and those responsible for maintenance and upkeep, and it should be a warning to other companies eyeing automation.

Boeing and Airbus together dominate the commercial airliner market, each constantly vying to eclipse the other's share. In 2010, Airbus announced that it would update its main model, the A320. The new engine was larger but roughly 15% more fuel-efficient, which would save airlines enormously on operating expenses, since fuel accounts for roughly 40% of an airline's total costs. In addition, because other changes to the aircraft were minimal, pilots could theoretically fly the updated model safely after very little new training, saving the airlines even more money.
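As a rough sanity check on that claim, here is a minimal back-of-the-envelope sketch. The 15% efficiency gain and the 40% fuel share come from the figures above; the airline's total operating cost is a purely illustrative assumption:

```python
# Back-of-the-envelope estimate of the fuel savings described above.
annual_operating_cost = 100_000_000  # hypothetical airline, USD per year
fuel_share = 0.40                    # fuel as a fraction of total operating costs
fuel_burn_reduction = 0.15           # claimed efficiency gain of the new engine

annual_fuel_cost = annual_operating_cost * fuel_share
savings = annual_fuel_cost * fuel_burn_reduction
print(f"Estimated savings: ${savings:,.0f} per year "
      f"({savings / annual_operating_cost:.1%} of total operating costs)")
# -> Estimated savings: $6,000,000 per year (6.0% of total operating costs)
```

In other words, a 15% fuel-burn reduction translates into roughly a 6% cut in total operating costs, which at fleet scale is an enormous amount of money.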

Boeing's quickest response was to fit a new engine to its main model, the 737-800. But the A320 sits higher off the ground than the 737-800, so while the A320 could easily accommodate a larger engine, the 737-800 could not.

However, Boeing soon announced that it had found a way to mount the engine higher and slightly farther forward on the wing to maintain ground clearance; on the new MAX series, the top of the engine nacelle sits above the wing surface.

Boeing also announced that the new MAX handled almost identically to the old 737, so pilots could fly it without much additional training. As a result, Boeing quickly reversed the market trend, and orders came flooding in.

But the new engine placement created a worry: under certain conditions, the aircraft's nose tends to pitch up more than before. If the nose pitches up too far, the wing exceeds its critical angle of attack and stalls, losing lift, and the aircraft can crash.
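To make the stall mechanism concrete, here is a deliberately simplified sketch. The lift equation L = ½ρv²SC_L is standard aerodynamics, but every constant below (lift-curve slope, 15° critical angle, airspeed, wing area) is a generic textbook-style assumption, not real 737 MAX data:

```python
def lift_coefficient(aoa_deg, cl_slope=0.1, critical_aoa_deg=15.0):
    """Toy lift-coefficient model: roughly linear in angle of attack
    up to the critical angle, then an abrupt post-stall drop."""
    if aoa_deg <= critical_aoa_deg:
        return cl_slope * aoa_deg              # pre-stall: more AoA, more lift
    return cl_slope * critical_aoa_deg * 0.6   # post-stall: sudden loss of lift

def lift_newtons(aoa_deg, airspeed_ms=120.0, wing_area_m2=125.0, rho=1.225):
    # L = 1/2 * rho * v^2 * S * C_L
    return 0.5 * rho * airspeed_ms**2 * wing_area_m2 * lift_coefficient(aoa_deg)

for aoa in (5, 10, 15, 16):
    print(f"AoA {aoa:2d} deg -> lift ~ {lift_newtons(aoa):,.0f} N")
# Lift grows with angle of attack right up to the stall, then collapses;
# this cliff is what MCAS was designed to keep the aircraft away from.
```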

Therefore, Boeing devised a new solution: angle-of-attack sensors near the nose feed data to flight-control software that automatically pushes the nose back down, without pilot intervention. Boeing named the system MCAS (Maneuvering Characteristics Augmentation System).
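To illustrate the design, here is a deliberately simplified sketch of MCAS-style logic. The thresholds, trim increments, and function names are illustrative assumptions, not Boeing's actual flight-control code; the one faithful detail is that the original design acted on a single angle-of-attack sensor:

```python
AOA_THRESHOLD_DEG = 14.0  # illustrative activation threshold
TRIM_INCREMENT = 2.5      # illustrative nose-down trim per activation

def mcas_step(aoa_sensor_deg, current_trim):
    """One control cycle of a simplified, original-style MCAS.

    The single point of failure: the decision depends on ONE
    angle-of-attack sensor. A damaged vane stuck at a falsely
    high reading triggers nose-down trim cycle after cycle,
    even if the aircraft is actually flying level.
    """
    if aoa_sensor_deg > AOA_THRESHOLD_DEG:
        return current_trim - TRIM_INCREMENT  # push the nose down
    return current_trim

# Simulate a sensor stuck at a falsely high reading:
trim = 0.0
for cycle in range(5):
    trim = mcas_step(aoa_sensor_deg=22.0, current_trim=trim)
    print(f"cycle {cycle}: stabilizer trim = {trim}")
# The trim ratchets further nose-down on every cycle, which matches
# the repeated, hard-to-counter nose-down commands described below.
```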

Later investigations revealed that, due to erroneous MCAS activation, the MAX could pitch its nose down without pilot command for up to 10 seconds at a time under computer control, a motion difficult to counter using the control column alone.

And if the pilots are not aware of what is happening, a cycle begins: the aircraft dives, the pilots manually level it off, and the system pushes it into a dive again. The pilots' chances of winning that fight are small. That is why the software bears responsibility.

After these two tragedies, Boeing announced that MCAS had been modified: a software patch that takes about an hour to install, additional pilot training courses, and extremely detailed information disclosed on Boeing's official website.
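Here is a matching sketch of what the revised logic does differently, again with illustrative constants and names. The structural changes are the ones Boeing publicly described: MCAS now compares both angle-of-attack sensors, stands down when they disagree (Boeing cites a threshold of about 5.5 degrees), and activates at most once per high-AoA event, with authority limited so the pilots can always override it:

```python
AOA_THRESHOLD_DEG = 14.0    # illustrative activation threshold
MAX_DISAGREEMENT_DEG = 5.5  # cross-check limit, per Boeing's description
TRIM_INCREMENT = 2.5        # illustrative, and now limited in total authority

def mcas_step_revised(aoa_left_deg, aoa_right_deg, current_trim, already_fired):
    """One control cycle of the revised, cross-checked logic.

    Two differences from the original sketch above:
      1. Both AoA sensors are compared; a disagreement disables MCAS.
      2. MCAS fires at most once per elevated-AoA event.
    """
    if abs(aoa_left_deg - aoa_right_deg) > MAX_DISAGREEMENT_DEG:
        return current_trim, already_fired   # sensors disagree: stand down
    if already_fired:
        return current_trim, already_fired   # one activation per event
    if min(aoa_left_deg, aoa_right_deg) > AOA_THRESHOLD_DEG:
        return current_trim - TRIM_INCREMENT, True
    return current_trim, already_fired

# The same stuck sensor as before, but now the healthy sensor disagrees:
trim, fired = 0.0, False
for cycle in range(5):
    trim, fired = mcas_step_revised(22.0, 4.0, trim, fired)
    print(f"cycle {cycle}: stabilizer trim = {trim}")
# Trim stays at 0.0: the disagreement check prevents the faulty
# reading from ever commanding nose-down trim.
```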

[Image: the updated cockpit display, showing the angle-of-attack readout and the AOA DISAGREE alert]

The image above shows the latest version of the software patch: the upper readout is the angle-of-attack data, and below it is the angle-of-attack disagree warning.

This incident shows that, through digital transformation, Boeing greatly improved data utilization and production efficiency. However, such digital transformations are not all good; they can easily create an "automation paradox."

Automation makes systems easy to operate: even operators who are not fully professional can work for long periods with less training than before. But their manual skills then degrade through lack of practice, and when the automation itself fails, an insufficiently skilled operator cannot cope with the resulting emergency.

So we need to discuss seriously how to prevent the automation paradox: how to manage risk when software systems fail, when flight conditions are misjudged, or when systems are not adequately maintained. Perhaps automation in aviation needs to slow down?

Written & Edited by Alexander Fleiss & Wanqi Zhu
