Case Study 5: The Crash of Asiana Flight 214
The crash of Asiana Flight 214 on July 6, 2013, was a highly publicized aviation accident rooted largely in problems with flight-deck automation. Of the 307 people aboard the Boeing 777, three lost their lives and 187 were injured, 49 of them seriously. Weather conditions were ideal for a visual approach, which the pilots elected to fly. Despite the flight crew's extensive experience, the approach went badly wrong, prompting a critical examination of what happened.
This analysis focuses on applying systems safety techniques to understand one of the system malfunctions that contributed to the crash. By examining the interaction between automation systems and pilot decision-making, we can identify key failures and recommend strategies for prevention. The investigation utilizes data from video footage, accident reports, and credible sources to construct a comprehensive picture of the system failures involved.
The primary malfunction examined involves the Boeing 777's automated flight control systems, specifically the autopilot and autothrottle. During the final approach, the pilots relied heavily on this automation without adequate manual intervention when they had difficulty controlling the aircraft's descent rate and airspeed. An analysis using the Fault Tree Analysis (FTA) method indicates that over-reliance on automation, coupled with ambiguous interface cues, produced a chain of failures that culminated in the aircraft striking the seawall short of the runway.
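To make the structure of such an analysis concrete, the sketch below expresses the failure chain described above as a small fault tree in Python. The event names, gate arrangement, and probability values are illustrative assumptions rather than figures from the accident investigation; the example only shows how basic events combine through AND/OR gates to produce a top-event estimate.

```python
# Minimal, illustrative fault-tree sketch (not the NTSB's actual model).
# Event names and probabilities are hypothetical assumptions chosen only to
# show how FTA combines basic events through AND/OR gates.

from dataclasses import dataclass
from typing import List


@dataclass
class BasicEvent:
    name: str
    p: float  # assumed probability of occurrence per approach

    def probability(self) -> float:
        return self.p


@dataclass
class Gate:
    name: str
    kind: str               # "AND" or "OR"
    children: List[object]  # BasicEvent or Gate instances

    def probability(self) -> float:
        probs = [child.probability() for child in self.children]
        if self.kind == "AND":
            result = 1.0
            for p in probs:
                result *= p          # all child events must occur
            return result
        result = 1.0
        for p in probs:
            result *= (1.0 - p)      # OR: 1 - product of complements
        return 1.0 - result


# Hypothetical basic events drawn from the failures discussed above
over_reliance = BasicEvent("Over-reliance on autothrottle", 0.05)
ambiguous_cues = BasicEvent("Ambiguous mode/interface cues", 0.10)
late_takeover = BasicEvent("Delayed manual intervention", 0.08)

# Top event: an unstabilized approach that is not corrected in time
loss_of_speed_awareness = Gate("Loss of speed awareness", "AND",
                               [over_reliance, ambiguous_cues])
top_event = Gate("Unstabilized approach not corrected", "AND",
                 [loss_of_speed_awareness, late_takeover])

print(f"Estimated top-event probability: {top_event.probability():.6f}")
```

The probabilities here are placeholders; in a real FTA they would come from operational data or expert judgment, and the tree would be far larger, but the combination logic is the same.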
The Asiana Flight 214 accident underscores several critical issues regarding automation dependence and pilot-system interaction. The Fault Tree Analysis demonstrates how multiple failures, including inadequate training on automation, misinterpretation of system alerts, and insufficient manual flying skills, converged to cause the crash. The FTA identifies the root cause as the pilots' mismanagement of the automation, particularly their failure to take over manual control of thrust and flight path at the appropriate moment and their misunderstanding of the system's mode cues during the final approach.
Specifically, the pilots lost awareness of the aircraft's energy state because of ambiguous automation interface cues, which did not clearly indicate that the autothrottle had stopped maintaining the selected approach speed. Furthermore, the system's design did not adequately prompt the pilots to take manual control or provide sufficient feedback on mode changes. As a result, the crew continued to rely on the automation even as the airspeed decayed below safe thresholds, bringing the aircraft to the edge of an aerodynamic stall and resulting in the crash short of the runway.
To address these issues, the following corrective measures are recommended: First, enhanced pilot training that emphasizes manual flying skills and appropriate automation management should be prioritized. This includes simulation training for manual control during critical phases and explicit scenarios where automation disengagement is necessary. Second, the automation interface should be redesigned to improve clarity and provide more explicit alerts when approaching operational limits. For example, visual and auditory warnings should more effectively communicate critical system statuses and encourage pilots to take manual control when necessary.
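To illustrate the kind of explicit alerting recommended here, the sketch below shows one possible low-energy monitoring rule in Python. The ApproachState fields, thresholds, and alert wording are hypothetical assumptions for demonstration, not a description of any certified avionics logic.

```python
# Illustrative sketch of the explicit low-energy alerting recommended above.
# The ApproachState fields, thresholds, and alert wording are hypothetical
# assumptions for demonstration; they do not describe any certified avionics.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ApproachState:
    airspeed_kt: float          # current indicated airspeed (knots)
    vref_kt: float              # approach reference speed (knots)
    autothrottle_holding: bool  # True if the autothrottle is actively maintaining speed
    altitude_ft: float          # height above touchdown (feet)


def low_energy_alert(state: ApproachState, margin_kt: float = 5.0) -> Optional[str]:
    """Return an alert when airspeed decays below Vref minus a margin on final
    approach, escalating if the autothrottle is not maintaining speed."""
    if state.altitude_ft > 1000:
        return None  # monitor only the final approach segment
    if state.airspeed_kt >= state.vref_kt - margin_kt:
        return None  # speed within the acceptable range
    if not state.autothrottle_holding:
        return "WARNING: LOW AIRSPEED - AUTOTHROTTLE NOT MAINTAINING SPEED - ADD THRUST"
    return "CAUTION: AIRSPEED BELOW APPROACH REFERENCE"


# Example: airspeed has decayed well below the reference speed while the
# autothrottle is dormant, mirroring the feedback gap described above.
print(low_energy_alert(ApproachState(airspeed_kt=112, vref_kt=137,
                                     autothrottle_holding=False, altitude_ft=500)))
```

In this sketch the escalation from a caution to a warning is tied to whether the autothrottle is actually maintaining the commanded speed, directly addressing the feedback gap identified above.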
Furthermore, airlines should incorporate Crew Resource Management (CRM) training that fosters better communication and teamwork during automation-related procedures, reducing the risk of misinterpretation. Finally, regulatory authorities should review automation interface standards to ensure they provide clear, unambiguous cues that guide pilot actions and mitigate reliance on automation without proper oversight.
In conclusion, the Asiana Flight 214 crash highlights the dangers of excessive reliance on automation without adequate manual flying skills or interface clarity. Applying systems safety techniques like Fault Tree Analysis reveals how failure points in automation design and pilot training can converge to produce catastrophic outcomes. By implementing targeted improvements in automation interface design, pilot training, and procedural protocols, similar accidents can be prevented in the future, ensuring safer aviation operations.