Pilot error (sometimes called cockpit error) is a term used to describe a decision, action or inaction by a pilot or crew of an aircraft determined to be a cause or contributing factor in an accident or incident. The term includes mistakes, oversights, lapses in judgment, gaps in training, adverse habits, and failures to exercise due diligence in a pilot's duties.
The causes of pilot error include psychological and physiological human limitations. Various forms of threat and error management have been incorporated into pilot training programs to teach crew members how to deal with situations that arise throughout the course of a flight.
Accounting for the way human factors influence the actions of pilots is now considered standard practice by accident investigators when examining the chain of events that led to an accident.
Description
Usually in an accident caused by pilot error, it is assumed that the pilot in command (captain) makes an error unintentionally. However, an intentional disregard for a standard operating procedure (or warning) is still considered to be a pilot error, even if the pilot's actions justified criminal charges.
Pilot error is a mistake of decision or action by a pilot, particularly in an emergency. As the commander of the aircraft, the pilot is regarded as one of the most important factors: the pilot's decisions determine everything on board and can be affected by countless external elements. Analysis of accident patterns can help improve passenger safety.
The pilot may be a factor even during adverse weather conditions if the investigating body deems that the pilot did not exercise due diligence. The responsibility for the accident in such a case would depend upon whether the pilot could reasonably know of the danger and whether he or she took reasonable steps to avoid the weather problem. Flying into a hurricane (for other than legitimate research purposes) would be considered pilot error; flying into a microburst would not be considered pilot error if it was not detectable by the pilot, or in the time before this hazard was understood. Some weather phenomena (such as clear-air turbulence or mountain waves) are difficult to avoid, especially if the aircraft involved is the first aircraft to encounter the phenomenon in a certain area at a certain time.
Placing pilot error as a cause of an aviation accident has often been controversial. For example, the NTSB ruled that the crash of American Airlines Flight 587 was because of the failure of the rudder, which was caused by "unnecessary and excessive rudder pedal inputs" on the part of the co-pilot who was operating the aircraft at the time. Attorneys for the co-pilot, who was killed in the crash, argued that American Airlines' pilots had never been properly trained concerning extreme rudder inputs. The attorneys also claimed that the rudder failure was actually caused by a flaw in the design of the Airbus A300 aircraft and that the co-pilot's rudder inputs should not have caused the catastrophic rudder failure that led to the accident that killed 265 people.
Modern accident investigators avoid the words "pilot error", as the scope of their work is to determine the cause of an accident, rather than apportion blame. Furthermore, any attempt to blame pilots does not consider that they are part of a broader system, which in turn may be at fault for their fatigue, work pressure or lack of training. ICAO and its member states therefore adopted the Reason Model in 1993 in an effort to better understand the role of human factors in aviation accidents.
Nevertheless, pilot error remains a major cause of air accidents. In 2004, pilot error was cited as the primary cause of 78.6% of fatal general aviation (GA) accidents and of 75.5% of GA accidents overall in the US. Pilot errors have multiple causes: decision errors, for example, can arise from tendencies and biases as well as from breakdowns in how humans process incoming information. In aviation, such lapses can lead not only to further errors but also to fatalities.
Causes of pilot error
Pilots work in complex environments and are routinely exposed to high amounts of situational stress in the workplace, inducing pilot error which may result in a threat to flight safety. While aircraft accidents are infrequent, they are highly visible and often involve massive loss of life. For this reason, research on causal factors and methodologies of mitigating risk associated with pilot error is exhaustive. Pilot error results from physiological and psychological limitations inherent in humans. “Causes of error include fatigue, workload, and fear as well as cognitive overload, poor interpersonal communications, imperfect information processing, and flawed decision making.” Throughout the course of every flight, crews are intrinsically subjected to a variety of external threats and commit a range of errors that have the potential to negatively impact the safety of the aircraft.
Threats
The term "threat" is defined as any event "external to flight crew's influence which can increase the operational complexity of a flight." Threats may further be broken down into environmental threats and airline threats. Environmental threats are ultimately out of the hands of crew members and the airline, as they hold no influence over "adverse weather conditions, air traffic control shortcomings, bird strikes, and high terrain." Conversely, airline threats are not manageable by the flight crew, but may be controlled by the airline's management. These threats include "aircraft malfunctions, cabin interruptions, operational pressure, ground/ramp errors/events, cabin events and interruptions, ground maintenance errors, and inadequacies of manuals and charts."
Errors
The term "error" is defined as any action or inaction leading to deviation from team or organizational intentions. Error stems from physiological and psychological human limitations such as illness, medication, stress, alcohol/drug abuse, fatigue, emotion etc. Error is inevitable in humans and is primarily related to operational and behavioural mishaps. Errors can vary from incorrect altimeter setting and deviations from flight course to more severe errors such as exceeding maximum structural speeds or forgetting to put down landing or takeoff flaps.
Decision making
Reasons for the under-reporting of accidents include staff being too busy, confusing data entry forms, a lack of training and education, a lack of feedback to staff on reported data, and punitive organizational cultures. Wiegmann and Shappell developed three cognitive models to analyze approximately 4,000 pilot factors associated with more than 2,000 U.S. Naval aviation mishaps. Although the three models differ slightly in the types of errors they describe, all three lead to the same conclusion: judgment errors. These fall into three categories, namely decision-making, goal-setting, and strategy-selection errors, all of which were strongly associated with accidents. For example, on December 28, 2014, AirAsia Flight 8501, carrying seven crew members and 155 passengers, crashed into the Java Sea following several fatal mistakes by the captain in poor weather conditions. In this case, the captain chose to climb at an excessively high rate.
Psychological illness
The psychological fitness requirements for pilots are set out in aviation law and enforced by individual airlines. Facing multiple special challenges, pilots must exercise control in complicated environments. Psychological illness is typically defined as an unintended physical, mental, or social injury, harm or complication that results in disability, death, or increased use of health care resources. Physiological problems such as jet lag also commonly leave pilots feeling unwell after long flights. Psychological illness is regarded as a serious problem for pilots and has caused several fatal accidents in the past.
SilkAir Flight 185: On 19 December 1997, Flight 185 crashed into the Musi River near Palembang in southern Sumatra, killing all 97 passengers and seven crew on board. The investigation concluded that the evidence pointed to deliberate action by the captain, and the crash was determined to be intentional.
EgyptAir Flight 990: On 31 October 1999, the Boeing 767 crashed into the Atlantic Ocean south of Nantucket Island, Massachusetts, killing all 217 people on board. Although never conclusively proven, the crash was considered a deliberate action by the relief first officer.
Germanwings Flight 9525: On 24 March 2015, the aircraft crashed in the French Alps, killing all 144 passengers and six crew members. The co-pilot, Andreas Lubitz, had been treated for suicidal tendencies and declared unfit to work by a doctor, information he hid from his employer. During the flight, Lubitz locked the captain out of the cockpit and deliberately flew the aircraft into a mountain.
Threat and Error Management (TEM)
TEM involves the effective detection and response to internal or external factors that have the potential to degrade the safety of an aircraft's operations. Methods of teaching TEM stress replicability, or reliability of performance across recurring situations. TEM aims to prepare crews with the "coordinative and cognitive ability to handle both routine and unforeseen surprises and anomalies." The desired outcome of TEM training is the development of 'resiliency'. Resiliency, in this context, is the ability to recognize and act adaptively to disruptions which may be encountered during flight operations. TEM training occurs in various forms, with varying levels of success. Some of these training methods include data collection using the Line Operations Safety Audit (LOSA), implementation of crew resource management (CRM), cockpit task management (CTM), and the integrated use of checklists in both commercial and general aviation. Some other resources built into most modern aircraft that help minimize risk and manage threat and error are airborne collision and avoidance systems (ACAS) and ground proximity warning systems (GPWS). With the consolidation of onboard computer systems and the implementation of proper pilot training, airlines and crew members look to mitigate the inherent risks associated with human factors.
Line Operations Safety Audit (LOSA)
LOSA is a structured observational program designed to collect data for the development and improvement of countermeasures to operational errors. Through the audit process, trained observers collect information on the normal procedures, protocols, and decision-making processes flight crews undertake when faced with threats and errors during normal operations. This data-driven analysis of threat and error management is useful for examining pilot behavior in relation to situational analysis, and provides a basis for further implementation of safety procedures or training to help mitigate errors and risks. Observers on audited flights record crew performance and threat and error handling in detail.
LOSA was developed to assist crew resource management practices in reducing human error in complex flight operations. LOSA produces beneficial data that reveals how many errors or threats are encountered per flight, the number of errors which could have resulted in a serious threat to safety, and correctness of crew action or inaction. This data has proven to be useful in the development of CRM techniques and identification of what issues need to be addressed in training.
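The kind of tallying described above can be sketched in a few lines of code. The record format and field names below are illustrative assumptions, not the actual LOSA taxonomy:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Observation:
    """One observed threat or error on an audited flight (hypothetical schema)."""
    flight_id: str
    kind: str        # "threat" or "error"
    category: str    # e.g. "adverse weather", "checklist deviation"
    managed: bool    # whether the crew detected and handled it correctly

def summarize(observations):
    """Tally observations per flight and the share the crew mismanaged."""
    per_flight = Counter(o.flight_id for o in observations)
    mismanaged = [o for o in observations if not o.managed]
    rate = len(mismanaged) / len(observations) if observations else 0.0
    return per_flight, rate

obs = [
    Observation("AB123", "threat", "adverse weather", True),
    Observation("AB123", "error", "checklist deviation", False),
    Observation("CD456", "threat", "ATC shortcoming", True),
]
counts, mismanaged_rate = summarize(obs)
print(counts["AB123"], round(mismanaged_rate, 2))  # 2 0.33
```

Aggregates of this sort are what feed back into CRM training: they show which error categories recur and how often crew responses fall short.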
Crew Resource Management (CRM)
CRM is the "effective use of all available resources by individuals and crews to safely and effectively accomplish a mission or task, as well as identifying and managing the conditions that lead to error." CRM training has been integrated into, and made mandatory for, most pilot training programs, and is the accepted standard for developing human factors skills for air crews and airlines. Although there is no universal CRM program, airlines usually customize their training to best suit the needs of the organization, and the principles of each program are usually closely aligned. According to the U.S. Navy, there are seven critical CRM skills.
These seven skills comprise the critical foundation for effective aircrew coordination. With the development and use of these core skills, flight crews "highlight the importance of identifying human factors and team dynamics to reduce human errors that lead to aviation mishaps."
Application and effectiveness of CRM
Since the implementation of CRM circa 1979, prompted by NASA research into resource management, the aviation industry has seen tremendous evolution in the application of CRM training procedures. CRM has developed through a series of generations.
Today, CRM is implemented through pilot and crew training sessions, simulations, and through interactions with senior ranked personnel and flight instructors such as briefing and debriefing flights. Although it is difficult to measure the success of CRM programs, studies have been conclusive that there is a correlation between CRM programs and better risk management.
Cockpit Task Management (CTM)
Cockpit task management (CTM) is the "management level activity pilots perform as they initiate, monitor, prioritize, and terminate cockpit tasks." A 'task' is defined as a process performed to achieve a goal (e.g. flying to a waypoint or descending to a desired altitude). CTM training focuses on teaching crew members how to handle concurrent tasks that compete for their attention.
The need for CTM training stems from the limited capacity of human attention and working memory. Crew members may devote more mental or physical resources to a particular task which demands priority or affects the immediate safety of the aircraft. CTM has been integrated into pilot training and goes hand in hand with CRM. Some aircraft operating systems have made progress in aiding CTM by combining instrument gauges into one screen. An example of this is a digital attitude indicator, which simultaneously shows the pilot the heading, airspeed, descent or ascent rate and a plethora of other pertinent information. Such implementations allow crews to gather multiple sources of information quickly and accurately, freeing up mental capacity for other, more prominent tasks.
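The initiate/prioritize/terminate cycle at the heart of CTM can be sketched with a priority queue. The priority ordering below is an assumption borrowed from the common "aviate, navigate, communicate" mnemonic, used here only to assign illustrative ranks:

```python
import heapq

# Hypothetical priority ordering: lower number = higher priority.
# "Aviate, navigate, communicate" is a common airmanship mnemonic,
# used here purely to give the toy model a ranking.
PRIORITY = {"aviate": 0, "navigate": 1, "communicate": 2, "manage_systems": 3}

class TaskQueue:
    """Toy cockpit task manager: initiate, prioritize, and terminate tasks."""

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker preserves initiation order within a rank

    def initiate(self, name, category):
        heapq.heappush(self._heap, (PRIORITY[category], self._seq, name))
        self._seq += 1

    def next_task(self):
        # The highest-priority pending task receives attention first;
        # popping it models the crew terminating or completing that task.
        return heapq.heappop(self._heap)[2] if self._heap else None

q = TaskQueue()
q.initiate("radio position report", "communicate")
q.initiate("hold wings level", "aviate")
q.initiate("set course to waypoint", "navigate")
print(q.next_task())  # hold wings level
```

The point of the sketch is the ordering, not the data structure: whatever the crew is doing, control of the aircraft pre-empts navigation, which pre-empts communication, which is exactly the failure mode in fixation accidents such as Eastern Air Lines Flight 401.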
Checklists
The use of checklists before, during and after flights has established a strong presence in all types of aviation as a means of managing error and reducing the possibility of risk. Checklists are highly regulated and consist of protocols and procedures for the majority of the actions required during a flight. The objectives of checklists include "memory recall, standardization and regulation of processes or methodologies." The use of checklists in aviation has become an industry standard practice, and the completion of checklists from memory is considered a violation of protocol and pilot error. Studies have shown that increased errors in judgment, degraded cognitive function and changes in memory are among the effects of stress and fatigue, both of which are inevitable human factors in the commercial aviation industry. The use of checklists in emergency situations also contributes to troubleshooting and to retracing the chain of events which may have led to a particular incident or crash. Apart from checklists issued by regulatory bodies such as the FAA or ICAO, or checklists made by aircraft manufacturers, pilots also have personal qualitative checklists aimed at ensuring their fitness and ability to fly the aircraft. An example is the IM SAFE checklist (illness, medication, stress, alcohol, fatigue/food, emotion), one of a number of qualitative assessments pilots may perform before or during a flight to ensure the safety of the aircraft and passengers. These checklists, along with a number of other redundancies integrated into most modern aircraft operation systems, help ensure the pilot remains vigilant and, in turn, reduce the risk of pilot error.
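A qualitative checklist like IM SAFE can be modeled as a simple all-or-nothing gate. The six items come from the mnemonic above; the pass/fail logic and the function name are illustrative assumptions:

```python
# Illustrative IM SAFE self-assessment. The six items are from the
# mnemonic in the text; the go/no-go logic is an assumed simplification.
IM_SAFE = ["illness", "medication", "stress", "alcohol", "fatigue", "emotion"]

def fit_to_fly(answers):
    """Return (ok, concerns): ok is True only if every item is clear.

    `answers` maps each IM SAFE item to True if it is a concern.
    An incomplete checklist is itself an error, so it raises.
    """
    missing = [item for item in IM_SAFE if item not in answers]
    if missing:
        raise ValueError(f"checklist incomplete: {missing}")
    concerns = [item for item in IM_SAFE if answers[item]]
    return (not concerns), concerns

ok, concerns = fit_to_fly({item: False for item in IM_SAFE})
print(ok)  # True
```

Note the design mirrors how checklists are used in practice: a skipped item is treated as a failure of the procedure itself, not as an implicit "clear".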
Notable examples
One of the most famous incidents of an aircraft disaster attributed to pilot error was the nighttime crash of Eastern Air Lines Flight 401 near Miami, Florida on December 29, 1972. The captain, first officer, and flight engineer had become fixated on a faulty landing gear light and had failed to realize that the flight controls had been bumped by one of the crew, altering the autopilot settings from level flight to a slow descent. Told by ATC to hold over a sparsely populated area away from the airport while they dealt with the problem (with, as a result, very few lights on the ground visible to act as an external reference), the distracted flight crew did not notice the plane losing height and the aircraft eventually struck the ground in the Everglades, killing 101 out of 176 passengers and crew.
The subsequent National Transportation Safety Board (NTSB) report on the incident blamed the flight crew for failing to monitor the aircraft's instruments properly. Details of the incident are now frequently used as a case study in training exercises by aircrews and air traffic controllers.
During 2004 in the United States, pilot error was listed as the primary cause of 78.6% of fatal general aviation accidents, and as the primary cause of 75.5% of general aviation accidents overall. For scheduled air transport, pilot error typically accounts for just over half of worldwide accidents with a known cause.
The final report stated that the accident resulted from the pilots' lack of a common action plan during the approach, the continuation of the final approach below the minimum decision altitude without acquiring visual reference to the ground, the inappropriate application of flight control inputs during the go-around and after the activation of the Terrain Awareness and Warning System, and the flight crew's failure to monitor and control the flight path.