The Limits of Expertise: The Misunderstood Role of Pilot Error in Airline Accidents


Presentation Transcript


  1. The Limits of Expertise: The Misunderstood Role of Pilot Error in Airline Accidents Key Dismukes, NASA Ames Research Center Ben Berman and Loukia Loukopoulos, San Jose State University Foundation at NASA Ames Research Center ALPA Air Safety Week, 19 August 2004

  2. Approach • Reviewed NTSB reports of the 19 U.S. airline accidents between 1990 and 2000 attributed primarily to crew error. • Asked: why might any airline crew, in the situation of the accident crew, be vulnerable to the same errors?

  3. Hindsight Bias • Knowing the outcome of an accident flight reveals what the crew should have done differently. • The accident crew does not know the outcome. • They respond to the situation as they perceive it at the moment.

  4. [Diagram] Crew responses to the situation are shaped by: • Immediate demands of the situation & tasks being performed • Social/organizational influences: formal procedures & policies; explicit goals & rewards; implicit goals & rewards • Actual “norms” for line operations • Training, experience, & personal goals • Human cognition characteristics & limitations

  5. A Truism • No one thing “causes” accidents. • Confluence of multiple events, actions taken or not taken, and environmental factors.

  6. Confluence of Factors in a CFIT Accident [flowchart] • Weather conditions: rapid change in barometric pressure; strong crosswind • Approach controller failed to update altimeter setting (training & standardization issues?) • Tower window broke; tower closed; altimeter update not available • Airline’s use of QFE altimetry • Crew error (70 feet) in altimeter setting → 170-foot error in altimeter reading • Non-precision approach: ≥ 250-foot terrain clearance • Autopilot would not hold; PF selected Heading Select; PF used Altitude Hold to capture MDA • Altitude Hold may allow altitude to sag 130 feet in turbulence (are most pilots aware of this?) • PM used non-standard callouts to alert PF • Additional workload? Increased vulnerability to error? • Aircraft struck trees 310 feet below MDA (a rough stack-up of these altitude errors follows below)
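A rough stack-up of the altitude errors listed on this slide, as a sketch. It assumes the two contributions were simply additive, which is an illustrative simplification rather than a reconstruction from the NTSB report; the figures themselves come from the slide above.

```python
# Back-of-envelope stack-up of the altitude errors listed on this slide.
# Assumes the contributions add linearly -- an illustrative simplification,
# not a claim from the NTSB report.

altimeter_reading_error_ft = 170  # stale setting plus 70 ft crew setting error
altitude_hold_sag_ft = 130        # possible Altitude Hold sag in turbulence

total_deviation_ft = altimeter_reading_error_ft + altitude_hold_sag_ft
print(f"Potential deviation below indicated altitude: {total_deviation_ft} ft")
# -> 300 ft: close to the 310 ft below MDA at which the aircraft struck trees,
#    and more than the >= 250 ft terrain clearance the approach guaranteed.
```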

  7. Chance Combination of Contributing Factors • Airline accidents are extremely rare in modern operations • Countermeasures are in place for single-point failures of equipment or human performance • Occasionally accidents slip through multiple defenses because of a chance combination of multiple factors • Number of possible combinations/permutations of factors is vast (a rough illustration follows below) • Difficult to devise countermeasures • Vague advice to pilots to “break the accident chain” is not helpful
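To get a feel for how vast that combination space is, a minimal sketch with made-up numbers: a pool of 20 candidate factors, any 5 of which happen to co-occur. Both figures are hypothetical, not from the study.

```python
from math import comb

# Hypothetical sizes, for illustration only.
n_candidate_factors = 20   # pool of possible contributing factors
factors_per_accident = 5   # how many happen to line up in one accident

print(comb(n_candidate_factors, factors_per_accident))  # 15504 distinct combinations
```

Even under these modest assumptions there are thousands of distinct ways factors can line up, which is why devising a specific countermeasure for each combination is infeasible.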

  8. Six Overlapping Clusters of Error Patterns • Inadvertent slips and oversights while performing highly practiced tasks under normal conditions • Inadvertent slips and oversights while performing highly practiced tasks under challenging conditions • Inadequate execution of non-normal procedures under challenging conditions • Inadequate response to rare situations for which pilots are not trained • Judgment in ambiguous situations • Deviation from explicit guidance or SOP

  9. 1) Inadvertent Slips/Oversights in Practiced Tasks under Normal Conditions • Examples: • Omitting procedural step or checklist item • Remembering altimeter setting incorrectly • Misjudging landing flare • Identical to errors pilots frequently report to ASRS and ASAP and errors observed in LOSA • Commonplace error had to combine with multiple other factors to result in accident

  10. 2) Inadvertent Slips/Oversights in Practiced Tasks under Challenging Conditions • Probability of commonplace errors goes up with workload, time pressure, fatigue, and stress • Snowball effects: events/decisions/actions increase workload, time pressure, and stress downstream, raising the chance of further problems and errors (a simple compounding illustration follows below)
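A minimal way to quantify the snowball, assuming independent per-task errors: the chance of at least one error in n tasks is 1 − (1 − p)^n, so small increases in per-task error rate and task count compound quickly. Both rates below are hypothetical, chosen only to show the compounding, not drawn from the accident data.

```python
# P(at least one error in n tasks) = 1 - (1 - p)^n, assuming independent errors.
# Both rates below are hypothetical, chosen only to show the compounding.

def p_any_error(p_per_task: float, n_tasks: int) -> float:
    return 1 - (1 - p_per_task) ** n_tasks

baseline = p_any_error(0.001, 100)   # normal workload
snowball = p_any_error(0.003, 150)   # higher error rate, more tasks deferred/added

print(f"baseline: {baseline:.1%}, snowballed: {snowball:.1%}")
# baseline: 9.5%, snowballed: 36.3%
```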

  11. 3) Inadequate Execution of Non-normal Procedures under Challenging Conditions • Failure to recover from spiral dive, stall, or windshear • Veridian study suggests existing training not sufficient • Surprise, confusion, and stress may impede correct diagnosis of upset and timely execution of appropriate procedure

  12. 4) Inadequate Response to Rare Situations for which Pilots are not Trained • Examples: • False stickshaker activation just after rotation • Oversensitive autopilot drove aircraft down at Decision Height • Anomalous airspeed indications past rotation speed • Uncommanded autothrottle disconnect with non-salient annunciation • Surprise, confusion, stress, and time pressure play a role • No data on what percentage of airline pilots would respond adequately in these situations

  13. 5) Judgment and Decision-making in Ambiguous Situations • Examples: • Continuing approach in vicinity of thunderstorms • Not de-icing, or not repeating de-icing • No algorithm exists to calculate when to break off an approach; company guidance is usually generic • Crew must integrate incomplete and fragmentary information and make the best judgment they can • If they guess wrong, crew error is found to be the “cause” • Accident crew judgment & decision-making may not differ from that of non-accident crews in similar situations: • Lincoln Lab study: penetration of storm cells on approach is not uncommon • Other flights may have landed or taken off without difficulty a minute or two before the accident flight • Questions: • What are actual industry norms for these operations? • Is guidance sufficient for crews to balance competing goals? • Does the industry implicitly tolerate/encourage less conservative behavior as long as crews get by with it?

  14. 6) Deviation from Explicit Guidance or SOP • Example: attempting to land from an unstabilized approach resulting from a slam-dunk approach • Simple willful violation, or a more complex issue? • Are stabilized-approach criteria published/trained as guidance or as absolute bottom lines? • What are the norms in the company and the industry? • Pilots may not realize that struggling to stabilize an approach before touchdown imposes such workload that they cannot evaluate whether the landing will work out

  15. Cross-Cutting Factors Contributing to Crew Errors

  16. Situations Requiring Rapid Response • Nearly 2/3 of the 19 accidents • Examples: upset attitudes, false stickshaker activation after rotation, anomalous airspeed indications at rotation, autopilot-induced oscillation at Decision Height, pilot-induced oscillation during flare • Very rare occurrences, but high risk • Surprise is a factor • Inadequate time to think through the situation; automatic response required

  17. Challenges of Managing Concurrent Tasks • A factor in the great majority of these accidents • Workload was quite high in some accidents, but in most, time was available to perform all tasks • Crews are especially vulnerable to error when switching attention among tasks, interrupted, distracted, or forced to defer tasks out of normal sequence • Vulnerability is inherent in basic cognitive processes: • Can attend to only one distinct task at a given instant • Once attention is diverted from a task, do not always remember to resume it if not prompted (a toy illustration follows below) • Better monitoring can help prevent/catch errors • But monitoring is itself a concurrent task, vulnerable to the same factors that produce errors
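The prospective-memory point above can be caricatured in a toy model. Everything here is illustrative: the task names and the resume-only-if-cued rule are assumptions for the sketch, not a cognitive model from the study.

```python
# Toy model: attention handles tasks one at a time; a task pushed aside by an
# interruption is finished only if some later cue re-prompts it.
# Entirely illustrative -- not a model from the accident study.

def fly_sequence(tasks, interruption_at, cued=False):
    done, deferred = [], []
    for task in tasks:                # one task at a time
        if task == interruption_at:
            deferred.append(task)     # interruption defers the task
        else:
            done.append(task)
    if cued:                          # e.g., a callout or checklist prompt
        done.extend(deferred)         # resumed late, but resumed
    return done                       # without a cue, the task is silently lost

tasks = ["set flaps", "run landing checklist", "confirm gear down"]
print(fly_sequence(tasks, "run landing checklist"))             # checklist never done
print(fly_sequence(tasks, "run landing checklist", cued=True))  # done out of sequence
```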

  18. Equipment Failures and Design Flaws • Occurred in 2/3 of these accidents • Some equipment failures/design flaws precipitated the chain of events • Example: false stickshaker after rotation • Some equipment failures/design flaws undermined the crew’s efforts to respond • Example: stickshaker failed to activate when aircraft approached stall

  19. Stress • Hard to evaluate extent, but stress is a normal physiological/behavioral response to threat • Acute stress hampers performance: • Narrows attention (“tunneling”) • Reduces working memory capacity • Combination of surprise, stress, time pressure, and concurrent task demands can be a lethal setup • NASA is beginning a research project on the effects of stress on crew performance

  20. Shortcomings in Training and/or Guidance • >1/3 of the accidents • Inadequate guidance to pilots about known problems (e.g., high sensitivity of wings without leading-edge devices to minute amounts of frost) • Upset attitude recovery training • How to deal with the fact that it is not possible to train for every situation?

  21. Plan Continuation Bias • Unconscious cognitive bias to continue the original plan in spite of changing conditions • Appears stronger as one nears completion of the activity (e.g., approach to landing) • Bias may prevent noticing subtle cues indicating that original conditions have changed • May combine with other cognitive biases: • Frequency sampling bias (“It’s always worked before”; a back-of-envelope illustration follows below) • Reactive responding is easier than proactive thinking
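The frequency sampling bias lends itself to a back-of-envelope check with hypothetical numbers (neither figure is from the study): even a genuinely dangerous 1-in-1000 condition will usually be absent from any one pilot's personal experience.

```python
# Hypothetical rates: how often a rare hazard never shows up in one's own experience.
p_event_per_approach = 1 / 1000   # illustrative hazard rate
n_similar_approaches = 300        # illustrative personal experience

p_never_seen = (1 - p_event_per_approach) ** n_similar_approaches
print(f"{p_never_seen:.0%} chance of never having encountered it")  # 74%
```

So "it's always worked before" is exactly what personal experience would report even when the hazard is real.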

  22. Social/Organizational Issues • Actual norms may deviate from Flight Operating Manual • Little data available on extent to which accident crews’ actions are typical/atypical • Little study or acknowledgment of competing pressures • e.g., on-time performance vs. conservative response to ambiguous situations • Pilots may not be consciously aware of influence of internalized competing goals

  23. Countermeasures? No Easy Solutions • Recognize that most accidents are systems accidents • Identify hidden vulnerabilities of systems • Pilots, managers, and designers of equipment & procedures should be well educated about human cognitive characteristics & limitations • Review procedures to ensure they do not contribute to inherent cognitive vulnerabilities • Insist on conservative hard bottom lines in critical situations • Need better info on how the airspace system typically operates and how crews respond • LOSA, FOQA, ASAP, NAOMS • Beef up training: • Upset attitude recovery and other rapid-response situations • Monitoring • Acknowledge inherent trade-offs between safety and system efficiency • Include all parties in analysis of trade-offs • Make policy decisions explicit

