Presentation transcript: "Why Do Good Pilots Make Errors That Sometimes Lead to Accidents?"

1 Why Do Good Pilots Make Errors That Sometimes Lead to Accidents?
Key Dismukes, Ph.D.
Annual MMOPA Convention, 28 October 2010

2 Most Accidents Are Attributed to Pilot Error
How should we think of this? Why do experienced pilots make mistakes performing routine tasks?
- Lack of skill? Carelessness? Bad attitudes? Some other, more subtle answer?
Are errors inevitable?
- If so, what can we do about them?
- What is the relationship between errors and accidents?
How we answer these questions is crucial to aviation safety

3 Research on Pilot Performance
Has focused mostly on airline operations
- Extensive data
- Limited funding for GA safety
Extracted findings relevant to your type of flying
- Will not discuss errors made by novices
- GA operations can be more challenging than airlines'
Will not discuss Darwin award accidents
Focus on challenges faced by experienced, conscientious pilots

4 Overview of Talk
Research community's perspective on why all skilled operators (not just pilots) are vulnerable to error
Describe specific situations in which vulnerability to error is high
Practical countermeasures pilots can take to reduce vulnerability and to manage consequences of error

5 Consensus from Decades of Human Factors Research
Simply naming human error as "cause" is simplistic
- Does little to further understanding and prevent future accidents
Must avoid hindsight bias
Absent willful misconduct, a "blame and punish" mentality blocks the path to improving safety
Path to accidents is driven by interactions among multiple factors

6 [Diagram] Multiple factors converge to shape skilled performance (both effective and ineffective):
- Conditions (e.g., weather) and events
- Equipment and interface design
- Task demands
- Inherent characteristics and limitations of human perception and cognition
- Individual factors: goals; technical & interpersonal skills; experience and currency; physiological state; attitudes
- Social/organizational factors: community norms; regulations/enforcement; passenger expectations

7 Confluence of Factors in a CFIT Accident (Bradley, 1995)
[Diagram of interacting factors]
- Non-precision approach: ≥ 250 foot terrain clearance (are most pilots aware of this?)
- Weather conditions; rapid change in barometric pressure
- Airline's use of QFE altimetry
- Strong crosswind; autopilot would not hold, so PF selected Heading Select: additional workload, increased vulnerability to error
- Crew error (70 feet) in altimeter setting
- Tower window broke and tower closed; approach controller failed to update altimeter setting; altimeter update not available: 170 foot error in altimeter reading
- PF used Altitude Hold to capture MDA; Altitude Hold may allow altitude to sag 130 feet in turbulence
- PM used non-standard callouts to alert PF (training & standardization issues?)
- Aircraft struck trees 310 feet below MDA

8 How Can We Prevent Multiple Factors from Converging to Cause Accidents?
Must look for underlying themes and recurring patterns
Must develop tools to help individuals and organizations recognize the nature of vulnerability

9 Some Major Themes and Recurring Patterns (not an exhaustive list)
- Plan continuation bias
- Snowballing workload
- "Multitasking": concurrent task demands and prospective memory failures
- Ambiguous situations without sufficient information to determine best course of action
- Procedural drift
- Situations requiring very rapid response
- Organizational issues

11 Plan Continuation Bias (major themes/patterns)
Tendency to continue original or habitual plan of action even when conditions change
Not just "get-there-itis"
Operates subconsciously
Pilot fails to step back, re-assess the situation, and revise the plan

12 Example: Flight 1420, DFW to Little Rock (plan continuation bias)
2240: Departed DFW over two hours late
2254: Dispatch: thunderstorms left and right but LIT clear; suggest expedite approach
Crew concluded (from radar) cells were about 15 miles from LIT and they had time to land
Typical airline practice to weave around cells
- Hold or divert if necessary, but usually land
Crews are expected to use best judgment with only general guidance

13 Flight 1420, continued (plan continuation bias)
2234 to 2350 (landing): Crew received a series of wind reports
- Wind strength/direction varied, with worsening trend
- Crew discussed whether legal to land (tactical issue), but not whether to continue the approach (strategic issue)
2339:32: Controller reported wind shift: now 330/11
2339:45: Controller reported wind-shear alert: center field 340 at 10; north boundary 330 at 25; northwest boundary 010 at 15
- Alert contained 9 separate chunks of information
- Average human working memory limit is 7 chunks
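One plausible way to read the 9-chunk count (an inference; the slide does not spell out the arithmetic): each of the three reporting points (center field, north boundary, northwest boundary) carries a location, a wind direction, and a wind speed, giving 3 × 3 = 9 items to hold in working memory, against a typical capacity of roughly 7.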

14 Flight 1420, continued (plan continuation bias)
Crew requested runway change from 22L to 4R for better alignment with new wind
- Flight vectored around for new visual approach to 4R
Vectoring turned aircraft radar antenna away from airport
- Crew could not observe airport on radar for 7 minutes
Crew's response to wind reports was to try to expedite the visual approach to beat the storm
2344: Crew lost visual contact and requested vectors for ILS 4R
- Vectors took aircraft deeper into storm
- Crew requested tight approach, increasing time pressure

15 Flight 1420, continued (plan continuation bias)
By now crew was extremely busy, tired at the end of a long duty day, and in a difficult, stressful situation
2347: New weather report: RVR 3000; wind 350 at 30G45
- FO read back incorrectly as 030 at 45 (which would have been within crosswind limits)
- Controller failed to catch the incorrect readback (hearback often fails)
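A rough crosswind check helps show why the misread value mattered (a sketch; the roughly 040° heading of runway 4R and a wet-runway crosswind limit near 20 kt are assumptions, not stated on the slide):
crosswind ≈ wind speed × sin(angle between wind direction and runway heading)
Actual report, 350° at 30 gusting 45: angle ≈ 50°, so crosswind ≈ 30 × sin 50° ≈ 23 kt, gusting to about 34 kt (above a ~20 kt wet-runway limit)
As misread, 030° at 45: angle ≈ 10°, so crosswind ≈ 45 × sin 10° ≈ 8 kt (comfortably within limits)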

16 Flight 1420, continued (plan continuation bias)
2347:44: Captain: "Landing gear down"
- Sixth of 10 items on Before Landing checklist
- FO lowers landing gear
Distracted, FO forgot to arm ground spoilers and other remaining checklist items
- Captain failed to notice omission
Crew was extremely busy for the 2½ minutes from lowering gear to touchdown
Fatigue: awake 16 hours and on dark side of clock
Stress: normal response to threat, but:
- Narrows attention, preempts working memory
Combination of overload, fatigue, and stress impairs crew performance drastically

17 Flight 1420, continued (plan continuation bias)
Overloaded, captain forgot to call for final flaps but was reminded by FO
Lost sight of runway and reacquired it just above DH; unstabilized in alignment and sink rate
- Company had not established explicit policy requiring go-around
- Either landing or go-around would be in middle of thunderstorm
2350:20: Aircraft touched down right of centerline
- Veered right and left up to 16 degrees before departing runway
Unarmed spoilers did not deploy
Captain used normal reverse thrust (1.6 EPR)
- Limited to 1.3 EPR on wet runways to limit rudder blanking
2350:44: Crashed into structure at departure end of runway
- Aircraft destroyed; 10 killed, many injured

18 Flight 1420, conclusion (plan continuation bias)
Many factors and many striking features (much detail omitted)
Crew responded to events as they happened, trying to manage, but:
- Never discussed abandoning the approach
- Striking example of plan continuation bias
Experts in all domains are vulnerable to plan continuation bias
What causes this vulnerability?
- Still under research; multiple factors probably contribute

19 Plan Continuation Bias: Likely (Unconscious) Factors
Habitual plan has always worked in past (e.g., threading around storm cells)
- MIT study: T-storm penetration common on approach
- Leads to inaccurate mental model of level of risk
Information often incomplete or ambiguous and arrives piecemeal
- Difficult to integrate under high workload, time pressure, stress, or fatigue
Many situations are not clear-cut; rather, gradual deterioration
Expectation bias makes us less sensitive to subtle cues that situation has changed
Competing goals: safety versus mission success, passenger satisfaction
Framing bias influences how we respond to choices

20 Snowballing Workload
Under high workload our cognitive resources are fully occupied with immediate demands
- No resources left over to ask critical questions
Forced to shed some tasks, individuals often become reactive rather than proactive
- React to each new event rather than thinking ahead strategically
As situation deteriorates, we experience stress
- Compounds situation by narrowing attention and pre-empting working memory
Catch-22: high workload makes it more difficult to manage workload
- By default, continue original plan, further increasing workload
- When we most need to be strategic we are least able to be strategic

21 Prospective Memory Lapses
PM: the cognitive processes involved in remembering (and sometimes forgetting) to do things we intend to do later
In 5 of 27 major U.S. airline accidents attributed to crew error, inadvertent omission of a procedural step played a central role:
- Forgetting to set flaps/slats, to set hydraulic boost pumps to high, to turn on pitot heat before takeoff, to arm spoilers before landing
GA pilots often report inadvertent omissions to ASRS
Insurance for GA retractable-gear aircraft is 40% higher than for fixed gear
Why would highly experienced pilots, controllers, mechanics, and other operators forget to perform simple, routine tasks?
Another NASA study: The Multitasking Myth: Handling Complexity in Real-World Operations

22 Six Prototypical Situations for Forgetting Tasks
1) Interruptions: forgetting to resume task after interruption is over
2) Removal of normal cue to trigger habitual task, e.g.:
- "Monitor my frequency, go to tower at FAF"
- Consequence: landing without clearance
3) Habitual task performed out of normal sequence, e.g.:
- Deferring landing gear until later in the approach
4) Habit capture: atypical action substituted for habitual action
- Example: modified standard instrument departure
5) Non-habitual task that must be deferred
- "Report passing through 10,000 feet"
6) Attention switching among multiple concurrent tasks
- Examples: glitch in GPS data entry; cell phone use while driving

23 Glass Cockpits Help in Some Ways but Not Others
Enhance situation awareness & planning (e.g., nav displays)
Off-load workload once programmed, but:
- Increase workload if used unwisely (e.g., reprogramming revised landing clearance during approach)
- Increase monitoring demands (humans are poor monitors)
Pilots are vulnerable to automation complacency
- Automation actions & logic are not transparent to pilot
Pilot-automation interfaces are improving, but:
- Programming & data entry impose substantial memory load
Currency & proficiency are critically important
- Readily maintained by full-time pilots, not so easy for less frequent flyers

24 Checklists Are a Major Defense (not a panacea)
Vulnerable to same sorts of errors previously discussed
NASA study of checklist use by airline pilots
- Two findings especially relevant to GA:
Checklist items sometimes inadvertently skipped
- When interrupted, distracted, or multitasking
"Looking without seeing" (item is mis-set)
- Expectation bias
- Running on automatic (visual gaze on item too brief)

25 Carelessness???
Research: expert human operators in every domain are vulnerable to these types of errors
- (Even after eliminating Darwin award winners)
Human brains are not wired to be completely reliable in these prototypical situations
- Cannot completely eliminate error
Good news: we can reduce vulnerability through countermeasures (threat & error management)
- Ways to reduce number of errors
- Ways to catch errors to prevent consequences

26 Systematic Risk Assessment (countermeasures)
Must identify risks in order to manage them
- Create strategies in advance for addressing threats
- Causes "priming" (faster retrieval from memory)
Two types of risk: generic and specific to today's flight
Go beyond standard flight planning:
- Identify our assumptions (e.g., 15 kt headwind)
- Ask "What if my assumptions are wrong?" (sensitizes us to subtle cues in flight)
Counteract plan continuation bias:
- "Is this panning out the way I planned?"
- "How will I know it may not work out? Will I know in time?"
- Identify bottom lines in advance (e.g., bingo fuel)
- Have a back door

27 Workload Management (countermeasures)
Be on guard for effects of snowballing workload
- Recognize symptoms: rushing, anxiety, becoming reactive
Proactively manage workload:
- Buy time (e.g., holding, vectors)
- Shed lower priority tasks
- Use available resources (e.g., autopilot, ATC, passengers)
Do not allow yourself to rush
- Rushing saves a few seconds but increases errors enormously
Force yourself to take time for strategic assessment
Recognize vulnerability to forgetting critical tasks
- Always present, no matter how experienced in aircraft
- Increases under workload, stress, interruptions, distractions

28 Preventing Prospective Memory Lapses (countermeasures)
Identify where and when you will complete deferred tasks
- Also interrupted tasks and tasks out of normal sequence
- Say it out loud
- Ask person in right seat to help you remember
Pause to review when transitioning between flight phases
- For example: before taking the runway, at top of descent
- Have I forgotten anything?
- What do I need to remember coming up?
Create reminder cues; best if:
- Visually distinctive, unusual, and/or physically impede the next action
- For example, put suspended checklist in throttle quadrant

29 Protect Checklist Performance (countermeasures)
Recognize vulnerability to skipping items and misperceiving items
- No matter how experienced in aircraft
Slow down, be deliberate
Point to or touch each item checked
Annunciate checklist responses out loud
- Wait until after pointing to/touching item checked
Deliberate approach requires discipline
- Goes against the natural grain of fast, fluid response
- Substantially reduces errors

30 Passenger Briefing (countermeasures)
My experience: most vulnerable to error with a passenger I have not flown with before
- Passenger may interrupt me at crucial moments
- I divert my attention explaining operations
With a good briefing, a passenger can be a useful resource
Start with standard briefing:
- Airplane features, emergency egress, event sequence, etc.
Go beyond standard briefing:
- Sterile cockpit, watch for traffic, manage charts, prospective memory aid, etc.

31 The BFR Is Not Enough! (countermeasures)
Insurance companies often require more frequent recurrent training for advanced aircraft
Must go beyond normal and emergency procedures
Need opportunity to practice judgment and decision-making in diverse, challenging situations
Debriefing for lessons learned is essential
- Instructors should facilitate pilot self-analysis (self-debriefing)
- We can debrief ourselves after every actual flight
Thoughtful review of accident reports can help
- What led the accident pilot into that trap?
- How would I manage that situation?

32 Practice Judgment & Decision-Making (countermeasures)
NTSB cites problematic judgment & decision-making in most GA accidents
We must be prepared for a wide range of situations we hope never to encounter
- Without the assistance of a co-pilot or dispatch
Military and airline pilots undergo frequent realistic simulator training for diverse, challenging situations
GA pilots could benefit by using desktop simulation in a similar way
- Need the elements of surprise and dynamically unfolding situations
- Desktop IFR simulators could be enhanced
Even mental rehearsal alone is beneficial

33 A Pithy Summary
Chief of USMC Aviation Safety: "Fly smart, stay half-scared, and always have a way out."

34 More Information
- Dismukes, Berman, & Loukopoulos (2007). The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. Ashgate Publishing.
- Loukopoulos, Dismukes, & Barshi (2009). The Multitasking Myth: Handling Complexity in Real-World Operations. Ashgate Publishing.
- Berman, B. A., & Dismukes, R. K. (2006). Pressing the Approach: A NASA Study of 19 Recent Accidents Yields a New Perspective on Pilot Error. Aviation Safety World, 28-33.
Papers can be downloaded from http://human-factors.arc.nasa.gov/ihs/flightcognition/
This research was funded by the NASA Aviation Safety Program and the FAA

