Presentation on theme: "Part 1: Learning from Unexpected Events"— Presentation transcript:
1 Part 1: Learning from Unexpected Events
Michael P. Silver, MPH
Director, Scientific Affairs and Patient Safety
HealthInsight
2 The Design Challenge
“Every system is perfectly designed to get the results it achieves”
Benefits and harm are designed into health care systems
3 Design of health care systems and processes
Elements configured by designers include:
People – education, training, orientation, …
Materials – medications, supplies, …
Tools – medical equipment, information technology, forms, communication media, …
Methods – procedures, diagnostic and treatment processes, management practices, policies, communications practices, coordination of effort, …
4 Sources of design failure in complex systems
Design flaws are expected because (for example):
Actual operations are more complex than our design models
System elements interact in unexpected ways
Procedures, tools, and materials are used in ways not anticipated
Multiple designers bring potentially different goals and assumptions
Safety features and defenses become degraded over time
Environmental conditions, expectations, and demands change over time
5 The world points out our design flaws to us
In the course of actual operations, design flaws will produce:
Errors, unsafe acts, procedure violations
Glitches
Near-misses
Accidents
Injury
Sentinel events/catastrophes
(We may also learn from other people’s failures)
6 We have a hard time listening to the world!
Victims of (apparent) success:
We may not hear about many failures (especially “small” ones) or recognize them as associated with our decisions
Designs work most of the time
Dedicated staff negotiate hazards, improvise, and complete the design for us
Because of this, and other biases, failures and accidents may be understood as the product of individual failures rather than design flaws
Difficult to attend to all of the lessons available:
Rush to closure
Review focused on immediate causes / reluctance to look deeper
There’s always something else that seems more pressing
7 A lesson from the Columbia accident
Brown, Clark, Anderson, Ramon
Husband, Chawla, McCool
10 Was foam insulation supposed to fall off the external fuel tank?
No! (of course not)
Yet, it did:
Foam loss had occurred on about 80% of previous missions
A recognized, major source of damage to thermal protection tiles
A routine occurrence
Large pieces of foam had detached from the left bipod ramp in approximately 10% of previous missions
Only two missions earlier, a large piece of foam from the left bipod ramp impacted a ring that attaches the solid rocket boosters to the external fuel tank
During the Columbia mission, the foam strike was considered a significant maintenance issue, but not a mission safety issue
11 On-Line Exercise: You’re the Teacher
Identify an example from your clinical experience* that can be used to illustrate the “drift toward failure” observed in the Columbia disaster:
Warning signs observed, warning signs ignored
Normalization of deviance
Past successes used as evidence of future success
*If you have no such experience, develop a plan to connect with clinician(s) to identify examples.
12 Incident Investigation/RCA – In Context
Event detection → Event reporting → Incident investigation/RCA → Improved understanding of systems/processes → Effective solutions/improved designs → Increased safety
Safety culture:
Reporting culture
Assumptions about the meaning of error and accidents
Demonstrated organizational value and commitment to safety
Prospective (organizational) accountability
“Just” response to error, unsafe acts, and accidents
Organizational learning
13 Event Reporting
In order to learn from unexpected events, we must first learn of them:
Sentinel events
Patient harm
No-harm events
Near misses
Unsafe acts, errors
Hazardous conditions (“accidents waiting to happen”)
Lesser events not only provide opportunities for learning, but by their sheer volume represent substantial waste, frustration, and re-work.
14 Off-Line Assessment: Event reporting in your facility
Gather a team to review how effectively event reporting is supported:
Does reporting place undue burdens on reporters?
Have expectations for reporting been clearly communicated?
“No harm, no report”?
Consistent message from supervisors?
“Not our shift/not our department”?
How do staff know that there will be a “just” response to events identified?
How do we provide feedback to reporters (both immediate and in terms of actions taken)?
15 Learning from Recovery/Mitigation
System reliability and safety can be improved by:
Reducing failures, errors, and unsafe acts
Increasing the likelihood that these are detected and prevented from propagating
Both
Attending to recovery also promotes a better understanding and appreciation of defenses
16 Part 1 Summary
Incident investigation and root cause analysis:
Are central to the ongoing system design process
Are difficult to do well
Depend on and reinforce event reporting
Are an outgrowth of, and partially define, organizational safety culture
Are a key process of safety management
18 Why event investigation is difficult
Natural reactions to failure
Tendency to stop too soon
Overconfidence in our re-constructed reality
The “root cause” myth
19 Our reactions to failure
Typical reactions to failure are:
Retrospective – hindsight bias
Proximal – focus on the “sharp end”
Counterfactual – lay out what people could have done
Judgmental – determine what people should have done; the fundamental attribution error
20 On-Line Exercise: “any good nurse …”
In the course of event investigations and RCA you can expect to encounter the “any good nurse …” reaction.
Describe how this might negatively impact the investigation process.
How can you anticipate and/or respond to this reaction?
21 Stopping too soon
We lack training in event investigation
We don’t ask enough questions
We have a shallow understanding of the causes of events
We lack resources and commitment to thorough investigations
22 Overconfidence in our re-constructed reality
People perceive events differently:
Common sense is an illusion
Unique senses
Unique knowledge
Unique conclusions
23 The “root cause” myth
Accidents have multiple causes
Root cause analysis (RCA) is not about finding the one root cause
24 The “New View” of human error
Human error is not the cause of events; it is a symptom of deeper troubles in the system
Human error is not the conclusion of an investigation; it is the beginning
Events are the result of multiple causes
29 Creating the holes
Active failures:
Errors and violations (unsafe acts) committed at the “sharp end” of the system
Have direct and immediate impact on safety, with potentially harmful effects
Latent conditions:
Present in all systems for long periods of time
Increase likelihood of active failures
30 “Latent conditions are present in all systems. They are an inevitable part of organizational life.”
James Reason, Managing the Risks of Organizational Accidents
31 Root Causes
A root cause is typically a finding related to a process or system that has potential for redesign to reduce risk
Active failures are rarely root causes
Latent conditions over which we have control are often root causes
32 On-Line Exercise: The failed RCA
The evidence suggests that, currently, most RCAs conducted in health care are ineffective.
How would you know that an RCA had failed? What are the characteristics of a failed RCA?
(Off-line) What RCA practices and procedures do you think would be likely to produce a failed RCA?
33 On investigating human error
“The point of a human error investigation is to understand why actions and assessments that are now controversial made sense to people at the time. You have to push on people’s mistakes until they make sense—relentlessly.”
Sidney Dekker
34 Getting Inside the Tunnel
[Diagram: imagined alternatives (Possibility 1, Possibility 2) contrasted with the actual outcome; Screen Beans® clip art]
35 Outside the Tunnel / Inside the Tunnel
Outside the tunnel:
Outcome determines culpability
“Look at this! It should have been so clear!”
We judge people for what they did
Inside the tunnel:
Quality of decisions is not determined by outcome
Realize evidence does not arrive as revelations
Refrain from judging people for errors
36 Lessons from the Tunnel
We haven’t fully understood an event if we don’t see the actors’ actions as reasonable.
The point of a human error investigation is to understand why people did what they did, not to judge them for what they did not do.
37 Summary: The New View of human error
Events are the result of many causes
Active failures and latent conditions create holes in our system’s defenses
Root causes are causes with potential for redesign to reduce risk
Active failures are rarely root causes; latent conditions are often root causes
Getting inside the tunnel will help us understand why events occur
38 Questions? Comments?
References:
Dekker S. The Field Guide to Human Error Investigations. Burlington, VT: Ashgate, 2002.
Gano DL. Apollo Root Cause Analysis: A New Way of Thinking. Yakima, WA: Apollonian Publications. (Last accessed 7/16/06.)
Reason J. Managing the Risks of Organizational Accidents. Brookfield, VT: Ashgate, 1997.