Safety-I, Safety-II, and the Messy Details of Clinical Work
Robert L Wears, MD, MS, PhD
University of Florida / Imperial College London
International System Safety Society, 8 October 2015

apologia and cautions
- background in healthcare, almost exclusively the ER
- trying to overcome my background as a doctor …

two important differences
- organic vs engineered systems

two important differences
- irreducible ambiguity

motivation
- general agreement that we are not making progress on safety as fast as we would like
- what's typically being said:
  - we have not been 'Protestant enough'
  - more rigour (eg, EBM)
  - greater accountability
  - 'just do it harder'

motivation
- general agreement that we are not making progress on safety as fast as we would like
- what's not being said: we have the wrong mental model of safety – utopian scientism
  - "… enduring Enlightenment projects …"
  - "… rationality can create a better, more controllable world …"
  - "… taken for granted by safety researchers because it appears so ordinary, self-evident and commonsensical."*
*Dekker 2012

patient safety orthodoxy
- technocratic, instrumental, 'measure-and-manage' approach
- myopic – failing to question the underlying nature of problems
- overly simplistic – transferring solutions from other sectors
- negligent of the knock-on effects of change
- 'amateur social science' that "glosses over the complexities of health care organisation and delivery"

a missed opportunity
- clinical expertise is necessary but not sufficient for safety
- "'errors' in medicine, and the adverse events that may follow, are problems of psychology and engineering, not of medicine" – J Senders, 1994
- we needed to partner with the 'safety sciences': psychology, human factors engineering, social science, communication, etc
- but instead we got 'scientific-bureaucratic medicine': managerial rationalism wearing the mantle of science
- 'the safety Nazis': 'we have ways of making you safe…'

safety is a 'wicked problem'
"… it is far harder to make progress on safety than we thought … the programmatic approaches (checklists, team training, reporting) are all quite positive about the effects of their interventions, but the experience we have when trying to apply those approaches is uniformly unsatisfying.
"… the factors that create 'the safety problem' are deeply embedded in the system of work [including all the incentives and organizational structures that surround and promote work], and these programs don't alter those factors. The system we have is a product of numerous compromises and sacrifices that are needed to 'make things work', and the deep system that results is far more anchored and grounded than we appreciate.
"… Instead, we have chosen to do things that give the appearance of improving safety so that we can feel better … these programs make it easier for us all to live with deeply flawed, dangerous systems.

safety is a 'wicked problem'
"This explains why we have so many programs for safety: we embrace a program to make ourselves feel better about the system of work. This does make us feel better, for a while. But eventually the deep system demonstrates, in clear, unambiguous fashion, that we haven't made real progress. Instead of taking this as evidence that we have fundamentally misunderstood what is going on, we conclude that we chose the wrong program and look for another one to restore our sense that we are making progress on safety.
"To be sure, there are real advances. Our technology, knowledge, and skill are constantly improving. But we choose to exploit these advances to accomplish more or to spend less, rather than to make the work itself safer. We struggle to do this in a 'safety neutral' way — ie, trying to keep the bad outcomes at about the same level as before while benefitting from the improvements — but this is always a process of discovery, because the forms of failure are constantly changing." – R I Cook, 2014

limits of the Enlightenment
"good ideas that are nevertheless incorrect" – René Amalberti

simple models of accidents are delusions

complex adaptive systems
distinguish between simple, complicated, and complex problems:
- simple (baking a cake): little expertise required; highly standardized; formulaic solutions work
- complicated (landing on the moon): many causes, many parts; break into simple problems & manage piece by piece
- complex (raising a child): complexity emerges from the interaction of parts; can't be decomposed; must deal with the whole

complex adaptive systems
distinguish between simple, complicated, and complex problems in clinical work:
- simple (taking vital signs): little expertise required; highly standardized; formulaic solutions work
- complicated (placing a central line): many causes, many parts; break into simple problems & manage piece by piece
- complex (handing off a pt or unit): complexity emerges from the interaction of parts; can't be decomposed; must deal with the whole

complex adaptive systems
separating complicated from complex is essential:
- complicated (placing a central line): particulars and context largely irrelevant; paradigmatic mode of thinking
- complex (handing off a pt or unit): particulars, situatedness, and context are everything; narrative mode of thinking


modern theories of accidents: the evolution of system safety
- simple: linear, chain of events
- complicated: interdependent
- complex: nonlinear; coupling, resonance, emergence


view from safety-I
- accidents come from erratic acts by people (variability, mistakes, errors, violations)
- study and count accidents to understand safety (tends to look backwards)
- focus on components
- safety is acquired by constraining workers via standardisation, guidelines, procedures, rules, interlocks, checklists, barriers

assumptions in safety-I
- our systems are well-designed and well-understood
- procedures are correct and complete
- systems are basically safe and well-protected
- reliability = predictable, invariant
- variation is the enemy
- safety is an attribute (something a system has)
- conditions are well-anticipated and well-specified


view from safety-II
- accidents are prevented by people adapting to conditions
- study normal work to understand safety (tends to look forward)
- focus on inter-relations
- the aim is to manage, not eliminate, the unexpected
- safety is enacted by enabling workers via making hazards, constraints, and goal conflicts visible, and enhancing the repertoire of responses

assumptions in safety-II
- our designs are incomplete; procedures are out-dated
- our systems are poorly understood
- systems are basically unsafe
- reliability = responsiveness
- variation is necessary
- safety is an activity (something a system does)
- possible failure modes have not been anticipated: a 'continuing expectation of surprise'

safety-II
- complex socio-technical systems are intractable and underspecified, with variable demands
- resources (time, people, material, information) are limited and uncertain
- workers adjust to meet conditions, creating variability
- adjustments are always approximate (because resources are limited)
- approximate adjustments usually reach goals and make things go safely
- approximate adjustments sometimes fail, or make things go wrong
"Knowledge and error flow from the same mental source; only success can tell one from another." – Ernst Mach, 1905

safety-I vs safety-II summary

safety-I:
- defined by its opposite: failure
- systems well designed & maintained; procedures correct & complete
- people (ought to) behave as expected & trained
- accidents come from variability in the above
- therefore safety comes from limiting & constraining operators via standardization, procedures, rules, interlocks, barriers
- critical inquiry
- 'work as imagined'

safety-II:
- defined by its goal: success
- systems poorly understood, incomplete, underspecified
- people (ought to) adjust behaviour & interpret procedures
- accidents come from incomplete adaptation
- therefore safety comes from supporting operators via making boundaries, hazards, and goal conflicts visible, and enhancing the repertoire of responses
- appreciative inquiry
- 'work as done'

philosophical bases

safety-I:
- linear, proportional, tractable
- behaviour explained by reduction
- positivist, Taylorist
- cause-effect simple, one-way
- controllable
- 'the one best way'
- work as imagined
- values declarative, technical knowledge
- complicated problems
- techne, episteme

safety-II:
- non-linear, non-proportional, intractable
- behaviour explained by emergence
- constructivist, interpretivist
- cause-effect multiple, reciprocal
- influence-able
- equifinality, multifinality
- work as done
- values practice, tacit wisdom
- complex, 'wicked problems'
- mētis, phronesis

empirical support
- direct observations & NSQIP data
- surgeons with the best results had just as many untoward events as those with the worst
- but they had better means of detection and a greater repertoire of responses
- de Leval 2000; Ghaferi

another important difference
- resilient vs brittle systems

resilience – multiple conceptions
- first appeared ~1600s, from Latin resiliens, "to rebound, recoil": re- "back" + salire "to jump, leap"
- rebound from some traumatic event

resilience – multiple concepts
- robustness: expand base capacity to handle more disruptions ('enlarging the design basis')
- brittleness vs graceful degradation: bringing 'extra' adaptive capacity to bear in the face of potential for surprise

contrasting examples (brittle vs gracefully degrading)
- GPS directions vs maps
- CDs and mp3s vs LPs
- most digital vs most analog

resilience – formal definition
the ability of systems to adapt so as to sustain key operations in the face of expected or unexpected challenges

resilience and success
- not just success in the face of threats (resilient systems still fail)
- a repertoire of behaviours: shifting performance, trading off goals to dynamically forestall failure, mitigate failure in progress, or seize opportunities
- "… redirect the failure pathway to another form from which recovery might be easier, less disruptive, less costly" – Cook, RI

but a problem
- resilience is only seen through its instantiations
- like static electricity: you can't see it, but you can see lightning

epiphenomena
"… seeing holes or deficiencies in hindsight is not an explanation of the generation or continued existence and rationalization of those deficiencies."
Dekker, S. W. A. (2011). Drift into Failure: From Hunting Broken Components to Understanding Complex Systems. Farnham, UK: Ashgate.

problem for engineering resilience
"… seeing heroic recoveries in hindsight is not an explanation of the generation or continued existence and rationalization of those recoveries."
à la Dekker, S. W. A. (2011). Drift into Failure: From Hunting Broken Components to Understanding Complex Systems. Farnham, UK: Ashgate.

hidden resilience
- resilience must be present before it is manifested
- "much of the stock of [a system's] response is in the form of latent behavioural potential … outside of awareness and taken for granted until interruptions and attempts at recovery call attention to it"
Christianson, M. K., Farkas, M. T., Sutcliffe, K. M., & Weick, K. E. (2009). Learning through rare events: significant interruptions at the Baltimore & Ohio Railroad Museum. Organization Science, 20(5),

WAI vs WAD: the messy details
- paramedics are told to hand off to the ED charge nurse, to get back out on the street faster
- but the charge nurse won't be taking care of the pt, so is not as interested in the details
- and will hand off to another nurse: the 'secret, second handoff'

WAI vs WAD: the messy details
- diagnostic workup for cancer should be 'fire & forget'
- but 2/3 of cases required 1 or more additional staff actions
- no difference in time to dx


risks in human activities
[Figure: human activities arrayed along a fatal-risk axis from very unsafe to ultra safe: mountaineering, professional fishing, off-shore drilling, oil industry (total), chemical industry (total), medical risk (total), oncology, ICU, emergency, radiotherapy, anesthesiology ASA 1-2, elective surgery, blood transfusion, fire fighting, satellite launch, space missions, rotary wing, trams & tubes, chartered flight, railways, nuclear industry, civil aviation; a 'no system beyond this point' line bounds the ultra-safe end.]

three contrasting safety models
(the same fatal-risk axis, from very unsafe to ultra safe, annotated from 'more safety-II' to 'more safety-I')

ultra resilient – unknowable events model (eg mountaineering; combat, wartime):
- context: taking risks is the essence of the work; a cult of fighter spirit, champions, heroes, villains
- safety model: power to experts: 'give me the best chances and the safest tools to survive in these adverse conditions and make exploits'
- training: learning through shadowing, acquiring professional experience, "training for zebra", working on knowing one's own limitations

reliability model – react to events model (eg medical risk (total), radiotherapy, blood transfusion, elective surgery, chronic care; finance, fire fighting, food industry, processing industry):
- context: risk is not sought out, but it is inherent in the activity; a cult of group intelligence and adaptation to changing situations
- safety model: power to the group: its ability to organize itself (roles), provide mutual protection to its members, apply procedures, react to anomalies, adapt, and perceive and make sense of changes in context
- training: teamwork, to gain knowledge of abilities and adaptability in applying procedures to suit the context

ultra safe – precluded events model (eg civil aviation, nuclear industry, railways, chartered flight):
- context: risk is excluded as much as possible; a cult of applying procedures and safety rules, by an effective supervisory organization
- safety model: power to the regulators of the system, to avoid exposing front-line actors to unnecessary risks
- training: teamwork, to apply procedures and manage work even if abnormal events occur

(also placed on the axis: anesthesiology ASA 1, innovative medicine (transplant, oncology …), ICU, trauma, ED, drilling industry, chemical industry (total); a 'no system beyond this point' line bounds the ultra-safe end)

conclusions – maybe?
- health care has many resilient systems
- the sources of that resilience are not clear
- resilience is being consumed to enhance productivity
- this is normal (from Richard Cook)

resources
- workshop and call for papers: White Paper on Patient Safety, Turning Patient Safety on its Head
- plans for the 7th REA Symposium will appear here
- Fairbanks et al (2014). Resilience and resilience engineering in healthcare. Joint Commission Journal on Quality and Patient Safety, 40(8),
- Woods, D. (2015). Four concepts for resilience and the implications for the future of resilience engineering. Reliability Engineering & System Safety, 141,

contact information
Robert L Wears, MD, MS, PhD