Human Performance and Patient Safety


Human Performance and Patient Safety Jim McMenemy Winnipeg Regional Health Authority

Outline Human Factors definition Human Error Evolution of Human Error Understanding Organizations and Socio-technical systems Vulnerability and Countermeasures

Meaning of Human Factors What do we mean by “Human Factors”? “Human Factors is concerned to optimize the relationship between people and their activities, by the systematic application of human sciences, integrated within the framework of systems engineering.” (ICAO Digest No. 1)
EO#1: EXPLAIN WHAT THE DISCIPLINE OF HUMAN FACTORS IS ACCORDING TO ICAO, AND BRIEFLY EXPLAIN SOME OF THE TOPICS THAT ARE COVERED BY THE DISCIPLINE AND SOME OF THE TOPICS THAT ARE NOT.
<Advance the first line of the slide and ask the group> What do we mean by Human Factors? <Discuss and encourage the group to reach the broader definition that includes more than just individual human limitations> <Advance the second line of the slide>
According to ICAO, “Human Factors is concerned to optimize the relationship between people and their activities, by the systematic application of human sciences, integrated within the framework of systems engineering.” Human Factors is about people interacting with technology: it is about people in their working and living environments, and about their relationships with equipment, procedures and those environments. Just as importantly, it is about their relationships with other people. Its twin objectives can be seen as safety and efficiency (ICAO Circular 227). The field is cross-disciplinary in that it involves operations (in different industries), psychology, kinesiology, engineering, sociology, computer science and other fields. It demands that people work together to understand performance and make the system robust.
The Human Factors courses that you have taken in your operational careers (flight ops, ATC, maintenance, etc.) focus on teaching operational staff how to individually manage individual human factors (stress, fatigue, inattention, etc.). Before the implementation of SMS, we tried to move everyone to understand that Human Factors was about organizationally managing individual human factors (shift schedules, communication protocols, etc.).
With our transition to SMS, it is clear we now have a much greater understanding of the field as applied to aviation, and we now expect organizations to manage all human-technology interactions at the system level (individual, organizational, equipment, procedural and environmental interactions with the human); we now know a systems/organizational approach is much more successful than the individual approach. This is how systemic issues and solutions are found. It is critical that we begin to advance this understanding. All TCCA staff need to understand not only the individual human factors faced in day-to-day operational work, but also the human-equipment factors, the human-procedure factors, the human-environment factors, the human-organization factors and so on: organizationally, how to manage all human factors issues (human-technology interactions).
The discipline of Human Factors is much wider in scope than the aviation industry has traditionally thought. With the increase in complexity in our aviation system, we must begin to expand everyone's definition and understanding of Human Factors. ICAO defines Human Factors as the discipline "concerned to optimize the relationship between people and their activities, by the systematic application of human sciences, integrated within the framework of systems engineering" (ICAO Digest No. 1). The International Ergonomics Association sees no difference between the terms ergonomics and human factors, and defines the field as "the scientific discipline concerned with the understanding of the interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and methods to design in order to optimize human well-being and overall system performance". In simple terms, Human Factors is about optimizing the human-technology relationship to accomplish work. Technology here is used broadly: a procedure is a piece of technology.
One of the goals of the discipline is to manage systems well to support human goals while protecting against human limits. To give a specific example, we want to understand the more frequent human-technology breakdowns, and methods to identify those breakdowns in context, so that appropriate mitigations can be implemented. The context in which interactions break down is critically important: being able to identify and describe the context in which human performance takes place, and to explain how the context contributed, is a critical skill in the practice of Human Factors. With SMS, our goal is to have industry do this work; however, our inspectors must also understand how this work is done in order to meet their oversight responsibilities. Inspectors will need to know about Human Factors methods (e.g. Reason's model of accident causation), the attributes of good outcomes of those methods (e.g. accident reports), additional language to explain organizational factors and context, and so on.

Human Error Knowledge and Error flow from the same mental source; only success can tell one from the other. Ernst Mach (1905)

Human Error What is Human Error? Human Error is a generic term used to describe all those occasions where a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to outside intervention (Reason, 1990).
EO#3: DEFINE HUMAN ERROR, BRIEFLY DESCRIBE THE HISTORY OF HUMAN ERROR, EXPLAIN THE OLD VIEW AND THE BAD APPLE THEORY (BASIC ATTRIBUTION ERROR), WHY IT IS PROBLEMATIC, AND HOW THE NEW VIEW HELPS US LEARN FROM OUR ERRORS.
A classification scheme used by ICAO: “SLIPS AND LAPSES” are errors which result from some failure in the execution of an action sequence, regardless of whether or not the plan which guided them was adequate to achieve its objective. “MISTAKES” are failures in the selection of an objective or the means to achieve it, irrespective of whether or not the actions directed by the scheme run according to plan. Slips and lapses are essentially conditioned or automatic responses, with little, if any, conscious decision making. Mistakes involve deliberate decision making and evaluation, based on knowledge, experience and mental models that have worked well in the past. “ADAPTATIONS” can be defined as deliberate, but not necessarily reprehensible, deviations from those practices deemed necessary (by designers, managers, regulators) to ensure safe operation.
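The slips/mistakes/adaptations taxonomy above can be sketched as a small decision procedure. This is our own illustration of the classification logic, not an ICAO tool; the field names (`plan_was_adequate`, `executed_as_planned`, `deliberate_deviation`) are invented for the sketch.

```python
# Minimal sketch of the slips / mistakes / adaptations taxonomy.
# Assumptions (ours, not ICAO's): each error event can be summarized by
# three yes/no judgments, and deliberate deviations are classified first.

from dataclasses import dataclass


@dataclass
class ErrorEvent:
    plan_was_adequate: bool      # was the chosen plan capable of meeting the goal?
    executed_as_planned: bool    # did the actions follow the plan?
    deliberate_deviation: bool   # was the departure from procedure intentional?


def classify(e: ErrorEvent) -> str:
    if e.deliberate_deviation:
        return "adaptation"      # deliberate, not necessarily reprehensible
    if not e.executed_as_planned:
        return "slip/lapse"      # execution failure; the plan itself may have been fine
    if not e.plan_was_adequate:
        return "mistake"         # plan faithfully executed, but the plan was wrong
    return "no error"


# A fumbled step in an otherwise sound procedure is a slip/lapse:
print(classify(ErrorEvent(plan_was_adequate=True,
                          executed_as_planned=False,
                          deliberate_deviation=False)))  # prints "slip/lapse"
```

The ordering encodes the definitions on the slide: intent distinguishes adaptations, execution distinguishes slips and lapses, and planning distinguishes mistakes.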

Traditional Approach #1 People make mistakes on the job because of: Stupidity Carelessness Complacency Incompetence, etc.

Traditional Error Prevention Make rules Enforce rules Punish violators Fire them Suspend them Retrain them Counsel them If you follow the rules you cannot have an accident

Traditional Approach #2 Humanistic Accidents happen because of Human Error People do not try to make mistakes They must be Broken, defective, deficient….. Therefore “Fix the people”

Error Prevention by Fixing the People Decision-making training Be more: Vigilant Careful More, more, more…. But…. They weren’t broken

Human Error Why is the Old View so popular? Cheap and easy Saving face Personal responsibility and the illusions of omnipotence
Cheap and easy: it is a deviously simple approach, and it is cheap to implement. The Old View believes failure is an aberration, a temporary hiccup in an otherwise smoothly performing, safe operation. Nothing more fundamental, or more expensive, needs to be changed.
Saving face: in the aftermath of failure, pressure can exist to save public image, to do something immediately to return the system to a safe state. Taking out defective practitioners is always a good start to saving face. It tells people that the mishap is not a systemic problem, but just a local glitch in an otherwise smooth operation. You are doing something; you are taking action. The basic attribution error and the blame cycle are alive and well.
Personal responsibility and the illusions of omnipotence: practitioners in safety-critical systems usually assume great personal responsibility for the outcomes of their actions. Practitioners get trained and paid to carry this responsibility. But the flip side of taking this responsibility is the assumption that they have the authority, the power, to match it: the assumption that people can simply choose between making errors and not making them, independent of the world around them. In reality, people are not immune to pressures, and organizations would not want them to be (a Coast Guard helicopter pilot who never flies in poor visibility would not be a Coast Guard pilot for long). To err or not to err is not a choice. People's work is subject to and constrained by multiple factors. How many times have you cursed your kids or your spouse or your sibling for doing something without considering the factors that were present at the time? The Old View is a very normal reaction to failure. The problem is that we cannot make progress on safety with this view.

Human Error Basic Attribution Error: the tendency to attribute behaviour to an enduring quality of the person AND to underestimate the influence of the situation. The basic attribution error is the psychological way of describing the Old View. All humans have a tendency, when examining the behaviour of other people, to over-estimate the degree to which their behaviour results from permanent characteristics, such as attitude or personality, and to under-estimate the influence of the situation.

Human Error Where the Old View falls short Local rationality If your explanation still relies on unmotivated people, you have more work to do You have to assume that nobody comes to work to do a bad job You have to understand why what people did made sense to them at the time. Local rationality: People are doing reasonable things given their point of view and focus of attention; their knowledge of the situation; their objectives and the objectives of the larger organization in which they work. People in safety-critical jobs are generally motivated to stay alive, to keep their passengers and customers alive. They do not go out of their way to fly into mountainsides or windshear; to damage equipment, to install components backwards… In the end, what they are doing makes sense to them at that time. It has to make sense, otherwise they would not be doing it. So if you want to understand human error, your job is to understand why it made sense to them. Because if it made sense to them, it may well make sense to others, which means that the problem may show up again and again. If you want to understand human error, you have to assume that people were doing reasonable things given the complexities, dilemmas, trade-offs and uncertainty that surrounded them. Just finding and highlighting people’s mistakes explains nothing. Saying what people did not do, or what they should have done, does not explain why they did what they did.

Local Rationality Humans are the most flexible, adaptable and valuable part of the system, while at the same time they are the most vulnerable to influences which can adversely affect performance. Most accidents have been attributed to “human error”… or so the statistics say. Do we fix the people, or the system in which the people work? To prevent accidents we address the causal, contributing and underlying factors of the system in which people work.

Human Error “Underneath every simple, obvious story about error, there is a deeper, more complex story…” “Take your pick: Blame human error or try to learn from failure…” (Dekker, 2006)

Human Error New View of Human Error on what goes wrong: Human Error is a symptom of trouble deeper inside a system To explain failure, do not try to find where people went wrong Instead, find out how people’s assessments and actions made sense at the time given the circumstances that surrounded them The New View was born out of recent insights in the field of Human Factors, specifically the study of human performance in complex systems and normal work. Sources of error are structural, not personal. If you want to understand human error, you have to dig into the system in which people work. You have to stop looking for people’s personal shortcomings. Errors and accidents are only remotely related. Accidents emerge from the system’s complexity, not from its apparent simplicity. That is, accidents do not just result from a human error, or a procedural “violation”. It takes many factors, all necessary and only jointly sufficient, to push a system over the edge of failure. Accidents are not the result of a breakdown of otherwise well-functioning processes. You think your system is basically safe and that accidents can only happen if somebody does something really stupid or dangerous. Instead, the research is showing us how accidents are actually structural by-products of a system’s normal functioning. What is striking about many mishaps is that people were doing exactly the sorts of things they would usually be doing, the things that usually lead to success and safety. People are doing what makes sense given the situational indications, operational pressures and organizational norms existing at the time. Accidents are seldom preceded by bizarre behaviour. To adopt the new view you must acknowledge that failures are baked into the very nature of your work and organization; that they are symptoms of deeper trouble or by-products of systemic brittleness in the way you do your business. 
It means having to acknowledge that mishaps are the result of everyday influences on everyday decision making, not isolated cases of erratic individuals behaving unrepresentatively. It means having to find out why what people did back there and then actually made sense given the organization and operation that surrounded them.

Human Error: culpable vs. blameless. Roughly 10% of errors are culpable (sabotage, substance abuse, reckless violations, etc.); roughly 90% are blameless (system-induced violations, system-induced errors, etc.). Very few people are found culpable and were actually malicious in their intent; most are blameless and were actually trying to do their very best in an imperfect system. With the increase in the tendency to charge pilots, mechanics, doctors and other practitioners with criminal negligence, it is important for us to really understand the influence of the system. To adopt the new approach to human error, we have to understand why it is so easy to believe the old approach, and take conscious steps towards the new view. There are many examples where the general population and the strength of the Old View seriously affected the lives of people who were just trying to do a good job in an imperfect system: the Singapore Six, the Denver nurses trial, Oscar November, etc.

Organizations and Socio-technical Systems Some system defences: Physical design aspects Job design elements Adequate resources Company safety management systems Effective regulatory system National legislation International agreements… In order to understand how decision makers’ actions or inactions influence safety, it is necessary to introduce a contemporary view of accident causation. As a complex socio-technical system, aviation requires the precise coordination of a large number of human and technical elements to function. The aviation system uses an elaborate array of systemic safety defences to protect against human errors. These defences include:
Physical design aspects: controls and displays, safety guards, special tools. Job design elements: sequencing of tasks, procedural compliance, readbacks, documentation of work done. Adequate resources: equipment, trained personnel. Company safety management systems: incident reporting, trend analysis, safety audits. An effective regulatory system: air regulations, safety oversight and enforcement. National legislation: establishment and organization of a civil aviation administration, aviation laws. International agreements: ICAO, SARPs, JARs.
Accidents in such a well-defended system are the product of the confluence of a number of enabling factors, each one essential but not sufficient alone to breach system defences. However, sometimes it is the complexity of these defences itself that can result in an accident. Another aspect of normal socio-technical system performance not explained by this organizational model is the drift, or migration, of work practices away from prescribed policies and procedures over time. Because these systems are so well-defended, with many defences located in different parts of the organization, small changes in practice to deal with local conditions lead to interactive complexity and decoupling of interdependent activities.
These changes, drifts from prescribed activities to practiced activities, result from two primary forces: the drive for productivity and the drive for efficiency. One model of socio-technical system performance that goes beyond James Reason’s model and addresses the dynamic nature of performance is Rasmussen’s risk management framework.
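The "confluence of enabling factors" idea can be made concrete with a toy calculation: in Reason's model, an error causes harm only when the holes in every defence layer line up. Assuming (purely for illustration) that layers fail independently, the combined penetration probability is the product of the per-layer failure probabilities. All numbers below are invented; this is a sketch of the model's logic, not a validated risk calculation.

```python
# Toy illustration of Reason's layered-defences ("Swiss cheese") model.
# Assumption (ours): defence layers fail independently, so the chance that
# a single error breaches ALL of them is the product of per-layer
# failure probabilities. Real defences are often coupled, which is exactly
# why drift and decoupling matter.

def penetration_probability(layer_failure_probs):
    """Probability that one error passes through every defence layer."""
    p = 1.0
    for prob in layer_failure_probs:
        p *= prob
    return p


# Illustrative layers: design guard, procedure, checklist, supervisory check.
layers = [0.1, 0.2, 0.1, 0.3]
print(penetration_probability(layers))  # product = 0.0006, up to float rounding
```

The same arithmetic also shows the danger of drift: if local workarounds quietly raise one layer's failure probability from 0.1 to 0.5, the system-level risk rises fivefold even though every other layer is unchanged.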

Wiener's "Iron Law" ...if equipment is designed correctly for human use in the first place, the cost is high, but it is paid only once. If poor design must be compensated for in training departments and operations, the price must be paid every day. And what is worse, with weak, potentially error-inducing designs, one cannot be sure that when the chips are down, the correct responses will be made. Wiener, Earl L. (1993). Intervention Strategies for the Management of Human Error. NASA Contractor Report 4547, p. 13.
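Wiener's point is essentially an arithmetic one: a one-time design cost is eventually overtaken by any recurring compensation cost. The figures below are invented purely to illustrate the shape of the comparison; they are not from Wiener's report.

```python
# Illustrative arithmetic (all numbers invented) for Wiener's "iron law":
# a correct design costs more up front but is paid once; a poor design's
# compensation (training, extra procedures, workarounds) is paid every
# operating day.

def cumulative_cost(upfront, daily, days):
    """Total cost of a design choice over a given number of operating days."""
    return upfront + daily * days


DAYS = 10 * 365  # a ten-year service life

good_design = cumulative_cost(upfront=500_000, daily=0, days=DAYS)
poor_design = cumulative_cost(upfront=100_000, daily=400, days=DAYS)

print(good_design, poor_design)  # 500000 1560000
```

And, as the quote notes, the arithmetic understates the problem: the recurring cost buys only partial compensation, with no guarantee of the correct response "when the chips are down."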

Error & Vulnerability Attention is a finite resource Overload Distraction Interruption Fatigue – 17 hours awake produces impairment roughly equivalent to a 0.05% blood alcohol concentration Equipment design Team coordination

Summary Meaning of Human Factors Human Error From Organizations to Socio-technical Systems Vulnerability & Countermeasures

Recommended Reading The Human Factor (Kim Vicente) The Field Guide to Understanding Human Error (Sidney Dekker) Managing the Risks of Organizational Accidents (James Reason) 10 Questions About Human Error (Sidney Dekker)