Human Error and Error Wisdom

1 Human Error and Error Wisdom

2 Human error
“We all make errors irrespective of how much training and experience we possess or how motivated we are to do it right.” Reducing error and influencing behaviour - HSG48

3 The rset can be a total mses and you can sitll raed it fialry eailsy.
Aoccdrnig to rscheearch at Cmabrigde Uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer are in the rghit pclae. The rset can be a total mses and you can sitll raed it fialry eailsy. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.

4 Examples of Areas Requiring Design Solutions
This demonstrates one of many types of drug packaging which are similar, so that one drug could easily be mistaken for another. There used to be a 4th vial in common use in all wards and departments (now only in Pharmacy, ITU, Theatres): Potassium Chloride (Strong Potassium/KCl). Used diluted in healthcare; used undiluted / in larger doses to administer the death penalty in other countries.

5 The Perfection Myth and the Punishment Myth
The Perfection Myth - if we try hard enough we will not make any errors
The Punishment Myth - if we punish people when they make errors they will make fewer of them

6 Getting the balance right
PERSON MODEL ↔ SYSTEM MODEL
Pitfalls along the spectrum: pursuit of the (wrong kind of) excellence, blame & denial, isolation, learned helplessness
Both extremes have their pitfalls

7 How do accidents happen?
Organisation + process deficiencies (SDPs)
Prior/unsafe conditions - contributory factors
Unsafe acts (CDPs) / (SRK errors)
Failed defences
Patient Safety Incident

8 Service Delivery Problem (SDP)
Latent failure, distant from direct patient care
Arises from weaknesses in the organisation or environment, e.g. failure to undertake an environmental risk assessment in a ward

9 Contributory factors
Patient factors
Individual / staff factors
Task factors
Communication factors
Team & social factors
Education & training factors
Equipment and resource factors
Working condition factors
Organisational & management factors

10 Care Delivery Problem (CDP)
Active failure, arising in the process of direct patient care
An act or omission by a member of staff, e.g. failure to undertake planned 15-min obs. of a patient

11 Rasmussen’s Skill, Rule and Knowledge (SRK) model
Skill - automatic, familiar & well-practised routines
Rule - learning rules and rehearsing routines
Knowledge - novel tasks requiring conscious thought

12 Error Wisdom - Predict ‘what can go wrong today’
Three bucket model of error likelihood - James Reason, 2004. The three buckets are SELF, CONTEXT and TASK, each rated on a three-point scale.
James Reason talks of how we often ignore various unresolved minor problems that arise while we are working in skill-based mode. He likens this to having three buckets, each with varying amounts of ‘brown stuff’ in them. What would it take to make alarm bells ring in the heads of those confronted with a high-risk situation? Nurses and junior doctors have little opportunity to make radical changes to the system, but could we not provide them with some basic mental skills that would help them to recognise and, if possible, avoid situations with a high error potential? The three bucket model leads to a possible strategy. In any given situation, the probability of unsafe acts being committed is a function of the amount of bad stuff in all three buckets:
The first relates to the current state of the individual(s) involved (SELF)
The second reflects the nature of the context (CONTEXT)
The third depends upon the error potential of the task (TASK)
While most professionals will have an understanding of what comprises bad stuff in regard to the self (lack of knowledge, fatigue, negative life events, inexperience, feeling under the weather) and the context (distractions, interruptions, shift handovers, harassment, lack of time, unavailability of necessary materials, unserviceable equipment), they are less likely to know that individual task steps vary widely in their potential to elicit error. For example, omission errors are more likely in steps close to the end of a task, where there is a lack of cueing from the preceding step (out of sight, out of mind), or when the primary goal of the task is achieved before all necessary steps have been completed (premature exit). Full buckets (with respect to bad stuff) do not guarantee the occurrence of an unsafe act, nor do nearly empty ones ensure safety (they are never wholly empty). NB: we are dealing with probabilities rather than certainties.
People are very good at making rapid intuitive ordinal ratings of situational aspects. Together with some relatively inexpensive instruction on error-provoking conditions, frontline professionals could acquire the mental skills necessary for making a rough-and-ready assessment of the error risk in any given situation. Each bucket has a three-point scale, rising to a total of nine for the situation as a whole; subjective ratings totalling between six and nine should set the alarm bells ringing. However, as stated earlier, these skills need to be exercised regularly. There is considerable evidence to show that mental preparedness, over and above the necessary technical skills, plays a major part in the achievement of excellence in both athletics and surgery. The three bucket model and its associated toolkit emphasise the following aspects of preparedness:
Accept that errors can and will occur
Assess the local bad stuff before embarking upon a task
Have contingencies ready to deal with anticipated problems
Be prepared to seek more qualified assistance
Do not let professional courtesy get in the way of checking your colleagues’ knowledge and experience, particularly when they are strangers
Appreciate that the path to adverse incidents is paved with false assumptions
CONCLUSIONS: It is evident from the case study discussed above that organisational accidents do occur in healthcare institutions. The identification of organisational accidents enjoins us to ask how and why the safeguards failed. It also requires not only the remediation of the defective barriers, but also regular audits of all the system’s defences. The same event never happens twice in exactly the same way, so it is necessary to consider many possible scenarios leading to patient harm. This would truly be proactive safety management, because the latent ingredients of future adverse events are already present within the organisation.
Instilling informed vigilance and intelligent wariness in those at the sharp end need not consume much time. We should raise our ‘feral vigilance’ (wild, squirrel-like awareness) when our buckets of brown stuff start to fill, i.e. acknowledge tiredness, pressure, etc., then stop and think, or alert colleagues to your situation. Note the similarities with the error chain model used in aviation.
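The scoring rule above (three buckets rated 1-3 each, with a combined score of six to nine setting the alarm bells ringing) can be sketched in a few lines of code. This is a minimal illustration only; the function names and the input-validation choices are mine, not part of Reason's model or any published toolkit.

```python
# Sketch of the three bucket self-assessment described above.
# Assumption: each bucket (self, context, task) is rated on a 1-3
# ordinal scale, and a combined score of 6 or more should trigger
# heightened caution, per the 6-9 "alarm" band in the text.

def bucket_score(self_rating: int, context_rating: int, task_rating: int) -> int:
    """Sum the three ordinal ratings (each 1 = low risk, 3 = high risk)."""
    for r in (self_rating, context_rating, task_rating):
        if not 1 <= r <= 3:
            raise ValueError("each bucket rating must be between 1 and 3")
    return self_rating + context_rating + task_rating

def alarm_bells(self_rating: int, context_rating: int, task_rating: int) -> bool:
    """True when the combined score falls in the 6-9 alarm band."""
    return bucket_score(self_rating, context_rating, task_rating) >= 6

# Example: a fatigued clinician (self=3) on a busy ward (context=2)
# doing a familiar task (task=1) scores 6, so caution is warranted.
```

As the text stresses, the score is a prompt for probabilistic wariness, not a guarantee either way: a low total does not ensure safety, and a high one does not guarantee an unsafe act.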

13 ERROR TYPES - based on the work of Reason, adapted by NPSA
Basic error types:
Unintended actions - skill-based errors: slips (attentional failures) and lapses (memory failures)
Unintended actions - mistakes: rule-based & knowledge-based errors
Intended unsafe acts - violations: routine, reasoned, reckless & malicious
Routine violations = we always do it a different way here, OR everyone does it like this
Reasoned violations:
Situational (for this patient we need to do it a different way)
Exceptional (if we don’t do it a different way the patient will suffer)
Optimising (we can do it better another way)
Reckless / Sabotage = deliberately doing it differently, knowing it is risky
All staff should be free from discipline unless:
Premeditated or intentional acts of violence against people, or damage to equipment/property
Actions or decisions involving a reckless disregard towards the safety of patients
Failure to report safety incidents or risks

14 Def: Human Factors
The study of how humans behave physically and psychologically in relation to particular environments, people, or procedures.

15 Lessons from Human Factors Research
Errors are common and predictable The causes of errors are known Errors are by-products of useful cognitive functions Errors can be prevented by designing tasks and processes to minimise dependency on weak cognitive functions

16 Examples of Other Human Factors
Fatigue; sleep deprivation
Inadequate nutrition, hydration
Overload
Training and experience
Professional courtesy
Team dynamics (isolated, divided, elite)
Leadership (weak, charismatic)
Example outcomes: perceptual and contextual problems …

17 Perception

18 Contextual clues leading to error (Bum steers)
What tree grows from an acorn? Oak
What do you call a funny story? Joke
What sound does a frog make? Croak
What’s another word for a cape? Cloak
What do you call the white of an egg?

19 Human performance: Two aspects
Human as hazard: slips, lapses, mistakes, violations
Human as hero: adjustments, compensations, recoveries, improvisations
“Standardisation and improvisation go hand in hand ... there is no tension” - Dr Atul Gawande

20 Humans as Heroes
Error is normal
Humans are bad at routine but good at compensation/recovery
Human coping resources are good
Humans have capacity for realistic optimism
Good compensators have good outcomes

21 Reason, PS Congress 08

22 Reason, PS Congress 08

23 Error Types
Intended actions
Routine violations - regular short-cuts in tasks made for convenience. They are accepted by the clinical team, and sometimes by management, normally because the procedure is badly designed.
Reasoned violations - occasional changes in procedure for good reason and with good intent. It may be an emergency or unusual situation. The change should be discussed beforehand wherever possible and always documented afterwards.
Reckless violations - unacceptable changes in procedure. Harm is likely but not intended. There is an active lack of care.
Malicious violations - deliberate acts that are intended to cause harm or damage. They are unusual but the outcome is likely to be very serious.
Rule based mistakes - made by people undertaking tasks with some knowledge of the rules and with good intent, but they choose the wrong solution for the problem.
Knowledge based mistakes - made by people undertaking new tasks with good intent, but their limited knowledge results in a mistake. They don’t know that they don’t know.
Unintended actions
Lapses - errors made by experienced people undertaking familiar tasks with very little conscious thought. They forget something routine when they are not concentrating on the task or when they are interrupted.
Slips - errors made by experienced people undertaking any task. There is a slip in the action (such as dropping an instrument) which could happen to anyone, however experienced.

24 Either we manage human error... ... or human error will manage us
Professor James Reason

25 Key Points - Human Error
The reasons things go wrong are fairly predictable Humans are generally bad at routine and good at compensation / recovery We need to use this wisdom to identify the true causes of incidents ... and the most effective solutions
