Erik Hollnagel, 2000 Humans in safety critical systems.


1 Humans in safety critical systems

2 Estimated number of "human errors"
[Chart: percentage of human actions attributed as cause, plotted from 1960 to 1995 on a 0–100% scale.] The diagram shows the attribution of "human errors" as causes, which may be different from the actual contribution of "human errors" to incidents/accidents.

3 What is an "error"?
Actions are either correctly performed or incorrect. For correctly performed actions, actual outcomes = intended outcomes. Incorrect actions may be: undetected; detected but not recovered; detected but tolerated; or detected and recovered. Their effects may be overt or latent.

4 Humans and system safety
Technology-centred view vs. human-centred view:
- Technology-centred: Humans are a major source of failure; it is therefore desirable to design the human out of the system. Human-centred: Humans are the main resource during unexpected events; it is therefore necessary to keep them in the system.
- Technology-centred: Automation permits the system to function when the limits of human capability have been reached. Human-centred: The conditions for transition between automation and human control are often vague and context dependent.
- Technology-centred: Automation is cost-effective because it reduces the skill requirements of the operators. Human-centred: Automation does not use humans effectively, but leaves them with tasks that cannot be automated, because they are too complex or too trivial.
Conclusion: Humans are necessary to ensure safety.

5 Ironies of automation
The basic automation "philosophy" is that the human operator is unreliable and inefficient, and therefore should be eliminated from the system. Two ironies follow:
1. "Designer errors can be a major source of operating problems."
2. "The designer, who tries to eliminate the operator, still leaves the operator to do the tasks which the designer cannot think how to automate."
Lisanne Bainbridge (1987), "Ironies of automation"

6 Automation double-bind
Humans are fallible, and should therefore be designed "out" of the system. Yet when a safety critical event occurs, it shows that design teams are fallible too, and therefore humans are required in the system.

7 Maintaining control
Being in control of the situation means: knowing what has happened, knowing what will happen, and having the capacity to evaluate and plan.
What causes the loss of control? Unexpected events; acute time pressure; not knowing what happens; not knowing what to do; not having the necessary resources.
What can help maintain or regain control? Sufficient time; good predictions of future events; reduced task load; clear alternatives or procedures.

8 Cyclical HMI model
Goals for what to do when something unusual happens: identify, diagnose, evaluate, act. The team's current understanding directs/controls the next action; the next action provides/produces information/feedback; the feedback modifies the current understanding, closing the cycle.
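The cycle on this slide can be sketched as a simple control loop. This is a hedged illustration, not Hollnagel's code; the function names and the set-based "understanding" are my own toy assumptions.

```python
# Toy sketch of the cyclical HMI model: feedback modifies understanding,
# understanding directs the next action, the action produces new feedback.

def perceive(feedback, understanding):
    """Revise the current understanding with new feedback (toy update)."""
    return understanding | {feedback}

def select_action(understanding):
    """Pick the next action directed by the current understanding (toy policy)."""
    return f"action-for-{len(understanding)}-facts"

def act(action):
    """Carry out the action; the process returns information/feedback."""
    return f"feedback-from-{action}"

def control_cycle(steps):
    understanding = set()
    feedback = "initial-state"
    for _ in range(steps):
        understanding = perceive(feedback, understanding)  # feedback modifies understanding
        action = select_action(understanding)              # understanding directs/controls action
        feedback = act(action)                             # action provides/produces feedback
    return understanding

print(len(control_cycle(3)))  # → 3
```

The point of the sketch is only the shape of the loop: each pass through it corresponds to one turn of the perceive–direct–act cycle on the slide.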

9 Effects of misunderstanding
An incorrect or incomplete understanding leads to inadequate actions. Inadequate actions provide/produce unexpected information/feedback, which increases the demands on interpretation; the loss of accuracy further increases the amount of unexpected information. Because the dynamics of the process leave only limited time for interpretation, the operator may lose control of the situation.

10 Prevention and protection
An initiating event (an incorrect action) may lead to an accident. Barriers act at different points:
- Prevention (control barriers): active or passive barrier functions that prevent the initiating event from occurring.
- Protection (safety barriers): active barrier functions that deflect consequences.
- Protection (boundaries): passive barrier functions that minimise consequences.
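The ordering of these barriers can be shown as a short decision chain. This is a hypothetical sketch of the slide's flow, not an implementation from the source; the function name and outcome strings are my own.

```python
# Prevention acts before the initiating event; protection acts after it,
# first by deflecting consequences, then by minimising them.

def run_event(prevented, deflected, minimised):
    if prevented:      # control barrier stops the initiating event
        return "no event"
    if deflected:      # active safety barrier deflects the consequences
        return "consequences deflected"
    if minimised:      # passive boundary minimises the consequences
        return "consequences minimised"
    return "accident"

print(run_event(False, False, True))  # → consequences minimised
```

Only if every barrier fails does the initiating event develop into an accident, which is the slide's point.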

11 Types of barrier systems
- Material barriers: physically prevent an action from being carried out, or prevent the consequences from spreading.
- Functional (active or dynamic) barriers: hinder the action via preconditions (logical, physical, temporal) and interlocks (passwords, synchronisation, locks).
- Symbolic barriers (perceptual, conceptual barriers): require an act of interpretation to work, i.e. an intelligent and perceiving agent (signs, signals, alarms, warnings).
- Immaterial (non-material) barriers: not physically present in the situation; rely on internalised knowledge (rules, restrictions, laws).

12 Barrier system types
- Physical, material: obstructions, hindrances, ...
- Functional: mechanical (interlocks); logical, spatial, temporal
- Symbolic: signs & signals; procedures; interface design
- Immaterial: rules, laws

13 Barrier systems on the road
Physical barriers work even when not seen; symbolic barriers require interpretation.

14 Classification of barriers
Material, physical:
- Containing: walls, fences, tanks, valves
- Restraining: safety belts, cages
- Keeping together: safety glass
- Dissipating: air bags, sprinklers
Functional:
- Preventing (hard): locks, brakes, interlocks
- Preventing (soft): passwords, codes, logic
- Hindering: distance, delays, synchronisation
Symbolic:
- Countering: function coding, labels, warnings
- Regulating: instructions, procedures
- Indicating: signs, signals, alarms
- Permitting: work permits, passes
- Communicating: clearance, approval
Immaterial:
- Monitoring: monitoring
- Prescribing: rules, restrictions, laws
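The classification above is a mapping from barrier system types to barrier functions and examples, so it can be captured directly as a nested dictionary. A minimal sketch, with my own variable and function names (not from the source):

```python
# Encoding of the slide's classification: system type -> function -> examples.
BARRIER_CLASSIFICATION = {
    "material": {
        "containing": ["walls", "fences", "tanks", "valves"],
        "restraining": ["safety belts", "cages"],
        "keeping together": ["safety glass"],
        "dissipating": ["air bags", "sprinklers"],
    },
    "functional": {
        "preventing (hard)": ["locks", "brakes", "interlocks"],
        "preventing (soft)": ["passwords", "codes", "logic"],
        "hindering": ["distance", "delays", "synchronisation"],
    },
    "symbolic": {
        "countering": ["function coding", "labels", "warnings"],
        "regulating": ["instructions", "procedures"],
        "indicating": ["signs", "signals", "alarms"],
        "permitting": ["work permits", "passes"],
        "communicating": ["clearance", "approval"],
    },
    "immaterial": {
        "monitoring": ["monitoring"],
        "prescribing": ["rules", "restrictions", "laws"],
    },
}

def system_for(function):
    """Return the barrier system type that provides a given barrier function."""
    for system, functions in BARRIER_CLASSIFICATION.items():
        if function in functions:
            return system
    raise KeyError(function)

print(system_for("indicating"))  # → symbolic
```

Such a lookup makes the table queryable in both directions: by system type (which functions does it offer?) and by function (which system provides it?).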

15 Barrier evaluation criteria
- Efficiency: how efficient the barrier is expected to be in achieving its purpose.
- Robustness: how resistant the barrier is to variability in the environment (working practices, degraded information, unexpected events, etc.).
- Delay: the time from conception to implementation.
- Resources required: the costs of building and maintaining the barrier.
- Safety relevance: applicability to safety critical tasks.
- Evaluation: how easy it is to verify that the barrier works.
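One way to apply these criteria is to record a rating per criterion and compare candidate barriers. The sketch below is a hypothetical illustration: the 1–5 scale, the equal weighting, and all names are my assumptions, not part of the original method.

```python
from dataclasses import dataclass

@dataclass
class BarrierEvaluation:
    """Ratings on the slide's six criteria; 1 (poor) to 5 (excellent).

    Delay and resources are scored so that higher is better
    (shorter delay / lower cost -> higher rating).
    """
    name: str
    efficiency: int        # how well it achieves its purpose
    robustness: int        # resistance to environmental variability
    delay: int             # conception-to-implementation time
    resources: int         # cost of building and maintaining it
    safety_relevance: int  # applicability to safety critical tasks
    evaluation: int        # how easy it is to verify that it works

    def score(self) -> int:
        """Unweighted total; a real evaluation would weight criteria per context."""
        return (self.efficiency + self.robustness + self.delay
                + self.resources + self.safety_relevance + self.evaluation)

# Illustrative ratings (invented for the example):
interlock = BarrierEvaluation("interlock", 5, 4, 2, 3, 5, 4)
warning_sign = BarrierEvaluation("warning sign", 2, 2, 5, 5, 3, 2)
best = max([interlock, warning_sign], key=BarrierEvaluation.score)
print(best.name)  # → interlock
```

The comparison reflects a common pattern in this material: functional barriers such as interlocks tend to rate high on efficiency and robustness but low on delay and cost, while symbolic barriers show the opposite profile.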
