Managing the Risk of Organisational Accidents
International Railway Safety Conference 2007, Goa, India
Peter Cuffe, Chief Safety & Security Officer, Irish Rail
(Diagram: Safety Professionals within the Railway Organisation within Society.)
Organisational Accidents
Individual accidents:
- A specific person is both agent and victim
- Limited scope
Organisational accidents:
- Multiple causes; many people involved
- Devastating consequences
- Often a product of technological innovation
Organisational accidents are the result of highly complex coincidences which are rarely foreseen by those involved. They are unpredictable because of the large number of causes and the spread of information across all the participants.
The cycle leading to an organisational accident:
- Cost of protection greatly exceeds the dangers
- Protection falls far short of the required level
- Better defences convert into improved production
- Post-accident response measures
- Relaxation, with further improved production
- Catastrophic disaster
Examples of Improved Production
- Invention of the Davy Lamp: coal could be extracted from more dangerous areas, and mine accidents increased.
- Introduction of marine radar: ships could travel faster in fog or busy waters, and marine history is littered with “radar-assisted” collisions.
Dangers of the Un-Rocked Boat
- A lengthy accident-free period
- Steady erosion of protection
- Easy to “forget to fear”
- Increasing production, without extended defences, will erode safety margins
Production v Protection
The partnership is rarely equal:
- Production creates the resources for protection
- Process managers have production skills
- Production information is direct, continuous, and easily understood
- Successful protection is shown only by the absence of negative outcomes
- Protection information is indirect or intermittent, hard to interpret, and often misleading
- Awareness is often driven by an accident or near-miss
Defences
- Create understanding & awareness
- Give clear guidance on safe operation
- Provide alarms & warnings of imminent danger
- Restore the system to a safe state
- Interpose barriers between hazards & losses
- Contain & eliminate hazards
- Provide means of escape & rescue
Defences operate in depth: successive layers, applied in a specific sequence.
Holes are Continuously Moving
The holes in the defences move continuously, and defences can be deliberately removed:
- during maintenance
- during testing
- during failures
Latent Conditions
- Poor design
- Gaps in supervision
- Undetected manufacturing defects
- Maintenance failures
- Unworkable procedures
- Clumsy automation
- Poor training
- Inadequate tools & equipment
Active Failures v Latent Failures
Active failures:
- Immediate, short-lived effect
- Committed at the “sharp” end, at the human-system interface
Latent failures:
- Lie dormant, with no impact until a local interaction
- Spawned in the organisation
- Pervasive
In aviation, there are foreseeable hazards: gravity, weather, mountains, and human fallibility.
Human Error v Non-Compliance
- Error is an intrinsic part of the human condition
- Error through distraction
- Error through loss of situational awareness
- Error that is deliberate
- We all learn by “trial and error”
- It is necessary to push limits to establish system characteristics
We cannot change the human condition, but we can change the conditions under which humans work.
Investigations: Who? What? Where? When? Why?
PRISMA
- Choose the “top event” (accident or near-miss)
- Determine the direct causes
- Determine the preceding causes
- Stop when the facts stop
- Stop at the limits of organisational control
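The tree-building procedure and its two stopping rules lend themselves to a short sketch in code. This is a minimal illustration only: the `Cause` node type, the `root_causes` helper, and the toy SPAD example are all hypothetical, not part of PRISMA itself.

```python
from dataclasses import dataclass, field

@dataclass
class Cause:
    """One node in a causal tree (hypothetical structure)."""
    description: str
    within_org_control: bool = True          # stop rule: limits of organisational control
    antecedents: list["Cause"] = field(default_factory=list)

def root_causes(top_event: Cause) -> list[Cause]:
    """Walk from the top event down to its terminal (root) causes.

    A branch terminates when the facts stop (no known antecedents)
    or when a cause lies outside organisational control.
    """
    if not top_event.antecedents or not top_event.within_org_control:
        return [top_event]
    found = []
    for cause in top_event.antecedents:
        found.extend(root_causes(cause))
    return found

# Usage: a toy SPAD (signal passed at danger) causal tree.
spad = Cause("SPAD at junction", antecedents=[
    Cause("Driver distracted", antecedents=[
        Cause("Unworkable roster procedure")]),       # facts stop here
    Cause("Signal sighting obscured by vegetation", antecedents=[
        Cause("Lineside maintenance backlog")]),
    Cause("Dense fog", within_org_control=False),     # outside organisational control
])
print([c.description for c in root_causes(spad)])
```

The terminal nodes of the walk are then what PRISMA goes on to classify into its failure categories.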
PRISMA – 23 Categories
- 4 x Technical
- 5 x Organisational
- 3 x Staff, knowledge-based
- 6 x Staff, rule-based
- 2 x Staff, skill-based
- 1 x Customers
- 1 x Public
- 1 x Unclassifiable
The staff categories follow Rasmussen’s SRK (skill-rule-knowledge) model.
Knowledge-based Behaviour
- Unexpected or new situations
- High attention level
- Problem identification and solving
- Situational awareness
- Understanding of the process
- Analytical ability

Rule-based Behaviour
- Recognition of the situation; pattern identification
- Medium attention demand
- Training to ensure correct rule application

Skill-based Behaviour
- Reflex/automatic reactions
- Long learning process; low attention demand
- Triggered by environmental signals
- Unlearning very difficult; stress-resistant
- Error prevention by environmental change, not by altered behaviour
PRISMA – 23 Categories: Technical & Organisational
Technical: External; Design (ergonomics); Construction/maintenance; Material (further research required)
Organisational: External; Supervision; Rules/procedures; Management priorities; Culture
PRISMA – 23 Categories: Knowledge-based & Rule-based
Knowledge-based: External; Process status/characteristics (e.g. current permits to work); Improper goals (e.g. making up for lost time by speeding)
Rule-based: Licence/certified competency; Incorrect permits or other safeguards; Pre-work status check not done; Work sequence incorrect or incomplete; Failure to monitor other system characteristics; Failure to use correct resources
PRISMA – 23 Categories: Skill-based & Other
Skill-based: Intentional (e.g. typing error); Unintentional (e.g. leaning against controls)
Customer (e.g. inebriated passenger); Public (e.g. suicide); Unclassifiable (e.g. act of God)
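Pulling the category slides together, the full 23-category taxonomy can be captured as a simple lookup table. This is a sketch under stated assumptions: the groups and counts follow the deck, but the dictionary structure and the exact placement of the “External” entries are illustrative, not official PRISMA labels.

```python
# The 23 PRISMA failure categories, grouped as in the slides.
# Placement of "External" in the staff groups is reconstructed
# to match the deck's 4/5/3/6/2/1/1/1 split.
PRISMA_CATEGORIES = {
    "Technical": [
        "External", "Design (ergonomics)",
        "Construction/maintenance", "Material",
    ],
    "Organisational": [
        "External", "Supervision", "Rules/procedures",
        "Management priorities", "Culture",
    ],
    "Staff, knowledge-based": [
        "External", "Process status/characteristics", "Improper goals",
    ],
    "Staff, rule-based": [
        "Licence/certified competency",
        "Incorrect permits or other safeguards",
        "Pre-work status check not done",
        "Work sequence incorrect or incomplete",
        "Failure to monitor other system characteristics",
        "Failure to use correct resources",
    ],
    "Staff, skill-based": ["Intentional slip", "Unintentional slip"],
    "Customers": ["Customer behaviour"],
    "Public": ["Public behaviour"],
    "Unclassifiable": ["Unclassifiable"],
}

# Sanity check: the groups sum to 23 categories in total.
assert sum(len(v) for v in PRISMA_CATEGORIES.values()) == 23
```

A table like this lets each terminal cause from a causal tree be tallied by group, which is how the cross-industry comparisons later in the deck are built up.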
Causal Tree
(A worked causal-tree example, built up diagram by diagram over successive slides; the diagrams are not reproduced here.)
PRISMA Applied to the Chemical Industry, Healthcare, and Railway (SPADs)
- Analysis 1: historic investigation findings were re-classified into PRISMA terms.
- Analysis 2: the same incidents were re-analysed with PRISMA, using existing Inspectorate files.
- Analysis 3: new incidents were analysed with PRISMA, using appropriate data gathering.
(Comparative results charts for the three analyses not reproduced.)
Reliability is Invisible
- Reliable outcomes are constant, so there is nothing to pay attention to.
- We see nothing, so “nothing” is happening, and nothing will continue to happen.
- This is a deceptive diagnosis: dynamic inputs create stable outcomes.
- Safety is a dynamic non-event.
Accidents do not occur because we gamble and lose, but because we do not believe that the accident about to occur is at all possible.