
Arete-Zoe: Lessons from TMI Lessons from big industrial accidents and their relevance for the Pharma industry.


1 Lessons from TMI Lessons from big industrial accidents and their relevance for the Pharma industry

2 ARETE-ZOE, LLC Registered address: 1334 E Chandler Blvd 5A-19, Phoenix, AZ 85048, USA. Solutions to complex problems in the high-stakes, high-consequence environment of global pharmaceuticals, including clinical research, healthcare informatics, and public health. We blend established pharma-sector methodologies, innovation, and adaptations or transfers from other sectors to identify and resolve consequential practices that pose risk and often result in avoidable patient casualties.

3 Three Mile Island, PA, USA (1979)

4 Reactor Cadmium rods Steam generator Cooling circuit

5 Turbines Steam pipes

6

7 03:58 AM, March 28, 1979 Routine repair of clogged filter Trace of water left inside air circuit

8 Alarms went off …causing confusion in the control room

9 Computers interpreted “water in air system” as “dangerous invader” and shut down the pumps in cooling circuit

10 Steam generator Nuclear reactor Pumps in cooling circuit shut down Why: Misinterpreted information from the air circuit Consequence: Cooling system disabled, reactor heating up

11 Operators faced a situation they were not trained for and that was not covered by their procedures

12 Pressure is building up in primary cooling system within the reactor

13 Computer ordered the cadmium rods to plunge into the reactor, stopping the chain reaction Pressure and heat within the reactor continued to build

14 Pilot-operated relief valve (PORV) opened to vent the pressure and failed to reclose

15 Indicator incorrectly shows that the valve has now reclosed Design flaw: the indicator shows that the control order was “sent”, not that it was “obeyed”

16 Surrogate endpoint In clinical trials, a surrogate endpoint (or marker) is a measure of effect of a specific treatment that may correlate with a real clinical endpoint but does not necessarily have a guaranteed relationship. Disease Surrogate endpoint True clinical endpoint Intervention

17 Valve open, reactor coolant leaking out without anyone’s knowledge

18 Stuck valve undetected Why: Misinterpreted information from indicators Consequence: cooling system disabled, reactor heating up

19 Communication failure Single phone line in the control room Key people unable to get through

20 Overheated reactor Incorrect readings from instruments Volume of coolant measured indirectly Decision: to turn off reactor pumps  Incorrect conclusions Loss of trust in the instruments System did not behave as expected OBSERVE ORIENT DECIDE

21 Training Decisions based on incorrect, misleading or no information

22 7:15 AM stuck valve finally discovered Pumps finally restarted  Reactor still overheating  Misleading temperature readings Why: Instruments not designed for temperatures this high Radiation in control room  Mounting pressure

23 8:33 AM – General emergency Misleading and deceptive information provided to the public by the company Minimal information provided to state administration and regulators Partial evacuation within a 5-mile radius

24 Freedom of speech, anyone? U.S. v. Caronia Amarin v. FDA Not an issue during TMI Real concern now in off-label promotion (Must be “truthful”)

25 Reactor cracked Sample of contaminated coolant

26 Basement full of contaminated water Radioactive gas was eventually released into the atmosphere

27 Oh, BTW, it can blow up because of accumulated hydrogen Fierce dispute within the NRC over whether this could happen This risk did not materialize. The partial reactor meltdown did not result in any additional release of radiation.

28 Complex combination of minor equipment failures and major inappropriate human actions

29 Risk = Probability x Consequence 1973 oil crisis Cheap power needed Political topic Gov’t subsidies Probability assessment flawed “That can’t happen here” mindset
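
The slide's formula can be made concrete with a minimal sketch (the function name and all numbers are hypothetical, chosen only to show why a misjudged probability wrecks the whole assessment):

```python
# Illustrative sketch of risk = probability x consequence.
# Values are invented for demonstration, not from any real assessment.

def risk_score(probability: float, consequence: float) -> float:
    """Expected loss: probability of the event times its consequence."""
    return probability * consequence

# A rare, severe event can outrank a frequent, minor one...
frequent_minor = risk_score(probability=0.5, consequence=2)    # 1.0
rare_severe = risk_score(probability=0.125, consequence=80)    # 10.0
# ...unless the probability is misjudged as zero ("that can't happen
# here"), which drives the computed risk to zero no matter how severe
# the consequence would be.
dismissed = risk_score(probability=0.0, consequence=80)        # 0.0
```

This is the arithmetic behind the "probability assessment flawed" point: the multiplication is trivial, so the quality of the result rests entirely on the honesty of the probability estimate.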

30 Root cause: Human factors Combination of lesser events Misjudged probability Misinformation Confusion Inadequate training Inappropriate human response Ordinary mistakes in a high-stakes environment

31 The need for change RECOMMENDATIONS Fundamental changes in Organizational procedures and practices The attitude of regulators Operator training, updates Emergency response and planning Organization failed to learn from previous failures

32 The need for change RECOMMENDATIONS More attention to human factors Combination of lesser events (slower to develop, more likely) Training, fitness for duty Organizational structure Communication Focus on equipment safety and large break accidents

33 Compliance v. Safety culture “It is the responsibility of the NRC to issue regulations to assure the safety of nuclear power plants. However, regulations alone cannot assure safety. Once regulations become as voluminous and complex as those now in place, they can serve as a negative factor in nuclear safety. The complexity requires immense efforts to assure compliance.” Requirement v. Consequence: the satisfaction of regulatory requirements is equated with safety.

34 Focus on compliance with regulations instead of intrinsic system safety Inspection manual voluminous and complex – unclear to many inspectors Enforcement actions limited/unused Reliance on industry own data No systematic evaluation of patterns Unclear Roles & Responsibilities The role of regulators

35 NRC has erred on the side of the industry's convenience rather than its primary mission of assuring safety The role of regulators HUMAN FACTORS Fiduciary responsibilities of public servants

36 Worst problem: Loss of public trust Misinformation Deception Misunderstanding Fear & Confusion

37 Outcome Transformation of the industry Major regulatory reform

38 Chernobyl, Ukraine, USSR (1986)

39 Orders received to carry out tests to find out how much energy could be saved during a routine maintenance shutdown Numerous safety mechanisms had to be turned off to make this test possible

40 Power levels lowered to perform tests Emergency core cooling system shut off Operator failed to program computer to prevent power from dropping below minimum safe level Automatic scram devices and alarms switched off Control rods withdrawn too far to reinsert quickly = Bad idea = Very bad idea

41 Chernobyl nuclear plant, unit 4 April 26, 1986 5 AM 1:23 AM

42 Study into systemic factors

43 Systemic factors Long record of sometimes fatal accidents  ACCIDENT WAITING TO HAPPEN National 5-year production goals oblivious to reality Training often suspect and shoddy Lax observance of rules and regulations Causes of disaster Irresponsibility Negligence Indiscipline Flawed performance metrics HUMAN FACTORS

44 Outcome Sweeping changes in Soviet society Disintegration of the Soviet empire due to loss of credibility

45 Volkswagen, Germany (2015)

46 Martin Winterkorn, CEO of Volkswagen AG, acknowledged that 11 million vehicles were equipped with diesel engines with defeat devices to cheat emissions tests

47

48 …And spreading

49 Criminal probe underway

50 Root cause Cause entirely internal Flawed performance metrics VW very sensitive to its own image Internal pressures to improve metrics caused someone to manipulate the system – in a manner that amounted to conspiracy

51 Lessons learned? Behavior of organizations follows the same principles regardless of industry

52 Common attributes (TMI / VW / Chernobyl) Formally regulated industries High-stakes, high-consequence environment Information flow within organization Communication with stakeholders Public trust essential An accident caused by systemic factors impacts the whole industry

53 Common root cause? (TMI / VW / Chernobyl) Requirement v. Consequence Individual and collective accountability Poor leadership Flawed performance metrics Failure to learn from previous errors Communication with stakeholders / public Regulatory response Delivery/enforcement of regulation HUMAN FACTORS

54 Regulators and elected officials (TMI / VW / Chernobyl) Subject to the same human frailties Oblivious to ambiguity Requirement v. Consequence Public trust essential HUMAN FACTORS

55 Experience, training, education Capabilities / Demographics / Frailties / Values Organizational culture Reporting structure Leadership Ethical

56 What is risk? Probability of detrimental consequence Risk Vulnerability in process Probability Threat Capability Intent / Ability Detrimental consequence Accidental Malicious

57 Qualifying consequence Safety signal: it takes a significant number of casualties with an attributed causal relationship to produce a signal Statistically significant cause attributed to a drug Patient injury
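
As a concrete illustration of how such a signal is screened for, one standard disproportionality statistic used on spontaneous-report databases like FAERS is the proportional reporting ratio (PRR). The counts below are invented for demonstration, not real FAERS data:

```python
# Hypothetical PRR calculation from a 2x2 table of spontaneous reports.
# All counts are made up for illustration.

def prr(a: int, b: int, c: int, d: int) -> float:
    """Proportional reporting ratio.
    a: reports of the event with the suspect drug
    b: reports of other events with the suspect drug
    c: reports of the event with all other drugs
    d: reports of other events with all other drugs
    """
    return (a / (a + b)) / (c / (c + d))

ratio = prr(a=20, b=980, c=100, d=98_900)
# 2% of the suspect drug's reports mention the event versus about 0.1%
# for all other drugs, giving a PRR of roughly 19.8. A common screening
# rule of thumb flags PRR >= 2 with at least 3 cases for further
# causality review -- the statistic screens, it does not attribute.
```

This makes the slide's point quantitative: a handful of injuries rarely moves the ratio; it takes an accumulation of attributed cases before the disproportion becomes statistically visible.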

58 Attribution Dispensing error / incorrect substitution Non-compliance with treatment Self-medication (OTC, Rx, illicit) Atypical manifestation of disease Misdiagnosis Prescribing error Wrong dose (predictable, unpredictable) Individual variability in response Misleading information on drug Drug interactions (known, unknown) Off-label use (appropriate, inappropriate) Counterfeit medications Limitation of science Honest mistake Omission Commission Deception False Claim PATIENT INJURY

59 Adverse outcome: Consequences Patient Clinician Pharmacist Regulator Drug manufacturer Healthcare facility Insurer Elected officials Individual Population COMMON CONSEQUENCE The only way to change the behavior of organizations is… …to create

60 Detection of vulnerabilities Probability of detrimental consequence Risk Vulnerability in process Probability Threat Capability Intent / Ability Detrimental consequence Accidental Malicious

61 Quality risk management Record of past events (EV, FAERS) FTA, FMEA, FMECA HAZOP, HACCP, PHA Systems modeling Identify Vulnerabilities Impose safety Constraints Enforce these constraints By Design By Operations Risk assessment ICH Q9 ICH E2E Define Accountability for control of vulnerabilities and acting upon them (R&R) Enable decision-makers
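
Of the tools the slide lists, FMEA is the most mechanical to sketch. The following is a hypothetical illustration (failure modes and ratings are invented, loosely echoing the TMI narrative) of ranking vulnerabilities by a risk priority number:

```python
# Sketch of an FMEA-style ranking. Scales and failure modes are
# hypothetical; real FMEAs use team-agreed rating tables.

from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (certain to be detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk priority number: higher means act first.
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("relief valve sticks open, indicator reads closed", 9, 2, 9),
    FailureMode("condensate filter clogs", 3, 6, 2),
]
ranked = sorted(modes, key=lambda m: m.rpn, reverse=True)
# The stuck-valve mode ranks first (RPN 162 vs 36) largely because it
# is nearly undetectable -- exactly the kind of vulnerability the
# slide says must be identified and assigned an accountable owner.
```

The design point is that the detection factor forces low-probability but invisible failures to the top of the worklist, which a bare probability-times-consequence score would miss.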

62 Systems-Theoretic Accident Model and Processes (STAMP) Imposing constraints on a system while ensuring enforceability of these constraints by design and operations Human supervisor (Controller) Model of process Model of automation Automated controller Model of process Model of interfaces Controlled process Sensors Displays Controls Process inputs Disturbances Process outputs Actuators Controlled variables Measured variables
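
STAMP's control-loop diagram can be reduced to a toy sketch: the controller acts on its *model* of the process, and accidents follow when that model diverges from the real process state. The classes below are invented for this illustration; the injected fault mirrors the stuck-open PORV from the TMI slides:

```python
# Toy STAMP-style illustration: process model vs actual process state.
# Class names are invented for this sketch, not STAMP tooling.

class ReliefValve:
    def __init__(self) -> None:
        self.is_open = True

    def command_close(self) -> None:
        # Fault injected for illustration: the valve sticks and silently
        # ignores the command, like TMI's stuck-open PORV.
        pass

class Controller:
    """Updates its process model from commands *sent*, not from
    independent feedback on the controlled variable itself."""
    def __init__(self, valve: ReliefValve) -> None:
        self.valve = valve
        self.model_says_open = True

    def close_valve(self) -> None:
        self.valve.command_close()
        self.model_says_open = False  # assumes the command was obeyed

valve = ReliefValve()
controller = Controller(valve)
controller.close_valve()
# Model and reality now disagree; the unsafe state stays invisible
# until a sensor reports valve position rather than command history.
assert controller.model_says_open is False and valve.is_open is True
```

This is the "sent, not obeyed" indicator flaw restated as a control-structure defect: the safety constraint is only enforceable if the feedback path measures the controlled variable, not the actuator command.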

63 System models Simplified models of complex environment Tools to enable decision-makers Reduce ambiguity and uncertainty Accountability for acting upon vulnerabilities Limit liability HUMAN FACTORS Tools do not substitute good leadership PUBLIC TRUST

64 Training Correct input – accurate and timely orientation

65 Thank you

