Living with High-Risk Systems
Michael S. Tashbook
Department of Computer Science, University of Virginia
September 23, 2002

Categories of Risk
- Not all high-risk systems are created equal
- We can partition the set of high-risk systems into three classes:
  - Hopeless Cases
  - Salvageable Systems
  - Self-Correcting Systems

Hopeless Cases
- This category is composed of systems where the (inevitable) risks far outweigh any reasonable benefits
- These systems should simply be abandoned, at least in Perrow's view
- Examples:
  - Nuclear weapons
  - Nuclear power

Salvageable Systems
- Salvageable systems are:
  - systems that we can't do without, but that can be made less risky with considerable effort, or
  - systems whose expected benefits are so great that some risks should be run
- Examples:
  - Some marine transport
  - DNA research

Self-Correcting Systems
- This category contains systems that are not completely self-correcting, but are self-correcting to some degree
- Only modest efforts are needed to improve these systems further
- Examples:
  - Chemical plants
  - Airplanes/Air Traffic Control

Is Abandonment the Answer?
- Should systems in the "Hopeless Cases" category be abandoned summarily?
- Should drastic modifications be made to the other high-risk systems (namely, those in the "Salvageable" category)?
- Not necessarily; Perrow's argument rests on several assumptions that may not be true

Perrow's Assumptions
- Current risk assessment theory is flawed
- The public is adequately equipped to make rational decisions, and its opinions should be respected by policy experts
- Organizational changes will have little effect in increasing system safety

1. Risk Assessment
- Analysis of the risks and benefits offered by new systems, and of the tradeoffs (if any) between them
- Modern risk assessors work to:
  - inform and advise on the risks and benefits of new systems
  - legitimize risks and reassure the public
  - second-guess regulatory agencies' actions

How Safe is Safe Enough?
- Put more precisely: how do we model risk?
- Risk is generally modeled mathematically
- The problem with this kind of analysis is that it measures only what can be quantified
  - How much is your life worth?
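To make the quantification problem concrete: a typical mathematical risk model reduces each hazard to an expected loss, probability times monetized consequence. The sketch below is illustrative only; the probabilities, death tolls, and the "value of a statistical life" constant are invented assumptions, not figures from the talk.

```python
# Illustrative expected-loss risk model (all numbers invented).
# Expected loss = probability of the event * monetized consequence.

VALUE_OF_STATISTICAL_LIFE = 10_000_000  # assumed $10M per life; the contested step

hazards = {
    # name: (annual probability, deaths if it occurs)
    "highway travel":   (1.0, 50_000),   # near-certain aggregate toll
    "reactor meltdown": (1e-4, 5_000),   # rare but catastrophic
}

for name, (p, deaths) in hazards.items():
    expected_loss = p * deaths * VALUE_OF_STATISTICAL_LIFE
    print(f"{name}: expected annual loss = ${expected_loss:,.0f}")
```

On this arithmetic the routine hazard dwarfs the rare catastrophe, which is exactly the ordering the public rejects in the slides that follow.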

Biased Interpretations
- The problem of systematic biases and public opinion:
  - Does every death have the same impact?
  - Is a death from diabetes or cancer as bad as a murder? The public doesn't seem to think so.
  - Are fifty thousand annual highway deaths really equivalent to a single nuclear catastrophe?

Systematic Biases
- Risk assessment differentiates between voluntary risks and involuntary risks
- However, it does not discriminate between the imposition of risks and the acceptance of risks
- This dispassionate cost-benefit approach often leads to "the tyranny of the bean-counters"

Cost-Benefit Analysis (CBA)
- CBA ignores the distribution of wealth in society
  - Risk assessments ignore the social class distribution of risks
- CBA relies heavily on current market prices
  - Thus, low-paid employees are worth less when risks are considered (see the sketch below)
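One way to see the market-price objection is the human-capital method, which values a life as the discounted stream of its future earnings. A minimal sketch under invented wages and an assumed 3% discount rate; nothing here comes from the talk itself.

```python
# Human-capital valuation: a life is "worth" the present value of future earnings.
# Wages, horizon, and discount rate are invented for illustration.

def discounted_earnings(annual_wage: float, years: int, rate: float = 0.03) -> float:
    """Present value of `years` of future earnings at a constant wage."""
    return sum(annual_wage / (1 + rate) ** t for t in range(1, years + 1))

executive = discounted_earnings(annual_wage=200_000, years=30)
laborer = discounted_earnings(annual_wage=25_000, years=30)

print(f"executive: ${executive:,.0f}")
print(f"laborer:   ${laborer:,.0f}")
# The laborer's life prices out at one-eighth of the executive's,
# purely because of the wage ratio.
```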

More CBA Assumptions
- New risks should not be higher than others we have already accepted
  - Corollary: if other systems become riskier, we can lower safety levels on new systems
- Competitive markets require risky endeavors

More RA/CBA Criticisms
- RA/CBA does not distinguish between:
  - addiction and free choice
  - active risks and passive risks
- This isn't just a matter of voluntary vs. involuntary risk; it's a question of control
- Risk assessors would prefer to exclude the public from decisions that affect its interests

2. Decision-Making
- Risk assessors assert that the public is ill-equipped to make decisions on its own behalf, and cognitive psychologists agree
- Humans don't reason well:
  - We exaggerate some dangers while downplaying others
  - We don't calculate odds "properly"

Three Types of Rationality
- Absolute rationality
  - Risks and benefits are calculated exactly, offering a clear view of what to do
- Bounded rationality
  - Employs heuristics to make decisions
- Social and cultural rationality
  - Limited rationality has social benefits

Bounded Rationality
- People don't make absolutely rational decisions, possibly due to:
  - neurological limitations
  - memory/attention limits
  - lack of education
  - lack of training in statistics and probability
- Instead, we tend to use hunches, rules of thumb, estimates, and guesses (see the sketch below)
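One concrete form of "rules of thumb" is Herbert Simon's satisficing: rather than scoring every option, as absolute rationality demands, stop at the first option that clears an aspiration threshold. A minimal sketch; the option scores and the `aspiration` parameter are invented for illustration.

```python
# Absolute rationality vs. a satisficing heuristic (bounded rationality).
# Utility scores are invented.

options = {"A": 0.62, "B": 0.71, "C": 0.95, "D": 0.80}

# Absolute rationality: evaluate every option, take the maximum.
best = max(options, key=options.get)

def satisfice(options: dict[str, float], aspiration: float = 0.7) -> str:
    """Return the first option that is 'good enough', searching in order."""
    for name, utility in options.items():
        if utility >= aspiration:
            return name  # stop searching immediately
    return max(options, key=options.get)  # nothing cleared the bar; fall back

print(best)                # "C", after examining all four options
print(satisfice(options))  # "B", after examining only two
```

The heuristic trades a little utility (0.71 vs. 0.95) for a large cut in search cost, which is exactly the efficiency claim on the next slides.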

More on Bounded Rationality
"There are two reasons for perfect or deductive rationality to break down under complication. The obvious one is that beyond a certain complicatedness, our logical apparatus ceases to cope—our rationality is bounded. The other is that in interactive situations of complication, agents can not rely upon the other agents they are dealing with to behave under perfect rationality, and so they are forced to guess their behavior. This lands them in a world of subjective beliefs, and subjective beliefs about subjective beliefs. Objective, well-defined, shared assumptions then cease to apply. In turn, rational, deductive reasoning—deriving a conclusion by perfect logical processes from well-defined premises—itself cannot apply. The problem becomes ill-defined."
— W. Brian Arthur, "Inductive Reasoning and Bounded Rationality" (1994)

The Efficiency of Heuristics
- Heuristics are useful; they save time, even if they are wrong on occasion
- Heuristics:
  - prevent decision-making "paralysis"
  - drastically reduce search costs
  - improve (are refined) over time
  - facilitate social life
  - work best in loosely coupled (slack, buffered) environments

Pitfalls of Heuristics
- Heuristics rely on the problem context; if the context is misjudged, the resulting action will be inappropriate
- Context definition is subtle and difficult
- Heuristics are related to intuitions:
  - Intuitions are a form of heuristic
  - Intuitions may be held even in the face of contrary evidence

Rationality and TMI
- The TMI accident occurred shortly after the plant was put into service
- Absolute rationality acknowledges that a problem was bound to happen eventually; it just happened sooner rather than later
- Is this comparable to the "1 × 10⁻⁹ standard"? (see the sketch below)
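The "bound to happen eventually" point is just compounding of probabilities: even under a strict per-hour failure budget, the chance of at least one failure grows with total exposure. A small sketch using the 1 × 10⁻⁹-per-hour figure from the slide; the exposure totals are invented for illustration.

```python
# Probability of at least one failure over N independent exposure hours,
# given a per-hour failure probability p:
#   P(at least one) = 1 - (1 - p)**N

p = 1e-9  # "one in a billion per hour"

for hours in (1e6, 1e8, 1e10):  # invented cumulative fleet exposures
    prob = 1 - (1 - p) ** hours
    print(f"{hours:.0e} hours -> P(at least one failure) = {prob:.4f}")
```

Rarity per unit time says nothing about when the failure arrives; over enough cumulative hours, the "impossible" event becomes nearly certain.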

Rationality and TMI (cont'd)
- This may be true, but is it the point?
- TMI was a new type of system, and no heuristics existed for it at the time
- Even though problems may be rare, they can be very serious
- Experts predicted that an accident like TMI was unlikely, yet it happened; could they have been wrong?

Bounded Rationality vs. TMI
- The logic of the public response to TMI was technically faulty; even so, it was efficient and understandable
- Experts have been wrong before; it's efficient to question them
- Bounded rationality is efficient because it avoids extensive effort
  - Can John Q. Public make a truly informed decision about nuclear power?

Social and Cultural Rationality
- Our cognitive limits are a blessing rather than a curse, for two reasons:
  - Individuals vary in their relative cognitive abilities (multiple intelligences theory), and these differences encourage social bonding
  - Individual limitations and abilities lead to different perspectives on (and solutions to) a given problem

Risk Assessment Studies
- A Clark University study compared experts and the lay public:
  - The two groups disagreed on how to judge the risk of some activities
  - Disaster potential seemed to explain the discrepancy between perceived and actual risk
  - For the public, dread/lethality ratings were accurate predictors of risk assessments
- A subsequent study identified three "factors" (clusters of interrelated judgments)

Dread Risk
- Associated with:
  - lack of control over the activity
  - fatal consequences
  - high catastrophic potential
  - reactions of dread
  - inequitable risk-benefit distribution
  - belief that the risks are not reducible
- Correlated with interactively complex, tightly coupled systems

Unknown Risk
- This factor includes risks that are:
  - unknown
  - unobservable
  - new
  - delayed in their manifestation
- This factor is not as conceptually related to interaction and coupling as dread risk is

Societal/Personal Exposure
- This factor measures risks based on:
  - the number of people exposed
  - the rater's personal exposure to the risk in question
- Of the three factors, dread risk was the best predictor of perceived risk
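"Best predictor" is a statistical claim: across activities, perceived-risk ratings track the dread-factor score more closely than the other two factor scores. A toy sketch of how such a comparison could be run; the ratings below are invented stand-ins, not the study's data.

```python
# Correlate each factor score with perceived risk (all numbers invented).
import statistics  # statistics.correlation requires Python 3.10+

perceived_risk = [9.0, 7.5, 3.0, 5.5, 8.0]  # lay rating per activity

factor_scores = {
    "dread":    [8.8, 7.0, 2.5, 5.0, 8.5],
    "unknown":  [6.0, 8.0, 4.0, 3.0, 5.0],
    "exposure": [5.0, 5.5, 6.0, 4.5, 5.0],
}

for name, scores in factor_scores.items():
    r = statistics.correlation(scores, perceived_risk)
    print(f"{name}: r = {r:+.2f}")  # dread comes out strongest on this toy data
```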

Thick vs. Thin Descriptions
- A "thin description" is quantitative, precise, logically consistent, economical, and value-free
- A "thick description" recognizes subjective dimensions and cultural values, and expresses skepticism about human-made systems

3. Organizational Solutions
- In general, risky enterprises are organizational enterprises
- The suggested remedy: put tightly controlled, highly centralized, authoritarian organizations in place to run risky systems and eliminate "operator error"
- But does this really help?

Suggested Organization Types
The 2×2 crosses interaction type with coupling (see the sketch after this list):
- Tight coupling + linear interactions: centralization (tight coupling and linear interactions both favor it)
- Tight coupling + complex interactions: centralization for the tight coupling, but decentralization for the complex interactions. These demands are incompatible!
- Loose coupling + linear interactions: centralization and decentralization are both feasible
- Loose coupling + complex interactions: decentralization (loose coupling and complex interactions both favor it)
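Read as a decision rule, coupling drives the demand for centralization and interaction complexity drives the demand for decentralization, with one contradictory cell. A minimal encoding of the table as a lookup; the key and value strings are mine, not Perrow's.

```python
# Perrow's suggested organization types as a (coupling, interaction) lookup.
# Naming is illustrative, not Perrow's own.

RECOMMENDATION = {
    ("tight", "linear"):  "centralize",
    ("tight", "complex"): "incompatible: tight coupling demands centralization, "
                          "but complex interactions demand decentralization",
    ("loose", "linear"):  "either centralization or decentralization works",
    ("loose", "complex"): "decentralize",
}

print(RECOMMENDATION[("tight", "complex")])  # the cell flagged as the problem
```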

Where Does the Problem Lie?
- Technology?
  - "[W]e are in the grip of a technological imperative that threatens to wipe out cultural values…"
- Capitalism?
  - Private profits lead to short-run concerns
  - Social costs are borne by everyone
- Greed?
  - Private gain versus the public good

The Problem of Externalities
- Externalities are the social costs of an activity (pollution, injuries, anxieties) that are not reflected in its price (see the sketch after this list)
- Social costs are often borne by those who receive no benefit from the activity, or who are even unaware of it
- Systems with identifiable/predictable victims are more likely to consider externalities
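In cost-benefit terms: social cost = private cost + external cost, so a price that tracks only private cost understates what the activity really costs. A toy illustration with invented figures.

```python
# Social cost = private cost + external cost (all figures invented).
# A market price based only on private cost hides the externality.

private_cost = 100.0   # what the producer pays; roughly what the price reflects
external_cost = 40.0   # pollution, injuries, anxieties borne by third parties

social_cost = private_cost + external_cost
hidden_share = external_cost / social_cost

print(f"price reflects ${private_cost:.0f} of a ${social_cost:.0f} social cost")
print(f"{hidden_share:.0%} of the true cost falls on bystanders")
```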

A New Cost-Benefit Analysis
- How risky are the systems we have been considering, judged solely by their catastrophic potential?
- How costly are the alternative ways (if any) of producing the same outputs?

The Final Analysis
- Systems are human constructs, whether carefully designed or unplanned emergences
- These systems are resistant to change
- System catastrophes are warning signals, but not the ones we think
- The signals come not from individual errors, but from the systems themselves
