1 Science, Values and Risk, RD300, 15 October 2001

2 “It is by no means uncommon to find decision makers interpreting the same scientific information in different ways in different countries.” (Jasanoff, 1991, p.29)

3 Cultural Variation U.S. environmental regulators value formal analytical methods (testable validity) more highly than do their European counterparts, and US regulators tend to address scientific uncertainty through quantitative analysis. Result: evidence sufficient to trigger action in one country may not do so in another.

4 The Problem with Policy-Relevant Science When knowledge is uncertain or ambiguous, facts alone are inadequate to compel a choice. Policymakers inevitably look beyond the science and blend scientific and policy considerations together in their preferred reading of the evidence.

5 Risk Assessment Different risk assessment methodologies can produce widely varying risk estimates. Can animal data be extrapolated to humans? Do policymakers hide behind the numbers? Most lay persons don’t understand quantitative risk assessments.

6 Value judgments and uncertainties in risk assessments may not be stated by the experts. Risks of less than one in a million are often considered negligible from a regulatory standpoint.

7 Judgmental Probability Encoding Used in the field of US health risk assessment. Attempts to ascertain the range of scientific expert opinion on a particular risk, as well as the levels of confidence attached to each of those judgments (e.g. ambient air quality standards). Has proven problematic (e.g. biased selection of experts).

8 British Approach Multi-stakeholder commissions with noted academics and major interest groups confer collective credibility. Unlike the US approach, risk assessment and risk management (science and policy) are examined together.

9 With respect to lead and the risk to children’s health, the commission was equivocal in its findings, reporting no persuasive evidence of a risk and describing the risk in qualitative (“small”) rather than numerical terms. Yet it recommended that lead additives be phased out of gasoline, interpreting the Precautionary Principle (“dangerous until proven safe”) as a way of dealing with uncertainty.

10 USA vs Britain: Administrative and Political Cultures Regulatory processes:
–Britain – consensual, non-litigious, relatively closed.
–USA – adversarial, litigious, open.
The US regulatory process is more open to political pressures, so quantitative analysis becomes a “lifeline to legitimacy”.

11 Slovic Article “the goal of informing the public about risk issues – which in principle seems easy to attain – is surprisingly difficult to accomplish.” Why?

12 Three Categories of Reasons Limitations of risk assessment. Limitations of public understanding. The problems of communicating complex technical information.

13 Limitations of Public Understanding The public’s perceptions of risk are sometimes inaccurate:
–Memorable past events
–Imaginability of future events
–Media coverage can influence perceptions
–People overestimate dramatic causes of death.

14 How good is the public at estimating risks? Rare causes of death tend to be overestimated, while common causes are underestimated. Example: most people think their chances of dying of a heart attack are about 1 in 20; the truth is closer to 1 in 4. Judgmental bias: people’s predilection for exaggerating their personal immunity from many hazards (“optimistic bias”).

15 Risk information may frighten and frustrate the public.
–Simply mentioning a risk may enhance perceptions of danger.
–Even neutral information may elevate fears (e.g. transmission lines).
–People may try to reduce their anxiety about a hazard and its uncertainty by denying its existence or mentally minimizing the risk.

16 Strong beliefs are hard to modify. “Strong beliefs about risks, once formed, change very slowly and are extraordinarily persistent in the face of contrary evidence” (Vincent Covello). People gravitate toward evidence that supports their pre-existing beliefs on the subject.

17 When people lack strong opinions, they can be easily manipulated by presentation format.
–“Framing effects”
–Ethical issues

18 Expert versus Lay Conceptions of Risk Risk experts employ a technical evaluation of risk: Risk = Probability x Consequences. The public applies a broader conception of risk that also incorporates accountability, economics, values, and trust.
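The experts’ formula can be made concrete with a short sketch. The hazard figures below are illustrative assumptions, not data from the lecture; they show how the expected-value definition can rate two very different hazards as equal, which is one reason lay perceptions diverge from expert rankings.

```python
def expected_risk(probability: float, consequence: float) -> float:
    """Technical definition of risk: probability of the event
    times the magnitude of its consequence (an expected loss)."""
    return probability * consequence

# Hypothetical hazards: a rare-but-severe event and a common-but-mild one
# can receive the same technical risk score, even though the public
# typically dreads the rare, catastrophic one far more.
rare_severe = expected_risk(0.0001, 10_000)  # e.g. expected deaths per year
common_mild = expected_risk(0.1, 10)
print(rare_severe, common_mild)  # both come out to roughly 1.0
```

The point of the sketch is only that the formula collapses probability and consequence into one number; the qualitative attributes the public adds (dread, voluntariness, trust) are invisible to it.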

19 As our technical control has increased in the technological age, our social control has decreased. “Most citizens’ calls for ‘scientific’ decisions, in reality, are a request for something a bit broader – in most cases, a call for ways of assuring that ‘the human element’ of societal decision making will be not just technically competent, but equitable, fair, and responsive to deeply felt concerns” (Freudenburg).

20 Should Zero-risk be the goal? As Harvard professor John Graham has said, “We all want zero risk. The problem is if every citizen in this country demands zero risk, we’re going to bankrupt the country”.

21 Perceptual cues (e.g. odor) may signal more ominous events. Risk as a “collective construct”: the cultural theory of risk. Studies have found cross-national differences in risk judgments. Value orientation influences risk perceptions, as do worldviews.

22 The Mad Cow Crisis In March 1996, the British government announced that scientists had linked a new variant of Creutzfeldt-Jakob disease (vCJD) with the human consumption of cattle with bovine spongiform encephalopathy (BSE), or “mad cow disease”.

23 For almost a decade, British authorities had insisted there was no risk of BSE being transferred to humans. With the March 1996 announcement, the British beef market collapsed virtually overnight. The EU banned the export of British beef, and consumption of all beef in countries such as France, Germany and Japan dropped significantly.

24 The scientific question at the heart of the BSE crisis: Can humans develop CJD after eating beef from cattle infected with BSE? In other words, can the infectious agent jump the species barrier?

25 Public Perception That the British government was more interested in propping up the beef industry than in admitting that there might be a risk, however small. People stopped buying beef because they no longer trusted the government.

26 Risk Characteristics of the Mad Cow Disease Crisis High level of dread of the disease. Scientific uncertainty. Possible involvement of children. Catastrophic potential. Non-voluntary exposure. Lack of trust in decision-makers. A history of food safety controversies.

27 What mistakes did the British government make in handling the issue of mad cow disease?

28 What lessons can be learned from the mad cow crisis?
