
1 Dual-Use Research Codes of Conduct: Lessons from the Life Sciences
Michael J. Selgelid, PhD
Senior Research Fellow
Centre for Applied Philosophy and Public Ethics
The Australian National University

2 ‘Dual Use’ is Multiuse
‘Dual use’ has multiple meanings:
1. Civilian and military use (two birds, one stone)
2. Good and bad use
3. Good and bad use, involving weapons (of mass destruction) (the intersection of 1 and 2)
The dual-use dilemma:
1. For the responsible scientist: should I engage in R&D related to technology X?
2. For government: to what extent should R&D of technology X be permitted/controlled?

3 Dual Use Dilemma
An (inherently) ethical dilemma, i.e., one concerning values, benefits, harms, and duties.
Ethicists, however, have played only a minor role in debates about dual use.

4 Heightened Attention to Dual Use and the Need for Codes of Conduct
Earlier concerns:
- Experience with nuclear weapons
- Rotblat called for a Hippocratic Oath for scientists
Growing attention/concern:
- 11 September 2001 and the anthrax attacks
- Recent episodes involving biotechnology
- Reviews of the BTWC and CWC

5 Lessons from Biomedical Science
Long history of experience with codes of conduct:
- Hippocratic Oath
- Codes of medical associations (e.g., AMA and WMA)
- Nuremberg Code
- Declaration of Helsinki
These have been extremely influential and largely effective guides to action.

6 Lessons from Biomedical Science
- Mousepox (2001)
- Polio (2002)
- 1918 flu (2005)
Whether or not these studies should have been published, can we not imagine some that should not be? Censorship of nuclear science has been the norm for decades.
Who should make such decisions? Would reliance on scientists (guided by codes of conduct) suffice? Similar questions relate to the vetting of research.
Relevance to “converging technologies”:
1. The intersection of biology, chemistry, and information technology; synthetic biology likewise involves nanotechnology and engineering.
2. The majority of debate, and of policy making, to date has focused on the life sciences (and on these studies in particular).

7 Policy Aftermath
Joint Statement of Editors and Authors Group (2003)
NRC’s “Fink Report” (2004) called for:
- Voluntary self-governance regarding censorship
- Increased education about dual use
- Development of codes of conduct
- Establishment of a science advisory board
NSABB (established 2004):
- Identification of dual-use research “of concern” (experiments of concern)
- Tools for controlling dissemination of information
- Codes of conduct
- Means for international collaboration
- Synthetic biology

8 Codes of Conduct
Codes have numerous roles to play, but also limitations to keep in mind: they are by no means the whole solution.

9 Why Codes of Conduct Are Important
Roles:
- Raising awareness (and thus promoting good conduct and the avoidance of bad outcomes)
- Winning trust in the scientific enterprise
- Avoiding over-regulation
- Process benefits

10 Raising Awareness
- About the weapons conventions
- About the dual-use phenomenon, and the dual-use potential of one’s own work
- About the social responsibilities of scientists

11 Social Responsibility of Scientists
Common themes:
- Science is neutral/apolitical/value-free.
- Science is the impartial pursuit of knowledge, and knowledge is inherently good; or
- Knowledge is neither good nor bad; its applications (by others) are what can be good or bad.
“There are no bad molecules, only evil human beings” (Hoffmann)
On this view, those who employ the fruits of science in a malevolent manner are guilty; well-intentioned scientists are innocent.

12 Responsibility for Foreseeable Outcomes
Scientists are implicated in bad outcomes that result from their work. The degree of responsibility depends on the extent to which (bad) outcomes are:
- Foreseen
- Foreseeable
Scientists therefore have a responsibility to be aware of, and to reflect on, the ways in which their work may be used. The failure to reflect, or to foresee the foreseeable, may be considered negligence. In the context of weapons of mass destruction, such negligence could cause grave harm.

13 AMA’s 2005 “Guidelines to Prevent the Malevolent Use of Biomedical Research”
“Biomedical research may generate knowledge with potential for both beneficial and harmful application. Before participating in research, physician-researchers should assess foreseeable ramifications of their research in an effort to balance the promise of benefit from biomedical innovation against potential harms from corrupt application of the findings. In exceptional cases, assessment of the balance of future harms and benefits of research may preclude participation in the research; for instance, when the goals of research are antithetical to the foundations of the medical profession, as with the development of biological or chemical weapons.”

14 AMA’s 2005 “Guidelines to Prevent the Malevolent Use of Biomedical Research”
Why shouldn’t a similar statement be adopted by other relevant sciences?
Note the importance of going beyond the weapons conventions: neither the CWC nor the BTWC was designed to address the dual-use dilemma. As revealed by their general provisions clauses, the conventions’ prohibitions largely turn on the intentions of researchers and/or research programs.

15 Contra the Idea that Awareness of and Adherence to the BTWC and CWC Suffice
No one, so far as I am aware, has argued that the mousepox, polio, and flu studies flew in the face of the biological weapons conventions. The concern was that these were potentially dangerous experiments/publications, not that they were (already) prohibited ones.

16 Roles of Codes
- Winning public trust (social contract)
- Avoiding over-regulation
- Process benefits (Rappert)

17 Codes of Conduct: Limitations
- Universal codes “lack substance”: too general to be action-guiding
- Commonsense lists of things (conscientious) people would do anyway
- Conflicting principles
- If too specific/detailed, a code has less wide applicability and/or is more controversial and less widely accepted
- Dangers regarding the proliferation of codes (that say different things): lessons from Helsinki
- “Codes are not effective”: those who would do what codes prohibit are not the kind of people who listen to codes in the first place; hence the need for enforcement mechanisms

18 Enforcement
- Sanctions of professional societies regarding membership and/or licensing
- Enforcement via denial of grants or withdrawal of federal funding
- De facto legal status (“standards of practice”, negligence; recall Helsinki)
- Formal legislation (perhaps as part of a governmental oversight process)
- Some elements of a code will already have legal implications via the weapons conventions. (Some ask why we then need to embody them in codes, i.e., the charge of redundancy. In response: recall the value of awareness-raising.)

19 Regulation
Recall the censorship debate (and related issues regarding the vetting of research). Who should decide?
1. Government?
2. Individual scientists (and/or relevant scientific bodies) guided by codes of conduct, i.e., “voluntary self-governance”?
Perhaps neither is satisfactory. There are hybrid solutions, in which legally binding codes of conduct may play an important role.

20 Final Conclusions
- Much to learn from the life sciences
- Codes of conduct have numerous important roles to play
- Codes must go beyond the weapons conventions to cover dual-use research permitted by those conventions
- Codes of conduct have limits
- Codes of conduct should be part of a broader “web of prevention”
- Codes of conduct need not preclude regulation by government (but they may help avoid over-regulation)
- Codes of conduct may play a crucial role in regulatory oversight
- At least some elements of codes will/should have legal status
