Presentation transcript:

Research Ethics in the 2.0 Era: Conceptual Gaps for Ethicists, Researchers, IRBs. Michael Zimmer, PhD, School of Information Studies, University of Wisconsin-Milwaukee. Secretary’s Advisory Committee on Human Research Protections, July 21, 2010.

My Perspective Approaching the problem of “The Internet in Human Subjects Research” from the field of information ethics Focus on how 2.0 tools, environments, and experiences are creating new conceptual gaps in our understanding of: –Privacy –Anonymity vs. Identifiability –Consent –Harm

Illuminating Cases 1. Tastes, Ties, and Time (T3) Facebook data release 2. Pete Warden’s harvesting (and proposed release) of public Facebook profiles 3. Question of consent for using “public” Twitter streams 4. Library of Congress archiving “public” Twitter streams

T3 Facebook Project Tastes, Ties, and Time research project sought to understand social network dynamics of large groups of students Solution: Work with Facebook & an “anonymous” University to harvest the Facebook profiles of an entire cohort of college freshmen –Repeat each year for their 4-year tenure –Co-mingle with other University data (housing, major, etc) –Coded for race, gender, political views, cultural tastes, etc

T3 Data Release As an NSF-funded project, the dataset was made publicly available –First phase released September 25, 2008 –One year of data (n=1,640) –Prospective users must submit application to gain access to dataset –Detailed codebook available for anyone to access

“Anonymity” of the T3 Dataset “All the data is cleaned so you can’t connect anyone to an identity” But the dataset had unique cases (based on the codebook) If the source university could be identified, individuals could potentially be identified It took minimal effort to discern that the source was Harvard The anonymity and privacy of subjects in the study become jeopardized

T3 Good-Faith Efforts to Protect Subject Privacy 1. Only those data that were accessible by default by each RA were collected 2. Removing/encoding of “identifying” information 3. Tastes & interests (“cultural footprints”) will only be released after “substantial delay” 4. To download, must agree to “Terms and Conditions of Use” statement 5. Reviewed & approved by Harvard’s IRB

1. Only those data that were accessible by default by each RA were collected “We have not accessed any information not otherwise available on Facebook” False assumption that because the RA could access the profile, it was “publicly available” RAs were Harvard graduate students, and thus part of the “Harvard network” on Facebook

2. Removing/encoding of “identifying” information “All identifying information was deleted or encoded immediately after the data were downloaded” While names, birthdates, and email addresses were removed… Various other potentially “identifying” information remained –Ethnicity, home country/state, major, etc The AOL/Netflix cases taught us how nearly any data could be potentially “identifying”
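The AOL and Netflix episodes suggest a simple check that any supposedly “anonymized” release can run: count how many records are unique on a handful of quasi-identifiers. The sketch below is a minimal illustration in Python; the field names and records are invented for illustration, not drawn from the T3 dataset.

```python
# Minimal sketch: counting how many "de-identified" records are still unique
# on a few quasi-identifiers. Field names and records are hypothetical.
from collections import Counter

records = [
    {"ethnicity": "X", "home_state": "MT", "major": "Linguistics"},
    {"ethnicity": "Y", "home_state": "CA", "major": "Economics"},
    {"ethnicity": "Y", "home_state": "CA", "major": "Economics"},
    {"ethnicity": "Z", "home_state": "AK", "major": "Astrophysics"},
]

quasi_identifiers = ("ethnicity", "home_state", "major")

# Count how often each combination of quasi-identifier values occurs.
combo_counts = Counter(
    tuple(record[field] for field in quasi_identifiers) for record in records
)

# Any record whose combination occurs exactly once is singled out: someone who
# knows those attributes can pick it out even though names and emails are gone.
unique_combos = [combo for combo, count in combo_counts.items() if count == 1]
print(f"{len(unique_combos)} of {len(records)} records are unique "
      f"on {quasi_identifiers}")
```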

3. Tastes & interests will only be released after “substantial delay” T3 researchers recognize the unique nature of the cultural taste labels: “cultural fingerprints” Individuals might be uniquely identified by what they list as a favorite book, movie, restaurant, etc. Steps taken to mitigate this privacy risk: –In initial release, cultural taste labels assigned random numbers –Actual labels to be released after a “substantial delay”, in 2011

3. Tastes & interests will only be released after “substantial delay” But, is 3 years really a “substantial delay”? –Subjects’ privacy expectations don’t expire after an artificially imposed timeframe –Datasets like these are often used years after their initial release, so the delay is largely irrelevant The T3 researchers will also provide immediate access on a “case-by-case” basis –No details were given, but this seemingly contradicts any stated concern over protecting subject privacy
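A minimal sketch of the kind of random-number encoding described above, using invented subjects and taste labels (not the T3 data or code). Even with the labels hidden, each subject’s combination of codes remains a distinctive “cultural fingerprint,” which is why a delayed release of the label-to-code mapping offers limited protection.

```python
# Sketch with hypothetical data: replace cultural taste labels with random
# codes, keeping the label->code mapping separate for later release.
import random

subjects_tastes = {
    "subject_001": ["favorite_book_A", "favorite_movie_B"],
    "subject_002": ["favorite_book_C"],
}

# Assign one random code per distinct label.
all_labels = sorted({label for tastes in subjects_tastes.values() for label in tastes})
codes = random.sample(range(100000, 1000000), len(all_labels))
label_to_code = dict(zip(all_labels, codes))

# The released dataset carries only the codes...
released = {
    subject: [label_to_code[label] for label in tastes]
    for subject, tastes in subjects_tastes.items()
}
print(released)

# ...but each subject's *combination* of codes is still a distinctive
# fingerprint, and releasing the mapping later (or matching coded patterns
# against public taste data) can undo the protection.
```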

4. “Terms and Conditions of Use” statement 3. I will use the dataset solely for statistical analysis and reporting of aggregated information, and not for investigation of specific individuals…. 4. I will produce no links…among the data and other datasets that could identify individuals… 5. I will not knowingly divulge any information that could be used to identify individual participants 6. I will make no use of the identity of any person or establishment discovered inadvertently.

4. “Terms and Conditions of Use” statement The language within the TOS clearly acknowledges the privacy implications of the T3 dataset –Might help raise awareness among potential researchers; appease the IRB But “click-wrap” agreements are notoriously ineffective at changing behavior It is unclear how the T3 researchers specifically intend to monitor or enforce compliance –There has already been one research paper that likely violates the TOS

5. Reviewed & Approved by IRB “Our IRB helped quite a bit as well. It is their job to insure that subjects’ rights are respected, and we think we have accomplished this” “The university in question allowed us to do this and Harvard was on board because we don’t actually talk to students, we just accessed their Facebook information”

5. Reviewed & Approved by IRB For the IRB, downloading Facebook profile information seemed less invasive than actually talking with subjects –Did the IRB know unique, personal, and potentially identifiable information was present in the dataset? Consent was not needed since the profiles were “freely available” –But RA access to restricted profiles complicates this; did the IRB contemplate this? –Is putting information on a social network “consenting” to its use by researchers?

T3 Good-Faith Efforts to Protect Subject Privacy 1. Only those data that were accessible by default by each RA were collected 2. Removing/encoding of “identifying” information 3. Tastes & interests (“cultural footprints”) will only be released after “substantial delay” 4. To download, must agree to “Terms and Conditions of Use” statement 5. Reviewed & approved by Harvard’s IRB

Illuminating Cases 1. Tastes, Ties, and Time (T3) Facebook data release 2. Pete Warden’s harvesting (and proposed release) of public Facebook profiles 3. Question of consent for using “public” Twitter streams 4. Library of Congress archiving “public” Twitter streams

Pete Warden Facebook Dataset Exploited a flaw in FB’s architecture to access and harvest the public profiles of 215 million users (without needing to log in) Impressive analyses at aggregate levels Planned to release the entire dataset – with names, locations, etc – to the academic community Later destroyed the data under threat of lawsuit from Facebook release-profile-data-on-215-million-facebook-users/

Harvesting Public Twitter Streams Is it ethical for researchers to follow and systematically capture public Twitter streams without first obtaining specific, informed consent from the subjects? –Are tweets publications, or utterances? –Are you reading a text, or recording a discussion? –What are users’ expectations about how their tweets are being found & used? twitter-accounts-without-consent/

LOC Archive of Public Tweets The Library of Congress will archive all public tweets –6-month delay, restricted access to researchers Open questions: –Can users opt out of being in the permanent archive? –Can users delete tweets from the archive? –Will geolocational and other profile data be included? –What about a public tweet that is re-tweeting a private one?

Illuminating Cases 1. Tastes, Ties, and Time (T3) Facebook data release 2. Pete Warden’s harvesting (and proposed release) of public Facebook profiles 3. Question of consent for using “public” Twitter streams 4. Library of Congress archiving “public” Twitter streams

My Perspective Approaching the problem of “The Internet in Human Subjects Research” from the field of information ethics Focus on how 2.0 tools, environments, and experiences are creating new conceptual gaps in our understanding of: –Privacy –Anonymity vs. Identifiability –Consent –Harm

Conceptual Gaps Privacy –Presumption that because subjects make information available on Facebook/Twitter, they don’t have an expectation of privacy –Ignores the contextual nature of sharing –Ignores whether users really understand their privacy settings Anonymity vs. Identifiability –Presumption that stripping names & other obvious identifiers provides anonymity –Ignores how anything can be identifying and become the “missing link” to re-identify an entire dataset

Conceptual Gaps Consent –Presumption that because something is made visible on Facebook/Twitter, the subject is consenting to it being harvested for research –Ignores how a research method might allow unanticipated access to data meant to be restricted Harm –Researchers imply “already public, what harm could happen?” –Ignores dignity & autonomy, let alone unanticipated consequences

Filling the Conceptual Gaps Privacy –Recognize the strict dichotomy of public/private doesn’t apply in the 2.0 world (if it does anywhere) –Consider Nissenbaum’s theory of “contextual integrity” Privacy in Context (2009, Stanford University Press) –Should strive to consult privacy scholars on projects & reviews

Filling the Conceptual Gaps Anonymity & Identifiability –Recognize “personally identifiable information” is an imperfect concept Consider the EU approach of “potentially linkable” to an identity –“Anonymous” datasets are not fully achievable and provide a false sense of protection Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization”
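A toy illustration of Ohm’s “potentially linkable” point: an “anonymous” release can be joined to a public auxiliary source on attributes both happen to share, re-attaching names to supposedly de-identified rows. The data and field names below are entirely invented.

```python
# Toy linkage-attack sketch with invented data: join an "anonymous" release
# to a public auxiliary dataset on shared attributes to re-identify rows.

anonymous_release = [
    {"id": "A17", "zip": "53211", "birth_year": 1989, "major": "Philosophy"},
    {"id": "B42", "zip": "53202", "birth_year": 1990, "major": "Biology"},
]

public_directory = [
    {"name": "Pat Example", "zip": "53211", "birth_year": 1989, "major": "Philosophy"},
]

link_keys = ("zip", "birth_year", "major")

def key(row):
    # Build the linkage key from the attributes the two sources share.
    return tuple(row[k] for k in link_keys)

directory_index = {key(row): row["name"] for row in public_directory}

# Any released row whose linkage key appears in the public source is re-identified.
for row in anonymous_release:
    name = directory_index.get(key(row))
    if name is not None:
        print(f"Record {row['id']} links to {name}")
```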

Filling the Conceptual Gaps Consent –What do we mean by “consent” when it comes to using “publicly” available content? –Must recognize that a user making something public online comes with a set of assumptions about who can access it and how – that’s what is being consented to (implicitly or explicitly) –…

Filling the Conceptual Gaps Harm –Must move beyond the traditional US focus of harm as requiring a tangible (financial?) consequence Protecting from harm is more than protecting from hackers, spammers, identity thieves, etc –Consider dignity/autonomy-based theories of harm Must a “wrong” occur for there to be damage to the subject? Do subjects deserve control over the use of their data streams?

Now What…. Researchers and IRBs believe they’re doing the right thing (and usually, they are) Bring together researchers, IRB members, ethicists & technologists to identify and resolve these conceptual gaps –InternetResearchEthics.org –Digital Media & Learning collaboration –Today’s panel…

Research Ethics in the 2.0 Era: Conceptual Gaps for Ethicists, Researchers, IRBs. Michael Zimmer, PhD, School of Information Studies, University of Wisconsin-Milwaukee. Secretary’s Advisory Committee on Human Research Protections, July 21, 2010.