CMU Usable Privacy and Security Laboratory Hey, That’s Personal! Lorrie Faith Cranor 28 July 2005



Outline
- Privacy risks from personalization
- Reducing privacy risks
- Personalizing privacy

Privacy risks from personalization

Unsolicited marketing
- Desire to avoid unwanted marketing causes some people to avoid giving out personal information

My computer can “figure things out about me”
- The little people inside my computer might know it’s me…
- …and they might tell their friends

Inaccurate inferences
- “My TiVo thinks I’m gay!”

Surprisingly accurate inferences
- Everyone wants to be understood. No one wants to be known.

You thought that on the Internet nobody knew you were a dog…
- …but then you started getting personalized ads for your favorite brand of dog food

Price discrimination
- Concerns about being charged higher prices
- Concerns about being treated differently

Revealing private information to other users of a computer
- Revealing info to family members or co-workers
  - Gift recipient learns about gifts in advance
  - Co-workers learn about a medical condition
- Revealing secrets that can unlock many accounts
  - Passwords, answers to secret questions, etc.

The Cranor family’s 25 most frequent grocery purchases (sorted by nutritional value)!

Exposing secrets to criminals
- Stalkers, identity thieves, etc.
- People who break into an account may be able to access profile info
- People may be able to probe recommender systems to learn profile information associated with other users

Subpoenas
- Records are often subpoenaed in patent disputes, child custody cases, civil litigation, and criminal cases

Government surveillance
- Governments increasingly looking for personal records to mine in the name of fighting terrorism
- People may be subject to investigation even if they have done nothing wrong

Risks may be magnified in future
- Wireless location tracking
- Semantic web applications
- Ubiquitous computing

If you’re not careful, you may violate data protection laws
- Some jurisdictions have privacy laws that
  - Restrict how data is collected and used
  - Require that you give notice, get consent, or offer privacy-protective options
  - Impose penalties if personal information is accidentally exposed

Reducing privacy risks

Axes of personalization

Axis                      | Tends to be MORE privacy invasive | Tends to be LESS privacy invasive
Data collection method    | Implicit                          | Explicit
Duration                  | Persistent (profile)              | Transient (task or session)
User involvement          | System initiated                  | User initiated
Reliance on predictions   | Prediction based                  | Content based

A variety of approaches to reducing privacy risks
- No single approach will always work
- Two types of approaches:
  - Reduce data collection and storage
  - Put users in control

Collection limitation: Pseudonymous profiles
- Useful for reducing risk and complying with privacy laws when ID is not needed for personalization
- But, profile may become identifiable because of unique combinations of info, links with log data, unauthorized access to user’s computer, etc.
- Profile info should always be stored separately from web usage logs and transaction records that might contain IP addresses or PII
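The separation this slide recommends can be sketched in a few lines. This is a minimal illustration, assuming a randomly generated pseudonym and simple in-memory stores; the function and field names are hypothetical, not from any real system.

```python
# Sketch: personalization profile kept under a random pseudonym,
# in a store separate from identified transaction records.
import secrets

profiles = {}      # pseudonym -> profile data (no PII)
transactions = {}  # order id -> identified record, kept in a separate store

def new_pseudonym() -> str:
    """A random identifier with no link to the user's real identity."""
    return secrets.token_hex(16)

def record_interest(pseudonym: str, topic: str):
    """Accumulate interests under the pseudonym only."""
    profiles.setdefault(pseudonym, {"topics": set()})["topics"].add(topic)
```

Even with this split, the slide's caveat stands: a distinctive enough set of topics can re-identify a profile, so the pseudonym alone is not a guarantee.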

Collection limitation: Client-side profiles
- Useful for reducing risk and complying with laws
- Risk of exposure to other users of computer remains; storing encrypted profiles can help
- Client-side profiles may be stored in cookies replayed to server that discards them after use
- Client-side scripting may allow personalization without ever sending personal info to the server
- For some applications, no reason to send data to server
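The cookie-replay pattern above can be sketched as follows. This is an illustrative sketch only: the profile is serialized into a cookie value on the client side, and the server uses it to personalize the current response and then lets it go out of scope rather than storing it. Encoding choices (JSON in base64) are assumptions; a real deployment might also encrypt the value, as the slide notes.

```python
# Sketch: client-side profile carried in a cookie and discarded
# by the server after use (nothing retained server-side).
import base64
import json

def profile_to_cookie(profile: dict) -> str:
    """Serialize the profile for a cookie value (could also be encrypted)."""
    return base64.urlsafe_b64encode(json.dumps(profile).encode()).decode()

def personalize(cookie_value: str) -> str:
    """Use the replayed profile for this response only, then drop it."""
    profile = json.loads(base64.urlsafe_b64decode(cookie_value))
    topics = profile.get("topics", [])
    if topics:
        return "Showing items for: " + ", ".join(topics)
    return "Showing default items"
```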

Collection limitation: Task-based personalization
- Focus on data associated with current session or task - no user profile need be stored anywhere
- May allow for simpler (and less expensive) system architecture too!
- May eliminate problem of system making recommendations that are not relevant to current task
- Less “spooky” to users - relationship between current task and resultant personalization usually obvious
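A minimal sketch of the task-based idea, with hypothetical names: only the current session's actions drive recommendations, and the session data is cleared when the task ends, so no profile persists anywhere.

```python
# Sketch: session-scoped personalization with no persistent profile.
class Session:
    def __init__(self):
        self.viewed = []

    def view(self, category: str):
        self.viewed.append(category)

    def recommend(self) -> str:
        # Recommend within the most recently viewed category only,
        # so results stay obviously related to the current task.
        if self.viewed:
            return "More items in " + self.viewed[-1]
        return "Popular items"

    def end_task(self):
        self.viewed.clear()  # nothing is retained across tasks
```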

Putting users in control
- Users should be able to control
  - what information is stored in their profile
  - how it may be used and disclosed

Developing a good user interface to do this is complicated
- Setting preferences can be tedious
- Creating overall rules that can be applied on the fly as new profile data is collected requires deep understanding and the ability to anticipate privacy concerns

Possible approaches
- Provide reasonable default rules with the ability to add/change rules or specify preferences for handling of specific data
  - Up front
  - With each action
  - After-the-fact
- Explicit privacy preference prompts during transaction process
- Allow multiple personae

Example: Google Search History

Amazon.com privacy makeover

Streamline menu navigation for customization

Provide way to set up default rules
- Every time a user makes a new purchase that they want to rate or exclude, they have to edit profile info
- There should be a way to set up default rules:
  - Exclude all purchases
  - Exclude all purchases shipped to my work address
  - Exclude all movie purchases
  - Exclude all purchases I had gift wrapped
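The default rules listed above can be modeled as simple predicates over purchase records, where any matching rule excludes the purchase from the recommendation profile. This is a sketch; the field names (`ship_to`, `category`, `gift_wrapped`) are assumptions, not Amazon's actual data model.

```python
# Sketch: default exclusion rules as predicates over purchase records.
def exclude_shipped_to_work(p):
    return p.get("ship_to") == "work"

def exclude_movies(p):
    return p.get("category") == "movies"

def exclude_gift_wrapped(p):
    return p.get("gift_wrapped", False)

DEFAULT_RULES = [exclude_shipped_to_work, exclude_movies, exclude_gift_wrapped]

def profile_purchases(purchases, rules=DEFAULT_RULES):
    """Keep only purchases that no exclusion rule matches."""
    return [p for p in purchases if not any(rule(p) for rule in rules)]
```

Users could then add or remove rules from the list instead of editing the profile purchase by purchase.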

Remove excluded purchases from profile
- Users should be able to remove items from profile
- If purchase records are needed for legal reasons, users should be able to request that they not be accessible online

Better: options for controlling recent history

Use personae
- Amazon already allows users to store multiple credit cards and addresses
- Why not allow users to create personae linked to each, with the option of keeping recommendations and history separate (would allow easy way to separate work/home/gift personae)?

Allow users to access all privacy-related options in one place
- Currently privacy-related options are found with relevant features
- Users have to be aware of features to find the options
- Put them all in one place
- But also leave them with relevant features

I didn’t buy it for myself
- How about an “I didn’t buy it for myself” check-off box (perhaps automatically checked if gift wrapping is requested)

Personalizing privacy

Can we apply user modeling expertise to privacy?
- Personalized systems cause privacy concerns
- But can we use personalization to help address these concerns?

What is privacy?
“the claim of individuals… to determine for themselves when, how, and to what extent information about them is communicated to others.”
- Alan Westin, 1967

Privacy as process
“Each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication….”
- Alan Westin, 1967

But individuals don’t always engage in adjustment process
- Lack of knowledge about how info is used
- Lack of knowledge about how to exercise control
- Too difficult or inconvenient to exercise control
- Data collectors should inform users
- Data collectors should provide choices and controls
- Sounds like a job for a user model!

Example: Managing privacy at web sites
- Website privacy policies
  - Many posted
  - Few read
- What if your browser could read them for you?
  - Warn you not to shop at sites with bad policies
  - Automatically block cookies at those sites

Platform for Privacy Preferences (P3P)
- 2002 W3C Recommendation
- XML format for Web privacy policies
- Protocol enables clients to locate and fetch policies from servers
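To give a flavor of the XML format, a user agent might extract the data-use purposes a site declares. The fragment below is a hand-simplified sketch, not a complete, schema-valid P3P policy; the PURPOSE/RECIPIENT/RETENTION element names echo the specification's vocabulary, but treat the details here as illustrative.

```python
# Sketch: pulling declared purposes out of a (simplified) P3P policy.
import xml.etree.ElementTree as ET

SAMPLE_POLICY = """\
<POLICY name="sample" discuri="https://example.com/privacy">
  <STATEMENT>
    <PURPOSE><current/><telemarketing/></PURPOSE>
    <RECIPIENT><ours/></RECIPIENT>
    <RETENTION><indefinitely/></RETENTION>
  </STATEMENT>
</POLICY>
"""

def declared_purposes(policy_xml: str) -> set:
    """Collect the purpose element names from every STATEMENT."""
    root = ET.fromstring(policy_xml)
    purposes = set()
    for purpose in root.iter("PURPOSE"):
        purposes.update(child.tag for child in purpose)
    return purposes
```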

Privacy Bird
- P3P user agent originally developed by AT&T
- Free download and privacy search service
- Compares user preferences with P3P policies
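The comparison step can be sketched as matching a site's declared purposes against the purposes a user has disallowed. This is a hypothetical simplification of what a P3P user agent like Privacy Bird does; the class and result names ("green"/"red", after the bird icon) are illustrative, not the tool's actual API.

```python
# Sketch: comparing user privacy preferences with a site's P3P purposes.
from dataclasses import dataclass

@dataclass
class Preferences:
    disallowed_purposes: set  # e.g. {"telemarketing"}

def evaluate(site_purposes: set, prefs: Preferences) -> str:
    """Return 'green' if the policy matches preferences, else 'red'."""
    conflicts = site_purposes & prefs.disallowed_purposes
    return "red" if conflicts else "green"
```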

Link to opt-out page

I would like to give the bird some feedback
- “I read this policy and actually I think it’s ok”
- “I took advantage of the opt-out on this site so there is no problem”
- “This site is a banking site and I want to be extra cautious when doing online banking”

Especially important if bird takes automatic actions
- Not critical when bird is only informational
- But if bird blocks cookies, the wrong decision will get annoying

Can we learn user’s privacy preferences over time?
- Bad bird!
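One simple way such learning could work, sketched here purely as an assumption (the talk poses the question without prescribing a mechanism): treat each "this is actually ok" response as a signal, and stop warning about a site once the user has approved it enough times.

```python
# Sketch: learning per-site overrides from repeated user feedback.
from collections import defaultdict

class FeedbackLearner:
    def __init__(self, threshold: int = 2):
        self.approvals = defaultdict(int)  # site -> approval count
        self.threshold = threshold

    def record_approval(self, site: str):
        """User said the site's policy is actually acceptable."""
        self.approvals[site] += 1

    def should_warn(self, site: str, policy_conflicts: bool) -> bool:
        """Warn on a conflict unless the user has overridden this site."""
        if not policy_conflicts:
            return False
        return self.approvals[site] < self.threshold
```

A real system would need to be more careful, e.g. distinguishing "this policy is fine" from "I opted out here", which the feedback slide above treats as different signals.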

Other example applications for personalizing privacy
- Buddy lists: when to reveal presence information and to whom
- Friend finder services: when to reveal location information and what level of detail
- Personalized ecommerce sites: when to start and stop recording my actions, which persona to use

Conclusions
- Personalization often has real privacy risks
- Address these risks by minimizing data collection and storage, putting users in control
- Challenge: Can we make it easier for users to be in control by personalizing privacy?
