CMU Usable Privacy and Security Laboratory
Hey, That’s Personal!
Lorrie Faith Cranor, 28 July 2005

Presentation transcript:


2 Hey, That’s Personal!
Lorrie Faith Cranor
CMU Usable Privacy and Security Laboratory, http://cups.cs.cmu.edu/
28 July 2005
http://lorrie.cranor.org/

3 Outline
- Privacy risks from personalization
- Reducing privacy risks
- Personalizing privacy

4 Privacy risks from personalization

5 Unsolicited marketing
- Desire to avoid unwanted marketing causes some people to avoid giving out personal information

6 My computer can “figure things out about me”
- The little people inside my computer might know it’s me…
- …and they might tell their friends

7 Inaccurate inferences
- “My TiVo thinks I’m gay!”

8 Surprisingly accurate inferences
- Everyone wants to be understood. No one wants to be known.

9 You thought that on the Internet nobody knew you were a dog…
- …but then you started getting personalized ads for your favorite brand of dog food

10 Price discrimination
- Concerns about being charged higher prices
- Concerns about being treated differently

11 Revealing private information to other users of a computer
- Revealing info to family members or co-workers
  - Gift recipient learns about gifts in advance
  - Co-workers learn about a medical condition
- Revealing secrets that can unlock many accounts
  - Passwords, answers to secret questions, etc.

12 The Cranor family’s 25 most frequent grocery purchases (sorted by nutritional value)!

13 Exposing secrets to criminals
- Stalkers, identity thieves, etc.
- People who break into an account may be able to access profile info
- People may be able to probe recommender systems to learn profile information associated with other users

14 Subpoenas
- Records are often subpoenaed in patent disputes, child custody cases, civil litigation, criminal cases

15 Government surveillance
- Governments increasingly looking for personal records to mine in the name of fighting terrorism
- People may be subject to investigation even if they have done nothing wrong

16 Risks may be magnified in future
- Wireless location tracking
- Semantic web applications
- Ubiquitous computing

17 If you’re not careful, you may violate data protection laws
- Some jurisdictions have privacy laws that:
  - Restrict how data is collected and used
  - Require that you give notice, get consent, or offer privacy-protective options
  - Impose penalties if personal information is accidentally exposed

18 Reducing privacy risks

19 Axes of personalization

  Axis                      Tends to be MORE invasive   Tends to be LESS invasive
  Data collection method    Implicit                    Explicit
  Duration                  Persistent (profile)        Transient (task or session)
  User involvement          System initiated            User initiated
  Reliance on predictions   Prediction based            Content based

20 A variety of approaches to reducing privacy risks
- No single approach will always work
- Two types of approaches:
  - Reduce data collection and storage
  - Put users in control

21 Collection limitation: pseudonymous profiles
- Useful for reducing risk and complying with privacy laws when ID is not needed for personalization
- But a profile may become identifiable because of unique combinations of info, links with log data, unauthorized access to the user’s computer, etc.
- Profile info should always be stored separately from web usage logs and transaction records that might contain IP addresses or PII
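A minimal sketch of that separation, assuming a toy in-memory store (all names and structures here are illustrative, not from the talk): the profile is keyed by a random pseudonym, and PII-bearing transaction records live in a separate store with no shared key.

```python
import secrets

# Illustrative in-memory stores; in practice these would be separate
# databases with separate access controls.
profiles = {}      # pseudonym -> interest counts (no PII)
transactions = []  # order records with PII, kept apart from profiles

def new_pseudonym():
    """Issue a random identifier with no link to the user's identity."""
    return secrets.token_hex(16)

def record_interest(pseudonym, category):
    """Update the pseudonymous profile used for personalization."""
    profile = profiles.setdefault(pseudonym, {})
    profile[category] = profile.get(category, 0) + 1

def record_order(name, address, item):
    """PII goes only to the transaction store, never into the profile."""
    transactions.append({"name": name, "address": address, "item": item})
```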

22 Collection limitation: client-side profiles
- Useful for reducing risk and complying with laws
- Risk of exposure to other users of the computer remains; storing encrypted profiles can help
- Client-side profiles may be stored in cookies and replayed to a server that discards them after use
- Client-side scripting may allow personalization without ever sending personal info to the server
- For some applications, there is no reason to send data to the server
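A minimal sketch of the cookie variant, assuming a hypothetical `profile` cookie carrying a small JSON profile (the cookie name and format are invented for illustration): the server reads it, personalizes one response, and persists nothing.

```python
import json

def personalize_request(headers):
    """Use a client-held profile for one response, then discard it."""
    profile = {}
    for part in headers.get("Cookie", "").split(";"):
        name, _, value = part.strip().partition("=")
        if name == "profile":
            profile = json.loads(value)
    # Pick content from the profile; nothing is written server-side,
    # so the profile continues to exist only on the client.
    favorite = max(profile, key=profile.get) if profile else "general"
    return f"Showing {favorite} content"

headers = {"Cookie": 'profile={"books": 3, "music": 1}'}
print(personalize_request(headers))  # -> Showing books content
```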

23 Collection limitation: task-based personalization
- Focus on data associated with the current session or task; no user profile need be stored anywhere
- May allow for a simpler (and less expensive) system architecture too!
- May eliminate the problem of the system making recommendations that are not relevant to the current task
- Less “spooky” to users; the relationship between the current task and the resultant personalization is usually obvious
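A minimal sketch of the task-based approach, with an invented catalog of related items: recommendations derive only from what the user viewed in the current session, and the session state is dropped at the end, so no profile accumulates.

```python
# Hypothetical catalog relationships; in a real system these would
# come from item metadata, not from any user profile.
RELATED = {
    "printer": ["ink cartridges", "paper"],
    "dog food": ["dog treats", "leashes"],
}

def recommend(session_views):
    """Suggest items related to this session's activity only."""
    suggestions = []
    for item in session_views:
        suggestions.extend(RELATED.get(item, []))
    return suggestions

# Per-session state, discarded when the session ends.
print(recommend(["printer"]))  # -> ['ink cartridges', 'paper']
```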

24 Putting users in control
- Users should be able to control:
  - what information is stored in their profile
  - how it may be used and disclosed

25 Developing a good user interface for this is complicated
- Setting preferences can be tedious
- Creating overall rules that can be applied on the fly as new profile data is collected requires deep understanding and the ability to anticipate privacy concerns

26 Possible approaches
- Provide reasonable default rules, with the ability to add/change rules or specify preferences for handling of specific data:
  - Up front
  - With each action
  - After the fact
- Explicit privacy preference prompts during the transaction process
- Allow multiple personae

27 Example: Google Search History

28 Amazon.com privacy makeover

29 Streamline menu navigation for customization

30 Provide a way to set up default rules
- Every time users make a new purchase that they want to rate or exclude, they have to edit profile info
- There should be a way to set up default rules, for example:
  - Exclude all purchases
  - Exclude all purchases shipped to my work address
  - Exclude all movie purchases
  - Exclude all purchases I had gift wrapped
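A minimal sketch of how such default exclusion rules might be represented, with invented field names: each rule is a predicate over a purchase, and a purchase enters the profile only if no rule matches.

```python
from dataclasses import dataclass

@dataclass
class Purchase:
    item: str
    category: str
    ship_to: str
    gift_wrapped: bool

# Default rules mirroring the slide's examples; users could add,
# change, or remove rules.
DEFAULT_RULES = [
    lambda p: p.ship_to == "work",    # exclude purchases shipped to work
    lambda p: p.category == "movie",  # exclude all movie purchases
    lambda p: p.gift_wrapped,         # exclude gift-wrapped purchases
]

def include_in_profile(purchase, rules=DEFAULT_RULES):
    """A purchase enters the profile only if no exclusion rule matches."""
    return not any(rule(purchase) for rule in rules)

print(include_in_profile(Purchase("Casablanca DVD", "movie", "home", False)))
# -> False: movie purchases stay out of the recommender profile
```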

31 Remove excluded purchases from profile
- Users should be able to remove items from their profile
- If purchase records are needed for legal reasons, users should be able to request that they not be accessible online

32 Better: options for controlling recent history

33 Use personae
- Amazon already allows users to store multiple credit cards and addresses
- Why not allow users to create personae linked to each, with the option of keeping recommendations and history separate? This would be an easy way to separate work/home/gift personae.

34 Allow users to access all privacy-related options in one place
- Currently privacy-related options are found with relevant features
- Users have to be aware of features to find the options
- Put them all in one place
- But also leave them with relevant features

35 “I didn’t buy it for myself”
- How about an “I didn’t buy it for myself” check-off box, perhaps automatically checked if gift wrapping is requested?

36 Personalizing privacy

37 Can we apply user modeling expertise to privacy?
- Personalized systems cause privacy concerns
- But can we use personalization to help address these concerns?

38 What is privacy?
“the claim of individuals… to determine for themselves when, how, and to what extent information about them is communicated to others.”
- Alan Westin, 1967

39 Privacy as process
“Each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication….”
- Alan Westin, 1967

40 But individuals don’t always engage in the adjustment process
- Lack of knowledge about how info is used
- Lack of knowledge about how to exercise control
- Too difficult or inconvenient to exercise control
- Data collectors should inform users
- Data collectors should provide choices and controls
- Sounds like a job for a user model!

41 Example: managing privacy at web sites
- Website privacy policies:
  - Many posted
  - Few read
- What if your browser could read them for you?
  - Warn you not to shop at sites with bad policies
  - Automatically block cookies at those sites

42 Platform for Privacy Preferences (P3P)
- 2002 W3C Recommendation
- XML format for Web privacy policies
- Protocol enables clients to locate and fetch policies from servers
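A minimal sketch of the client side of that protocol: P3P 1.0 sites publish a policy reference file at the well-known location /w3c/p3p.xml, which a user agent can fetch and parse to find the policies in force (error handling and the alternative HTTP-header and link-tag discovery mechanisms are omitted).

```python
import urllib.request
import xml.etree.ElementTree as ET

P3P_NS = "{http://www.w3.org/2002/01/P3Pv1}"  # P3P 1.0 namespace

def fetch_policy_refs(site):
    """Fetch a site's P3P policy reference file and list the policy
    URIs it points to."""
    url = site.rstrip("/") + "/w3c/p3p.xml"  # well-known location
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    return [ref.get("about") for ref in tree.iter(P3P_NS + "POLICY-REF")]
```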

43 Privacy Bird
- P3P user agent originally developed by AT&T
- Free download and privacy search service at http://privacybird.com/
- Compares user preferences with P3P policies
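The comparison step could look something like the sketch below. This is a deliberately simplified stand-in, not Privacy Bird's actual logic: a site's policy is reduced to (data category, purpose) pairs and checked against combinations the user objects to.

```python
# Combinations the user objects to (illustrative values).
user_preferences = {
    ("health", "telemarketing"),
    ("financial", "marketing"),
}

def evaluate(policy_statements):
    """Return the policy statements that conflict with the user's
    preferences; an empty list means the bird can signal 'ok'."""
    return [s for s in policy_statements if s in user_preferences]

site_policy = {("purchase", "current"), ("health", "telemarketing")}
print(evaluate(site_policy))  # -> [('health', 'telemarketing')]
```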


47 Link to opt-out page

48 I would like to give the bird some feedback
- “I read this policy and actually I think it’s ok”
- “I took advantage of the opt-out on this site so there is no problem”
- “This site is a banking site and I want to be extra cautious when doing online banking”

49 Especially important if the bird takes automatic actions
- Not critical when the bird is only informational
- But if the bird blocks cookies, a wrong decision will get annoying

50 Can we learn the user’s privacy preferences over time?
“Bad bird!”
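The talk poses this as an open question. One conceivable approach, sketched below with invented names and an arbitrary threshold, is to treat explicit feedback like “bad bird” as training data: corrections become per-site overrides, and a global rule the user keeps overruling is eventually dropped.

```python
from collections import Counter

overrides = {}                 # site -> forced verdict ("ok" or "warn")
correction_counts = Counter()  # how often each rule was overruled

def give_feedback(site, rule, user_says_ok, preferences):
    """Record the user's reaction to a warning triggered by `rule`."""
    if user_says_ok:
        overrides[site] = "ok"             # trust this site from now on
        correction_counts[rule] += 1
        if correction_counts[rule] >= 3:   # arbitrary threshold
            preferences.discard(rule)      # the user doesn't hold this rule
    else:
        overrides[site] = "warn"           # stay strict at this site

prefs = {("health", "telemarketing")}
give_feedback("bank.example", ("health", "telemarketing"), True, prefs)
```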


53 Other example applications for personalizing privacy
- Buddy lists: when to reveal presence information, and to whom
- Friend finder services: when to reveal location information, and at what level of detail
- Personalized ecommerce sites: when to start and stop recording my actions, and which persona to use

54 Conclusions
- Personalization often has real privacy risks
- Address these risks by minimizing data collection and storage and by putting users in control
- Challenge: can we make it easier for users to be in control by personalizing privacy?

55 CMU Usable Privacy and Security Laboratory http://cups.cs.cmu.edu/

