User-Controllable Security and Privacy
Norman M. Sadeh
ISR - School of Computer Science, Carnegie Mellon University
Copyright © Norman M. Sadeh, CMU/Microsoft Mindswap – Oct

Privacy in Mobile & Pervasive Computing
- MyCampus project over the past 7 years
- Piloted a number of context-aware applications on campus
- Privacy as a major impediment to adoption
Wikipedia’s definition of privacy: “… the ability of an individual or group to keep their lives and personal affairs out of public view, or to control the flow of information about themselves. Privacy is the ability of an individual or organization to reveal oneself selectively…”
Computational Thinking Challenge
…But lay users (and even “experts”) are not very good at defining privacy policies:
- Complexity of people’s policies
- “One size fits all” often doesn’t apply
- Policies change over time
- Poor understanding of the consequences of how one’s information will be used
Trust engine technologies are ahead of usability research
Question
Can we develop technologies that empower users to specify their policies more accurately?
And some related questions:
- User burden vs. accuracy (including the expressiveness issue)
- How does this change from one application to another, and from one user to another?
Three Application Domains
- MyCampus – current focus: People Finder
- Grey – defining policies to control access to rooms in a building
- IMBuddy – contextual instant messaging
People Finder Architecture
- Combines GPS, GSM, and WiFi positioning
- Available on cell phones and laptops
- Each user (e.g., Jim, Mary) has a Policy Enforcing Agent (PEA) and a knowledge base (KB); requests are mediated through the MyCampus Server
- Policies represented in a rule extension of the OWL language
People’s Policies Are Often Varied & Complex
Users’ willingness to share their location depends on:
- Who is asking
- When
- Where they are
- What they are doing
- Who they are with
- And more…
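To make this rule structure concrete, here is a minimal Python sketch of location-disclosure rules conditioned on who is asking, when, and where the owner is. This is an illustration only, not the OWL-based representation actually used in People Finder; all field names, rule attributes, and the default-deny choice are assumptions for the example.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical request record; the field names are illustrative,
# not the actual People Finder schema.
@dataclass
class LocationRequest:
    requester: str       # who is asking
    group: str           # e.g. "colleague", "friend"
    at: time             # when the request arrives
    owner_location: str  # where the target user currently is

# A rule grants access only when every condition it names matches;
# conditions a rule omits are unconstrained.
def rule_matches(rule: dict, req: LocationRequest) -> bool:
    if "groups" in rule and req.group not in rule["groups"]:
        return False
    if "from" in rule and not (rule["from"] <= req.at <= rule["until"]):
        return False
    if "locations" in rule and req.owner_location not in rule["locations"]:
        return False
    return True

def allowed(policy: list[dict], req: LocationRequest) -> bool:
    # Default-deny: disclose location only if some rule explicitly allows it.
    return any(rule_matches(r, req) for r in policy)

# Example policy: "my colleagues can see my location between 8am and 5pm,
# but only while I am on campus".
policy = [{"groups": {"colleague"},
           "from": time(8, 0), "until": time(17, 0),
           "locations": {"campus"}}]

req = LocationRequest("Jim", "colleague", time(10, 30), "campus")
print(allowed(policy, req))  # True: every condition of the rule matches
```

Even this toy version shows why policies get hard to author: each added condition (time window, location, group) multiplies the cases a user must anticipate.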
People Finder – Defining Rules
Users Are Not Good At Defining Policies
People Finder application: lab study with 19 users, 30 queries per user
[Table: mean and standard deviation (in seconds) of time spent on rule creation, rule maintenance, and in total]
…and it’s not for lack of trying…
It’s Not Because of the Interface
Only Slight Correlation with Number of Rules
- Total of 30 requests
- Post-hoc accuracy
Only Slight Correlation with Time Spent
- Total of 30 requests
- Post-hoc accuracy
Some Users Realize They Can’t Get It Right
An adoption impediment
Approach
[Diagram: a Policy Support Agent (the project focus) sits between the user interface and pervasive computing environments; it combines policy engine(s), credentials and policies, explanation, learning, dialog, and meta-control, and interacts with resources, organizations, and other users, each with their own policies]
Importance of Feedback – Notifications
People Finder application
Feedback – Summaries
IMBuddy application (courtesy: Jason Hong)
IMBuddy Evaluation
Usefulness of bubble notification: 1.6 (σ=0.6) on a scale of 1 to 5, where 1 = strongly agree that it was useful, 3 = neutral, 5 = strongly disagree
Feedback Through Audit Logs – Explanation
Machine Learning
Audit logs can be used to refine a user’s policies
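The idea can be sketched in a few lines of Python. The audit-log entries below are hypothetical, and the majority-vote heuristic stands in for the actual learning algorithm used in the pilots; it only illustrates how audited allow/deny decisions can be turned into candidate policy rules.

```python
from collections import Counter

# Hypothetical audit-log entries: (requester group, hour of day,
# the decision the user indicated they actually wanted when auditing).
audit_log = [
    ("colleague", 10, "allow"), ("colleague", 14, "allow"),
    ("colleague", 20, "deny"),  ("friend", 11, "allow"),
    ("friend", 22, "allow"),    ("stranger", 12, "deny"),
    ("stranger", 15, "deny"),
]

# For each requester group, if the user's audited decisions are
# consistent enough, propose a rule capturing the majority behavior.
def suggest_rules(log, threshold=0.75):
    by_group = {}
    for group, hour, decision in log:
        by_group.setdefault(group, []).append(decision)
    suggestions = {}
    for group, decisions in by_group.items():
        decision, count = Counter(decisions).most_common(1)[0]
        if count / len(decisions) >= threshold:
            suggestions[group] = decision
    return suggestions

print(suggest_rules(audit_log))
# {'friend': 'allow', 'stranger': 'deny'} -- "colleague" is too mixed
# (2 allow vs. 1 deny, 0.67 < threshold) to warrant a blanket rule
```

Mixed groups like "colleague" are exactly where a richer learner would look for extra conditions (e.g., the hour of the request) rather than a blanket rule.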
Lab Study
More Recent Pilots
- 3 pilots, totaling over 60 participants; 12 most active target users
- Accuracy: user-defined rules 79% vs. machine learning 91%
- Note: includes benefits of auditing
Ongoing Work
- Learning is a “black box” technology: users are unlikely to understand the policies they end up with
- Can we develop technology that incrementally suggests policy changes to users?
- Tradeoff between rapid convergence and maintaining policies that users can relate to
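One way to picture incremental suggestion, as opposed to black-box relearning, is a search over small edits to a rule the user already has. The sketch below is purely illustrative (the rule, log, and single-knob search are all assumptions, not the project's algorithm): it adjusts one readable parameter, the end of a time window, to better match audited decisions, so the refined policy stays in a form the user can relate to.

```python
# Current user-authored rule for colleagues: allow during [start, end)
# hours. Keeping the rule in this shape (rather than replacing it with
# a learned model) is the point of the incremental approach.
rule = {"start": 8, "end": 17}

# Hypothetical audited requests from colleagues: (hour, decision the
# user indicated they actually wanted).
log = [(9, "allow"), (12, "allow"), (17, "allow"),
       (18, "allow"), (21, "deny")]

def mismatches(rule, log):
    # Count audited decisions the rule gets wrong.
    errs = 0
    for hour, wanted in log:
        predicted = "allow" if rule["start"] <= hour < rule["end"] else "deny"
        errs += predicted != wanted
    return errs

def suggest_end(rule, log, deltas=(-2, -1, 0, 1, 2)):
    # Try small shifts of the window end; suggest the one that best
    # matches the audit log. A real system would search more knobs.
    best = min(deltas,
               key=lambda d: mismatches({**rule, "end": rule["end"] + d}, log))
    return rule["end"] + best

print(suggest_end(rule, log))  # 19: extending the window to 7pm
                               # fixes both mismatched "allow" decisions
```

The suggestion ("extend your colleague window to 7pm?") can then be shown to the user for approval, trading some convergence speed for policies that remain understandable.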
Policy Evolution
Other Promising Approaches
- Visualization techniques
- Explanations & dialogues
Overall Vision
New technology spanning policy creation, policy enforcement, and policy auditing & refinement. An illustrative scenario:
- Policy creation & visualization (Jane): “My colleagues can see my location on weekdays between 8am and 5pm”
- Policy enforcement (Bob, on his phone): “Jane and Eric are late for our meeting. Show me where they are!” → “Jane is in Oakland, but I can’t access Eric’s location”
- Explanation (Eric): “Why couldn’t Bob see where I was? Bob is a colleague.” → “So far only your friends can see where you are”
- Dialog & learning from the past (Eric): “What if my colleagues could see my location too?” → “In the past you denied access to your colleague Steve” → “OK, make it just my superiors”
Some of the Things We’ve Learned So Far
- Adoption will depend on whether users feel they have adequate control over the disclosure of their contextual information
- People often have rather complex privacy preferences
- People are not good at specifying their policies
- It is not easy to identify good default policies beyond simply denying all requests
- Policies tend to become more complex as users grow more sophisticated, allowing more requests but in an increasingly selective way
- Auditing is critical
- Learning, explanation & dialogs appear promising
- This applies to both privacy and security policies
Q&A
Come and check out our poster this evening
Some References
User-Controllable Security and Privacy Project:
- Norman Sadeh, Fabien Gandon and Oh Byung Kwon, “Ambient Intelligence: The MyCampus Experience”, chapter in “Ambient Intelligence and Pervasive Computing”, eds. T. Vasilakos and W. Pedrycz, Artech House. (Also available as Tech. Report CMU-ISRI, School of Computer Science, Carnegie Mellon University)
- Jason Cornwell, Ian Fette, Gary Hsieh, Madhu Prabaker, Jinghai Rao, Karen Tang, Kami Vaniea, Lujo Bauer, Lorrie Cranor, Jason Hong, Bruce McLaren, Mike Reiter, Norman Sadeh, “User-Controllable Security and Privacy for Pervasive Computing”, Proceedings of the 8th IEEE Workshop on Mobile Computing Systems and Applications (HotMobile 2007), February 2007
- M. Prabaker, J. Rao, I. Fette, P. Kelley, L. Cranor, J. Hong, and N. Sadeh, “Understanding and Capturing People’s Privacy Policies in a People Finder Application”, 2007 Ubicomp Workshop on Privacy, Austria, Sept. 2007
Acknowledgements
Collaborators:
- Faculty: L. Bauer, L. Cranor, J. Hong, B. McLaren, M. Reiter, P. Steenkiste
- Post-docs & students: P. Drielsma, M. Prabaker, J. Rao, I. Fette, P. Kelley, K. Vaniea, R. Reeder, A. Sardinha, J. Albertson, D. Hacker, J. Pincar, M. Weber
The work presented in these slides is supported in part by NSF Cyber Trust grant CNS and ARO research grant DAAD (“Perpetually Available and Secure Information Systems”) to Carnegie Mellon University’s CyLab.