Privacy in Wearable Computing
Thad Starner
Contextual Computing Group, College of Computing, Georgia Tech
Handouts
- The Challenges of Wearable Computing (Starner)
- Privacy Protection in the 1980's (Turn)
- Excerpts from Agre and Rotenberg, Technology and Privacy: The New Landscape
Background
- College of Computing, Georgia Tech
- Founder, Charmed Technologies
- Founder, MIT Wearable Computing Project
- IEEE ISWC and Wearable Information Systems TC
- Everyday wearable use since 1993
Wearable Challenges
- Power and heat (MIPS/watt)
- On- and off-body networking (bits/joule)
- Privacy
- Interface (additional capability vs. load)
  - User interface (cognitive load)
  - Ergonomics/human factors (weight, heat, etc.)
(These are intertwined – changing one affects the others.)
Resources (wearable and ubiquitous computing)
- Phil Agre's Red Rock Eater list
- ACM technology alerts
- Foner, "Political Artifacts and Personal Privacy: The Yenta Multi-Agent Distributed Matchmaking System" (MIT PhD thesis)
- Langheinrich, "Privacy by Design"
- EPIC, EFF, ACLU, CPSR, Privacy Journal (online), privacyinternational.org, privacy.org
- Colleagues: Rhodes, Bruckman, Foner, Kapor, Mann, Pentland, wearables research mailing list, privacy panel at ISWC '98
Resources (general)
- Pool: Technologies of Freedom; The Social Impact of the Telephone
- Agre and Rotenberg: Technology and Privacy: The New Landscape
- Westin: Privacy and Freedom
- R. E. Smith: Our Vanishing Right to Privacy
- Rothfeder: Privacy for Sale
- Miller: The Assault on Privacy
- McCarthy: The Rights of Publicity and Privacy
- Rosen: The Unwanted Gaze (also the article "Is Nothing Private?")
Resources (general, cont.)
- Computers, Freedom, and Privacy
- IEEE Security and Privacy
Definitions
- Privacy – the right of individuals to control the collection and use of personal information about themselves
- Security – the protection of information from unauthorized users
Definition (Gellman)
"No definition … is possible, because issues are fundamentally matters of values, interests, and power"
Black's Law Dictionary
Right of privacy: The right to be let alone; the right of a person to be free from unwanted publicity; and the right to live without unwarranted interference by the public in matters with which the public is not necessarily concerned … concept of ordered liberty, and such right prevents governmental interference in the intimate personal relationships or activities, freedoms of the individual to make fundamental choices involving himself, his family, and his relationship with others.
Privacy Violation
Tort – "civil wrong":
- Unreasonable intrusion upon the seclusion of another individual, if such intrusion would be highly offensive to a reasonable person
- Appropriation of the other's name or likeness for one's own use or benefit
Privacy Violation (cont.)
- Unreasonable publicity given to the other's private life, if the published matter would be highly offensive to a reasonable person and is of no concern to the public
- Publicity that unreasonably places the other in a false light before the public, where the false light would be highly offensive to a reasonable person and the publisher knows the falsity of the published matter
Revolutionary War to Now
- The Wilkes case in England
- The Packwood diaries
- Clinton
What are PDAs and wearables?
Filing cabinets?
- Who bought the machine?
- Who owns it?
Diaries?
- What is its use?
- Is there a separate section for private info?
A Selection of Cases
- "Fatty" Arbuckle – tabloids, ubiquitous surveillance, and the changing social perception of smear campaigns
- Larry Flynt and the Republican witch hunt
- Linda Tripp
- Cellular phone monitoring
  - Newt Gingrich
  - Prince Charles
  - 911 emergency phone call
- E-ZPass
Anti-privacy Arguments
"If you have nothing to hide, then you should have no concern for your privacy."
- There are many personal situations people might not want exposed (victim of rape, child abuse, fraud, …)
- Opponents will use facts in the worst possible light (politicians, tabloids, etc.)
- Unfair, unregulated environments (racism, health concerns, etc.)
Anti-privacy Arguments
"I don't care about privacy."
- But other people have the right to care about theirs
- Equivalent to saying "I don't say anything controversial, so therefore I don't care about free speech"
Anti-privacy Arguments
"Privacy discussion is overblown. Big organizations don't really care about individuals, just narrow goals."
- Most harm is in the aggregate, but it can still have large effects on the individual
  - High-cost home mortgage loans
- FBI files – sabotage against nonviolent dissidents
- Whistleblowers
  - Ralph Nader and GM
- Individuals against individuals
  - University professors and the Freedom of Information Act
Anti-privacy Arguments
"Surveillance is inevitable – the real issue is achieving a balance of power where we can watch the people who are watching us."
- Equal access to information is not the same as equal ability to use that information. A corporation or government can, and often does, dedicate resources to watching a person or group of people that few individuals can match.
- Political/technical feasibility of forcing corporations/governments/elites to comply
Anti-privacy Arguments
"Computer technology is just returning us to the rural village of yesteryear / technology brings nothing new to the privacy arguments."
- Simply incorrect: computers allow large-scale data mining that was previously impossible with paper
- In rural villages, one did not have to worry about corporations with large budgets and large databases. A rural village implies some sense of parity.
Anti-privacy Arguments
"We must balance privacy against industrial concerns/society/government expense."
- Assumes you cannot have the same services in a privacy-preserving manner. In most cases you can – even at lower cost!
- Ignores the industry created by having active privacy protections
Anti-privacy Arguments
"Individuals can make up their own minds about what to reveal."
- Not if they don't have the proper information
- Currently, the companies that would like to benefit from your information inform the individual of the risks – if they bother at all
- This opinion pits one person's capabilities against those of large companies with specialists who concentrate on exploiting such data. It also assumes technology will not make new uses of such data possible in the future.
Anti-privacy Arguments
"There is no privacy in public."
- Reasonable expectation of privacy
- U.S. law used to be designed to protect people, not places (1967)
- The issue is aggregation of data, not individual instances
- Just because it can be done doesn't mean it isn't wrong.
Anti-privacy Arguments
"Companies that distribute collections of personal data are protected under free speech laws."
- Who owns the bits? Is personal information property? If so, the rules of copyright and patent are accepted restrictions on the use of such speech.
Anti-privacy Arguments
"Tagging a car or cellular phone does not equal tagging a person – it is only circumstantial evidence."
- Circumstantial evidence is used all the time, both in and out of court (tabloids)
  - License plates
  - E-ZPass in NYC
- License for more directed investigation/harassment
Anti-privacy Arguments
"People on welfare should expect a reduction of privacy in exchange for benefits" / "The elite can violate privacy with enough money – why not make that available to everyone?"
- The right to privacy should not vary according to social class – there is no reason for it to
- In any implementation, someone can take advantage. That does not mean we should not design our systems as well as possible.
U.S. Privacy Act of 1974 (Turn and Langheinrich)
- Openness and transparency – no secret record keeping
- Individual participation – ability to see one's own records
- Collection limitation – don't exceed needs
- Data quality – relevant to purpose and kept up to date
- Use limitation – only authorized personnel, for a specific purpose
- Reasonable security
- Accountability
U.S. Privacy Act of 1974
Sounds reasonable, but…
- Only applicable to federal agencies and certain contractors!
- Conflicts with Freedom of Information Acts!
EU Directive 95/46/EC
- Data may be transferred only to non-EU countries with "adequate" levels of privacy protection
- Requires explicit consent
- U.S. Safe Harbor – companies self-certify
  - HP is the only major player
Detecting Privacy Violations
Violations must be punishable and detectable
- Different aliases
  - E-mail
  - True names, addresses
- Trap entries
  - Bogus street names
  - Bogus student names/addresses
- Ralph Nader and General Motors
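The alias technique above can be sketched in code. This is a minimal illustration, not a hardened system: it assumes a mail provider that delivers `user+tag@` addresses to `user@`, and all names, addresses, and the secret are hypothetical.

```python
import hashlib

def make_alias(base_user: str, recipient: str, secret: str) -> str:
    """Derive a distinct e-mail alias for each organization an address is
    shared with. The embedded tag later identifies which recipient leaked
    or sold the address -- the electronic analogue of a trap entry."""
    tag = hashlib.sha256(f"{secret}:{recipient}".encode()).hexdigest()[:8]
    return f"{base_user}+{tag}@example.org"

def identify_leaker(leaked_address: str, base_user: str,
                    recipients: list, secret: str):
    """Match a leaked alias back to the recipient it was issued to,
    or return None if the address matches no issued alias."""
    for recipient in recipients:
        if make_alias(base_user, recipient, secret) == leaked_address:
            return recipient
    return None
```

If unsolicited mail arrives at the alias issued to one organization, that organization (or its data-sharing chain) is implicated – the same logic as the bogus street names and student entries on the slide.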
Leonard Foner
"Those who design systems which handle personal information therefore have a special duty: they must not design systems which unnecessarily require, induce, persuade, or coerce individuals into giving up personal privacy in order to avail themselves of the benefit of the system being designed."
Risks of Ubiquitous Computing (Langheinrich)
- Ubiquity
- Invisibility
- Sensing
- Memory amplification
  - Bush's Memex
Wearable vs. Environmental Approach
- Data is collected on the user
- Data is released by the user
- Extreme form: all sensing powered by the user
  - On-body sensing
  - RFID
- Wearable confounder
- Little brother vs. big brother
Privacy Barriers (Starner)
- Physical
- Technological
  - Encryption, biometrics, etc.
- Legislative
  - Changing laws, software monopolies, speed of innovation, enforcement
- Social
- Obscuring
Privacy by Design (Langheinrich)
- Notice
- Choice and consent
- Anonymity and pseudonymity
- Proximity and locality
- Adequate security
- Access and recourse
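The pseudonymity principle in the list above can be made concrete with a small sketch. One common way to realize it (an assumption here, not Langheinrich's specific mechanism) is to derive a stable per-service pseudonym from a keyed hash, so each service can keep per-user state without being able to link its records to another service's. The service names and key below are hypothetical.

```python
import hashlib
import hmac

def pseudonym(user_id: str, service: str, master_key: bytes) -> str:
    """Derive a stable, service-specific pseudonym.

    The same user resolves to the same identifier within one service,
    but identifiers for different services cannot be linked without
    knowing the master key held by the user (or their wearable)."""
    msg = f"{service}:{user_id}".encode()
    return hmac.new(master_key, msg, hashlib.sha256).hexdigest()[:16]
```

Keeping the master key on the wearable fits the deck's wearable-vs-environmental argument: the user's own device, not the infrastructure, controls whether two observations are linkable.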
Case Study – RAVE
Case Studies – Active Badge
An active badge system for security and convenience at a U.S. state university:
- Access to secure areas at a distance
- Purchasing snacks at vending machines
- Tracking telephone calls
- Auto-login
- …
Dangers
Security
- Spoofing
- Tracking
- Traffic analysis
- Social hacking
- Bribery
Legal access to data
- FOI requests
- Discrimination/harassment lawsuits
Case Studies
- Global Positioning System
- Locust
- Cellular phone 911 tracking
- Pagers
- E-ZPass
- Automobile black boxes
Case Studies (cont.)
- Patent search system
- MIT Wearable Computing Project
  - Remembrance Agent
    - Augmented memory of previous conversations
    - Deniability
- Implicit, unpredictable violations of shared databases
  - Webcam
Case Studies (cont.)
- Face recognition
  - FaceIt
  - Eigenfaces
  - Social engagement
- Augmented reality and Snow Crash's CIC
Technologists Are the First Line of Protection
- Design the system so that it is easier and more economical to preserve privacy than to violate it
- Provide mechanisms by which privacy violations can be detected
- Use combinations of mechanisms so that privacy and security can be adjusted for different social conditions and new threats
- Design with the guidelines in mind
Benjamin Franklin
"They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety."