1
Quantifying Location Privacy: The Case of Sporadic Location Exposure
Reza Shokri, George Theodorakopoulos, George Danezis, Jean-Pierre Hubaux, Jean-Yves Le Boudec
The 11th Privacy Enhancing Technologies Symposium (PETS), July 2011
2
● Assume time and location are discrete…
3
Location-based Services: Sporadic vs. Continuous Location Exposure
– Mobility Model: actual location of user 'u' at time 't'
– Application Model: is the location exposed? 0/1
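A minimal sketch of this setup, assuming the mobility model is a Markov chain over a handful of discrete regions and the application exposes each location independently with a made-up probability; the names `transition`, `expose_prob`, and `simulate_trace` are illustrative, not the paper's notation:

```python
import numpy as np

rng = np.random.default_rng(0)

R = 4                      # number of discrete regions
T = 10                     # number of discrete time instants
expose_prob = 0.3          # probability the application exposes the location (sporadic)

# Mobility model: a Markov chain over regions (each row sums to 1).
transition = rng.dirichlet(np.ones(R), size=R)

def simulate_trace(start_region=0):
    """Return the actual trace and the 0/1 exposure indicators of one user."""
    trace, exposed = [start_region], [rng.random() < expose_prob]
    for _ in range(1, T):
        trace.append(rng.choice(R, p=transition[trace[-1]]))
        exposed.append(rng.random() < expose_prob)
    return trace, exposed

trace, exposed = simulate_trace()
print(list(zip(trace, exposed)))   # (actual location, is it exposed?) per time instant
```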
4
Protection Mechanisms
● Consider a given user at a given time instant
[Figure: the actual trajectory of user u_i on a 5×5 grid of regions (1–25) passes through the application, which exposes or hides the actual location, and then through the protection mechanism, which can hide, fake, obfuscate, and anonymize it to produce the observed location.]
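One way to read this pipeline in code, for a single (user, time) event; the hide/fake probabilities and the coarsening rule below are made-up placeholders, not the mechanisms evaluated in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def protection_mechanism(region, exposed, pseudonym, n_regions=25):
    """Sketch: map one actual (user, time, region) event to what the adversary observes."""
    if not exposed or rng.random() < 0.2:        # hidden by the application or by the LPPM
        observed = None
    elif rng.random() < 0.1:                     # fake: report a uniformly chosen region instead
        observed = int(rng.integers(n_regions)) + 1
    else:                                        # obfuscate: drop the low-order bit of the identifier
        observed = (region >> 1) << 1
    return pseudonym, observed                   # anonymize: only a pseudonym is attached

print(protection_mechanism(region=13, exposed=True, pseudonym="p7"))
```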
5
Protection Mechanism Model
● User pseudonyms stay unchanged over time…
– User-to-pseudonym assignment
– Observed location of pseudonymous user u' at time t
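For instance, a fixed assignment could be drawn once as a random permutation and reused at every time instant; a toy sketch with illustrative names:

```python
import numpy as np

rng = np.random.default_rng(1)
users = ["alice", "bob", "carol"]

# One random user-to-pseudonym assignment, drawn once and kept for all time instants.
pseudonyms = rng.permutation(len(users))
assignment = {u: f"p{p}" for u, p in zip(users, pseudonyms)}
print(assignment)   # e.g. {'alice': 'p2', 'bob': 'p0', 'carol': 'p1'}
```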
6
Adversary Background Knowledge
– Stronger: the users' transition probabilities between locations (a Markov-chain transition probability matrix)
– Weaker: the users' location distribution over space (the stationary distribution of the transition probability matrix)
● The adversary also knows the PDFs associated with the 'application' and the 'protection mechanism'
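The weaker knowledge can be computed from the stronger one: the stationary distribution is the probability vector π with πP = π. A small numeric sketch (the transition matrix here is made up):

```python
import numpy as np

# Stronger knowledge: per-user transition probability matrix over regions.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Weaker knowledge: stationary distribution pi with pi @ P = pi and sum(pi) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi /= pi.sum()
print(pi)   # long-run fraction of time spent in each region
```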
7
Adversary Localization Attack
– What is the probability that Alice is at a given location at a specific time instant, given the observation and the adversary's background knowledge?
– Bayesian inference relying on a Hidden Markov Model: Forward-Backward algorithm, maximum-weight assignment
● Find the details of the attack in the paper
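A compact sketch of the forward-backward step for one pseudonymous trace, assuming the application and protection-mechanism PDFs have been collapsed into a single emission matrix; this simplification and all names are assumptions for illustration, not the paper's construction:

```python
import numpy as np

def localize(P, pi, emission, obs):
    """Posterior Pr[actual region at time t | observed trace] via forward-backward.

    P        : (R, R) transition matrix (adversary's mobility profile)
    pi       : (R,)   prior over the initial region
    emission : (R, O) Pr[observed symbol | actual region] (application + LPPM)
    obs      : list of observed symbols, one per time instant
    """
    T, R = len(obs), P.shape[0]
    fwd, bwd = np.zeros((T, R)), np.zeros((T, R))
    fwd[0] = pi * emission[:, obs[0]]
    for t in range(1, T):
        fwd[t] = (fwd[t - 1] @ P) * emission[:, obs[t]]
    bwd[-1] = 1.0
    for t in range(T - 2, -1, -1):
        bwd[t] = P @ (emission[:, obs[t + 1]] * bwd[t + 1])
    post = fwd * bwd
    return post / post.sum(axis=1, keepdims=True)   # one distribution per time instant
```

For long traces the forward and backward vectors should be rescaled at each step to avoid numerical underflow.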
8
Location-Privacy Metrics
Anonymity?
– How successfully can the adversary link the user pseudonyms to their identities?
– Metric: the percentage of correct assignments
Location Privacy?
– How correctly can the adversary localize the users?
– Metric: expected estimation error (distortion)
● Justification: R. Shokri, G. Theodorakopoulos, J.-Y. Le Boudec, J.-P. Hubaux. 'Quantifying Location Privacy'. IEEE S&P 2011
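Both metrics are easy to state as code, using the posterior distribution produced by a localization attack like the one sketched above; the distance function and the variable names are placeholders:

```python
import numpy as np

def anonymity(true_identity, guessed_identity):
    """Fraction of pseudonyms whose identity the adversary assigned correctly."""
    return np.mean([guessed_identity[p] == true_identity[p] for p in true_identity])

def location_privacy(posterior, actual_trace, distance):
    """Expected estimation error (distortion): the adversary's expected distance
    from the actual location, averaged over time."""
    R = posterior.shape[1]
    errors = [sum(posterior[t, r] * distance(r, actual_trace[t]) for r in range(R))
              for t in range(len(actual_trace))]
    return float(np.mean(errors))
```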
9
Evaluation: Location-Privacy Meter
– Input: actual traces (vehicular traces in San Francisco; 20 mobile users moving in 40 regions)
– Output: 'Anonymity' and 'Location Privacy' of users over time
– Modules: associated PDFs of the 'Location-based Application' and the 'Location-Privacy Preserving Mechanisms'
● More information: http://lca.epfl.ch/projects/quantifyingprivacy
10
Evaluation
Location-based Applications
– once-in-a-while: APP(o, Θ)
– local search: APP(s, Θ)
Location-Privacy Preserving Mechanisms: LPPM(φ, ρ, {u, g})
– fake-location injection (with rate φ): (u) uniform selection, (g) selection according to the average mobility profile
– location obfuscation (with parameter ρ): ρ is the number of low-order bits removed from the location identifier
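A rough sketch of these two mechanisms, assuming locations are integer region identifiers, that obfuscation drops the ρ low-order bits of the identifier, and that a fake location is injected at each time instant with probability φ, drawn either uniformly (u) or from an average mobility profile (g); the code is illustrative, not the Location-Privacy Meter implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def obfuscate(region_id, rho):
    """LPPM(ρ): drop the ρ low-order bits of the location identifier."""
    return (region_id >> rho) << rho

def fake_locations(phi, n_regions, steps, profile=None):
    """LPPM(φ): at each step inject a fake location with probability φ,
    drawn uniformly (u) or from the average mobility profile (g)."""
    fakes = []
    for _ in range(steps):
        if rng.random() < phi:
            fakes.append(int(rng.choice(n_regions, p=profile)))  # profile=None => uniform
        else:
            fakes.append(None)
    return fakes

print(obfuscate(23, rho=2))          # 20: regions 20..23 become indistinguishable
print(fake_locations(0.3, 40, 5))    # e.g. [None, 17, None, None, 8]
```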
11
Results – Anonymity
12
Results – Location Privacy
φ: the fake-location injection rate
13
More Results – Location Privacy
[Figure: location privacy under varying levels of obfuscation, fake injection with uniform selection, and hiding]
14
Conclusions & Future Work
– The effectiveness of 'Location-Privacy Preserving Mechanisms' cannot be evaluated independently of the 'Location-based Application' used by the users
– Fake-location injection is very effective for 'sporadic location exposure' applications: its advantage is no loss of quality of service, its drawback is more traffic exchange
– The 'Location-Privacy Meter' tool has been enhanced to model the applications and new protection mechanisms, notably fake-location injection
– Changing pseudonyms over time: to be added to our probabilistic framework
15
Location-Privacy Meter (LPM): A Tool to Quantify Location Privacy
http://lca.epfl.ch/projects/quantifyingprivacy