1 © 2011 Peter Hornyack. These Aren’t the Droids You’re Looking For: Retrofitting Android to Protect Data from Imperious Applications. Peter Hornyack, Seungyeop Han, Jaeyeon Jung, Stuart Schechter, David Wetherall. CCS, October 17-21, 2011.

2 Would you install this application?
 Android permission system:
 Permissions are requested by the application at install time
 The user must grant all permissions or cancel the installation
 App developers hold the power: they give users an ultimatum

3 Applications can’t be trusted
 Recent academic research corroborates these findings

4 What is the threat?
 Android applications that misappropriate the user’s privacy-sensitive data:
 Transmit sensitive data that the user intends the application to use only on-device
 Transmit sensitive data to third parties

5 Outline
 Measurement study of sensitive data usage
 AppFence: a defense against misappropriation of sensitive data
 Framework for evaluating impact on the user’s experience
 Evaluation of AppFence on 50 applications

6 What is “sensitive data”?
 We identified 12 types of privacy-sensitive data on Android: device ID, location, phone number, contacts, camera, accounts, logs, microphone, SMS messages, history & bookmarks, calendar, subscribed feeds

7 How can we tell what apps are doing?
 TaintDroid: dynamic taint tracking for Android applications [Enck10]

loc = getLocation();       // taint tag applied
...
loc_copy = loc;            // taint propagated
...
network_send(loc_copy);    // checked for taint
(example taken from William Enck, OSDI ’10)

 Gives us runtime detection of sensitive data transmission for unmodified apps
 Apps can’t transform, obfuscate, or encrypt data to remove taint
 We enhanced TaintDroid: added tracking for all 12 data types
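The tag-propagate-check flow above can be sketched in plain Java. This is a hypothetical illustration: the real TaintDroid propagates tags inside the Dalvik interpreter, with no wrapper class visible to the app; `Tainted`, `derive`, and `networkSend` are stand-in names.

```java
// Hypothetical sketch of TaintDroid-style taint tracking. The real system
// propagates tags inside the Dalvik VM; a wrapper class stands in for that here.
final class Tainted<T> {
    final T value;
    final int taint;                        // bit vector of taint tags

    Tainted(T value, int taint) { this.value = value; this.taint = taint; }

    // Any value derived from this one carries the same tags, so transforming
    // or encoding the data cannot strip its taint.
    <R> Tainted<R> derive(R derived) { return new Tainted<>(derived, this.taint); }
}

final class TaintDemo {
    static final int TAINT_LOCATION = 1 << 0;

    static Tainted<String> getLocation() {  // taint tag applied at the source
        return new Tainted<>("47.653,-122.306", TAINT_LOCATION);
    }

    // Network sink: refuses the send (returns false) when the payload is tainted.
    static boolean networkSend(Tainted<?> payload) {
        return payload.taint == 0;
    }

    public static void main(String[] args) {
        Tainted<String> loc = getLocation();
        Tainted<String> locCopy = loc.derive("enc:" + loc.value); // taint propagated
        System.out.println(networkSend(locCopy)); // false: tainted data detected
    }
}
```

Because the tag travels with every derived value, even the encoded copy is caught at the sink.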

8 Our study of sensitive data usage
 We performed an extensive study of sensitive data usage by Android apps
 110 popular free apps from the Android Market
 Selected to cover all 12 sensitive data types
 Manually executed each app for ~5 minutes
 Used TaintDroid to measure which types of sensitive data were sent out, and to which destinations

9 Do apps need my sensitive data?
 What we found for location data (110 apps): 73 apps had access to location; 45 apps transmitted it off the device; 30 apps sent it to third parties
 Of these 30 apps, 28 sent location only to third parties!
 It appears that some apps use sensitive data only for the purpose of sharing it with third parties

10 Could they be tracking me?
 What we found for unique device IDs (110 apps): 83 apps had access to the device ID; 31 apps transmitted it off the device; 14 apps sent it to third parties
 Just 3 third-party destinations: mobclix, flurry, greystripe
 Multiple apps send the device ID to the same third parties: the risk of cross-application profiling is real

11 What else do apps misappropriate?
 Two apps sent out the user’s phone number for no apparent reason except tracking:

Host: mobile.dilbert.com
Cookie: pn=12067084513; im=310410118469136

 A call-blocking app (Mr. Number) sent out the user’s entire contacts book, then asked the user to opt in
 Sensitive data intended only for on-device use may be sent off the device

12 Outline
 Measurement study of sensitive data usage
 AppFence: a defense against misappropriation of sensitive data
 Framework for evaluating impact on the user’s experience
 Evaluation of AppFence on 50 applications

13 How can we defend against these apps?
 Threat: applications may misappropriate users’ sensitive data
 We have a tool, TaintDroid, that can monitor unmodified applications at runtime
 Can we do something simple to unmodified applications to defend against this threat?
 Our system: AppFence

14 AppFence uses two privacy controls
 Two complementary privacy controls:
 Shadowing: the app doesn’t get sensitive data at all
 Blocking: the app gets sensitive data, but can’t send it out
[Diagram: Android supplies sensitive data to an unmodified application (data shadowing interposes here); the application sends data to external servers (exfiltration blocking interposes here)]

15 How data shadowing works
[Diagram] Without data shadowing: the unmodified application asks Android for the phone number, receives the true value (206) 555-4321, and sends it to analytics.com. With data shadowing: Android returns shadow data, (123) 456-7890, and the true phone number never reaches the application.

16 Three kinds of shadow data
 Blank data: e.g. contacts: {S. Han, 206-555-4321}  {}
 Fake data: e.g. location: {47.653, -122.306}  {41.887, -87.619}
 Constructed data: e.g. device ID = hash(app name, true device ID)
 Consistent for each application, but different across applications
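The constructed-data case can be illustrated with a short Java sketch. The slide specifies only device ID = hash(app name, true device ID); SHA-256 and the 16-hex-digit truncation are illustrative choices, not the deck's specification.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

final class ShadowDeviceId {
    // Constructed shadow data: a per-application device ID derived from
    // hash(app name, true device ID). Stable for one app across runs,
    // but unlinkable across different apps.
    static String shadowId(String appName, String trueDeviceId) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] d = md.digest((appName + "|" + trueDeviceId)
                    .getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 8; i++) sb.append(String.format("%02x", d[i]));
            return sb.toString();            // 16 hex digits, roughly IMEI-sized
        } catch (NoSuchAlgorithmException e) {
            throw new AssertionError(e);     // SHA-256 is mandatory on the JVM
        }
    }

    public static void main(String[] args) {
        String trueId = "310410118469136";
        // Same app gets the same shadow ID every time; different apps differ.
        System.out.println(ShadowDeviceId.shadowId("com.example.game", trueId));
        System.out.println(ShadowDeviceId.shadowId("com.example.news", trueId));
    }
}
```

The per-app consistency keeps single-app features (e.g. per-device settings) working, while the cross-app difference defeats the profiling shown on slide 10.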

17 How exfiltration blocking works
[Diagram] Without exfiltration blocking: the application obtains the true phone number (206) 555-4321 from Android and sends it to analytics.com. With exfiltration blocking: the tainted send is blocked, and the application observes the failure as if in airplane mode: no network available.
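A minimal sketch of the blocking path, under stated assumptions: AppFence actually interposes below the app at the platform's network interface, while this stand-in `send` method and its taint parameter are hypothetical. The key design point is that a blocked send fails with the same error an app would see with no connectivity, so the app handles it like ordinary airplane mode rather than a new, detectable failure.

```java
import java.io.IOException;

final class ExfilBlockDemo {
    static final int TAINT_PHONE_NUMBER = 1 << 1;

    // Stand-in for the interposed socket write: tainted payloads are rejected
    // with the same exception the app would see in airplane mode.
    static void send(String host, String payload, int taint) throws IOException {
        if (taint != 0) {
            throw new IOException("Network is unreachable"); // mimics airplane mode
        }
        // (a real implementation would write the payload to a socket for `host`)
    }

    public static void main(String[] args) {
        try {
            send("analytics.com", "pn=2065554321", TAINT_PHONE_NUMBER);
            System.out.println("sent");
        } catch (IOException e) {
            System.out.println("blocked: " + e.getMessage());
        }
    }
}
```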

18 Outline
 Measurement study of sensitive data usage
 AppFence: a defense against misappropriation of sensitive data
 Framework for evaluating impact on the user’s experience
 Evaluation of AppFence on 50 applications

19 What should we measure?
 Privacy controls may cause changes in application behavior
 We decided to measure the impact of AppFence on the user’s experience
 How can we measure this?
 Look for user-visible changes in application behavior: side effects

20 An example of a side effect
 We look for user-visible changes in application screenshots [screenshots omitted]

21 Framework for measuring side effects
 Automate application execution using an Android GUI testing program
 Converts a script of high-level commands (e.g. “press button,” “select from menu”) into GUI interactions
 Captures a screenshot after every command
 A human detects side effects by comparing screenshots taken with and without AppFence enabled

22 How we check for side effects
[Screenshots: Baseline vs. AppFence, with a Diff highlighting user-visible changes]
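The comparison step can be sketched as a simple pixel diff. This is a simplification under stated assumptions: in the study a human judged the diffs (the slides do not describe an automated comparator), since ads and other dynamic content change between runs even without AppFence.

```java
final class ScreenshotDiff {
    // Fraction of pixels that differ between a baseline screenshot and the
    // screenshot taken with AppFence enabled (both as ARGB pixel arrays).
    static double diffFraction(int[] baseline, int[] appfence) {
        if (baseline.length != appfence.length) {
            throw new IllegalArgumentException("screenshots must have equal size");
        }
        int differing = 0;
        for (int i = 0; i < baseline.length; i++) {
            if (baseline[i] != appfence[i]) differing++;
        }
        return (double) differing / baseline.length;
    }

    public static void main(String[] args) {
        int[] baseline     = {0xFF0000, 0x00FF00, 0x0000FF, 0xFFFFFF};
        int[] withAppfence = {0xFF0000, 0x00FF00, 0x000000, 0x000000};
        System.out.println(diffFraction(baseline, withAppfence)); // 0.5
    }
}
```

A near-zero fraction suggests "no side effect," while large differences flag screenshots for human classification into the categories on the next slide.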

23 Classifying applications
 We classified each application based on the side effects observed:
 None
 Ads absent
 Less functional
 Broken

24 Side effect: none [Screenshots: Baseline, AppFence, Diff]

25 Side effect: ads absent [Screenshots: Baseline, AppFence, Diff]

26 Side effect: less functional [Screenshots: Baseline, AppFence, Diff]

27 Side effect: broken [Screenshots: Baseline, AppFence, Diff]

28 Outline
 Measurement study of sensitive data usage
 AppFence: a defense against misappropriation of sensitive data
 Framework for evaluating impact on the user’s experience
 Evaluation of AppFence on 50 applications

29 Experiments
 Selected 50 apps that sent out sensitive data
 Wrote execution scripts for these apps
 Scripts exercise each app’s main features and the features likely to send out sensitive data (average: 24 commands)
 Enable one AppFence privacy control, execute all applications (~3 hours of computer time)
 Check screenshots for side effects and classify applications (~30 minutes of human time)

30 How did we configure privacy controls?
 To reveal the most side effects:
 Data shadowing of all sensitive data types
 Exfiltration blocking of all types to all destinations
 This imposes a policy on the app: sensitive data should never leave the device
 But don’t some apps have a legitimate need to send out data?

31 Side effects shown by 50 apps

Side effect      | Data shadowing | Exfiltration blocking | Least-disruptive control per app
None             | 28 (56%)       | 16 (32%)              | 30 (60%)
Ads absent       |  0 (0%)        | 11 (22%)              |  3 (6%)
Less functional  | 14 (28%)       | 10 (20%)              | 11 (22%)
Broken           |  8 (16%)       | 13 (26%)              |  6 (12%)

 Remember, we applied a single privacy control (one or the other) to all applications
 Slightly more than half of the apps ran with limited or no side effects
 Data shadowing was less disruptive than exfiltration blocking
 Choosing the control that caused the least-severe side effects for each app: 33 apps (66%) had no side effects or only absent ads
 We used profiling to choose; determining the best control in advance is challenging

32 So 34% of applications didn’t work?
 These apps had four kinds of functionality that directly conflict with our configuration (sensitive data should never leave the device):
 Location broadcast (location)
 Geographic search (location)
 Find friends (contacts)
 Cross-application gaming profiles (device ID)

33 What does this mean for AppFence?
 Some applications force the user to choose between functionality and privacy
 Protecting sensitive data will always cause side effects for these applications
 For the remaining apps, AppFence can prevent misappropriation without side effects
 Choosing the least-disruptive privacy control in advance is still an open problem
 Each control was less disruptive for certain sensitive data types

34 When to use data shadowing
 Data types such as device ID, location, and phone number:
 Aren’t presented directly to the user
 Must be transmitted off the device
 Example application behaviors:
 Device ID sent along with login information
 Location collected at application launch

35 When to use exfiltration blocking
 Data types such as contacts, SMS messages, and calendar:
 Are presented to the user on the device
 Don’t need to be transmitted off the device
 Example application behaviors:
 Selecting a contact to send a message to
 Adding reminders to the calendar

36 Conclusion
 AppFence breaks the power of the installation ultimatum
 We revealed side effects by never allowing sensitive data to leave the device
 Some apps: the user must choose between functionality and privacy
 Majority of apps: two privacy controls can prevent misappropriation without side effects

37 Questions? Source code and execution scripts available at: appfence.org


