Trust Online and the Phishing Problem: why warnings are not enough M. Angela Sasse (based on work by Iacovos Kirlappos, Katarzyna Krol, Matthew Moroz)

1 Trust Online and the Phishing Problem: why warnings are not enough M. Angela Sasse (based on work by Iacovos Kirlappos, Katarzyna Krol, Matthew Moroz) Department of Computer Science & SECReT Doctoral Training Centre, UCL 22/09/2011

2 Outline Basics of trust 2 lab studies on an anti-phishing tool and security warnings … which explain why current signals don’t work What can we do? –Design –Communication to user

3 What is trust? Trust is only required in the presence of risk and uncertainty “… willingness to be vulnerable, based on positive expectations about the actions of others” M. Bacharach & D. Gambetta 2001. Trust as Type Detection. In: Castelfranchi, C. & Tan, Y. Trust and Deception in Virtual Societies.

4 Why? Economic Benefits

5 Ignore these at your peril … trust = split-second assessment, rather than thorough risk analysis and assurance reliance = after several successful transactions, no perceived vulnerability = fraction of a split-second assessment

6 How do we decide when to trust? People assess the transaction partner’s ability and motivation [Deutsch, 1956] We look for cues (trust signals) that indicate these This assessment can be based on –cognitive elements (rational) –affective reactions (pre-cognitive)

7 TRUSTEE TRUSTOR

8 1 Signals TRUSTEE TRUSTOR

9 Outside Option 1 Signals TRUSTEE TRUSTOR 2a Trusting Action 2b Withdrawal RISK

10 Outside Option 1 Signals TRUSTEE TRUSTOR 2a Trusting Action 2b Withdrawal 3a Fulfilment 3b Defection RISK
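The three stages in the diagrams above (signals, then trusting action or withdrawal, then fulfilment or defection) can be sketched as a tiny decision function. This is an illustrative sketch only; the names (TrustorMove, trust_interaction, and so on) are mine, not the slides’:

```python
from enum import Enum

class TrustorMove(Enum):
    TRUSTING_ACTION = "trusting action"   # 2a: accept vulnerability, act on the signals
    WITHDRAWAL = "withdrawal"             # 2b: take the outside option instead

class TrusteeMove(Enum):
    FULFILMENT = "fulfilment"             # 3a: honour the trust placed
    DEFECTION = "defection"               # 3b: exploit the trustor

def trust_interaction(signals_convincing: bool, trustee_is_trustworthy: bool):
    """One round of the signalling game sketched on the slides.

    1. The trustee emits trust signals.
    2. The trustor decides, under risk, whether to act on them.
    3. Only if the trustor commits does the trustee get to fulfil or defect.
    """
    if not signals_convincing:
        # Outside option: no risk taken, but also no benefit for either party.
        return (TrustorMove.WITHDRAWAL, None)
    trustor = TrustorMove.TRUSTING_ACTION
    trustee = (TrusteeMove.FULFILMENT if trustee_is_trustworthy
               else TrusteeMove.DEFECTION)
    return (trustor, trustee)
```

A phisher is the mimicry case: convincing signals followed by defection, i.e. `trust_interaction(True, False)`. The point of the later slides follows directly: the trustor only ever observes the signals, so everything hinges on how hard those signals are to fake.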

11 Dis-embedding Interaction is stretched over time and space and involves complex socio-technical systems [Giddens, 1990] … pervasive in modern societies (e.g. catalogue shopping) So – what’s so special about trust online? Increased risk –Privacy (more data required) –Security (open system) –Own ability (errors) Increased uncertainty –Inexperienced with decoding cues –Fewer surface cues available –Traditional cues no longer useful J. Riegelsberger, M. A. Sasse, & J. D. McCarthy: The Mechanics of Trust. Int J of Human-Computer Studies 2005.

12 Study 1: phishing Passive phishing indicators (Spoofstick etc.) have limited effect –Users don’t look at indicators –Users don’t know what indicators mean –Require users to disrupt their main task –Time-consuming and error-prone R. Dhamija et al.: Why Phishing Works. Procs ACM CHI 2006 Schechter et al.: The Emperor’s New Security Indicators IEEE Security & Privacy 2007

13 Are active anti-phishing tools better? Example: SOLID by First Cyber Security Traffic Light approach: –Passive indicator when no risk exists –Becomes active when a risk is identified

14 Safe Website Green

15 “Extreme Caution” Shows up only when the website a user attempts to visit is certainly unsafe Presents three options: ― Redirection to the authentic website (default option) ― Close the window ― Proceed to the risky site

16 Results – Active Warning “Extreme Caution” window resulted in 17 out of 18 participants visiting the genuine website. –Clear information –Right timing –Context-specific Safe default is important. –Users clicked “OK” without fully understanding the meaning of the message they had been presented with –They were redirected to the genuine website

17 Results – did they still take risks? Tool reduced the number of participants taking risks, but some still took risks.

Potential payoff   Control   SOLID
£10                5         10 (green)
£35-40             12        8 (grey/yellow)
£20                1         0 (grey)

18 Why do users ignore the recommendation? Price = main factor for ignoring the tool → Need And Greed Principle (Stajano & Wilson: Understanding Scam Victims. Comm ACM March 2011) General advice like “If it is too good to be true, it usually is” doesn’t work

19 “I know better …” Participants believe they can rely on their own ability to identify scam websites, and ignore the tool Past experience with high false-positive rates creates a negative attitude towards security indicators Cormac Herley: security tools/advice offering a poor cost-benefit will be rejected by users C. Herley: So Long, And No Thanks for all the Externalities. Procs NSPW 2009

20 Other trust cues Perceived familiarity (reliance) Mentioning other entities – Facebook and Twitter logos Ads – “Why would anyone pay to advertise on a dog site?”, mention of charities Lots of info, privacy policies, and good design

21 Symbols of trust arbitrarily assigned meaning, specifically created to signify the presence of trust-warranting properties must be difficult to forge (mimicry), with sanctions in the case of misuse expensive: –trustors have to know about their existence and how to decode them –trustees need to invest in emitting them and in getting them known

22 Symptoms of trust not specifically created to signal trust-warranting properties – rather, by-products of the activities of trustworthy actors e.g. trustworthy online retailer has large customer base, repeat business exhibiting symptoms of trust incurs no cost for trustworthy actors, whereas untrustworthy actors would have to invest effort to mimic those signals

23 Study 2: PDF warnings Most common file types in targeted attacks in 2009. Source: F-Secure (2010)

24 The experiment Two conditions: between-subjects design Participant task: reading two articles and evaluating their summaries –choosing the first article: no warning –choosing the second article: a warning shown with each article the participant tried

25 General results 120 participants (64 female, mean age 25.7)

Warning type   Downloaded   Refused
Generic        52           8
Specific       46           14
∑              98           22

χ² = 1.391, p = 0.238, df = 1
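The reported χ² is consistent with a 2×2 chi-square test using Yates’ continuity correction. As a check, a minimal stdlib-Python sketch (the function name yates_chi_square is my own) reproduces the figures from the download counts:

```python
from math import erfc, sqrt

def yates_chi_square(table):
    """Chi-square test for a 2x2 contingency table with Yates'
    continuity correction; returns (chi2, p) for df = 1."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    chi2 = 0.0
    for i in (0, 1):
        for j in (0, 1):
            expected = row[i] * col[j] / n
            chi2 += (abs(table[i][j] - expected) - 0.5) ** 2 / expected
    # Survival function of the chi-square distribution with 1 df
    p = erfc(sqrt(chi2 / 2))
    return chi2, p

# Generic warning: 52 downloaded, 8 refused; specific: 46 downloaded, 14 refused
chi2, p = yates_chi_square([[52, 8], [46, 14]])
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # chi2 = 1.391, p = 0.238
```

With p well above 0.05, warning wording (generic vs. specific) made no significant difference to download rates, which is the slide’s point.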

26 Gender differences Women were more cautious and less likely to download an article with a warning

           Download   Refusal
Male       50         6
Female     48         16

χ² = 4.071, p = 0.044, df = 1

27 Eye-tracking data Fixation time in seconds –By warning type: 6.13 for generic warnings, 6.33 for specific warnings –By subsequent reaction: 6.94 for those who subsequently refused to download, 5.63 for those who subsequently downloaded the article No significant difference in fixation length: all participants were fairly attentive to the warning regardless of the text, but took different decisions

28 Hypothetical vs. observed behaviour

Generic warning   Download   Refusal to download
Hypothetical      52         8
Self-observed     41         19
χ² = 6.039, p = 0.014

Specific warning   Download   Refusal to download
Hypothetical       46         14
Self-observed      13         47
χ² = 36.31, p < 0.0001

29 Reasons for ignoring warning Desensitisation (55 participants): past experience of false positives

30 Reasons for ignoring warning Trusting the source (29) “It depends on what the source was, if I was getting it from a dodgy website, I probably wouldn’t download it. But if something was sent to me by a friend or a lecturer or I was downloading it from a library catalogue, I would have opened it anyway.”

31 Reasons for ignoring warning Trusting anti-virus (18) “I trusted that the anti-virus on my computer would pick anything up.” Trusting PDF (15) “I don’t think PDF files can have this kind of harm in them. It says ‘PDF files can harm your computer’ and I know they can’t.”

32 Why security warnings don’t work Warnings are unreliable and badly designed –more noise than signal –interrupt users’ primary task –pop-ups are associated with adverts and updates = ANNOYING!!! Users have misconceptions: –about risks and indicators –about their own competence

33 Conclusions: What can be done? 1.Re-design the interaction: eliminate choice, automatically direct users to safe sites 2.More effective trust signalling: develop symptoms of trust and protect symbols better 3.Get rid of useless warnings 4.Better communication about risks, correct misconceptions about trust signals

34 Good Human Factors – by a security person 1.The system must be substantially, if not mathematically, undecipherable; 2.The system must not require secrecy and can be stolen by the enemy without causing trouble; 3.It must be easy to communicate and remember the keys without requiring written notes, it must also be easy to change or modify the keys with different participants; 4.The system ought to be compatible with telegraph communication; 5.The system must be portable, and its use must not require more than one person; 6.Finally, regarding the circumstances in which such a system is applied, it must be easy to use and must neither require stress of mind nor the knowledge of a long series of rules. Auguste Kerckhoffs, ‘La cryptographie militaire’, Journal des sciences militaires, vol. IX, pp. 5–38, Jan. 1883, pp. 161–191, Feb. 1883.

