1 Usability and Security – Why we need to look at the big picture
M. Angela Sasse, Professor of Human-Centred Technology
Department of Computer Science, University College London, UK
a.sasse@cs.ucl.ac.uk
www.ucl.cs.ac.uk/staff/A.Sasse
2 Acknowledgements
My Doctoral Students: Anne Adams, Dirk Weirich, Sacha Brostoff, Ivan Flechais
Marek Rejmann-Greene (BT Exact)
BIOVISION (EU Roadmap Project)
UK e-Science Programme
German Federal Office for Information Security
3 Overview
Yes, security needs usability
Beyond UI design: changing user behaviour
Organisational factors
– Designing and maintaining security culture
– Taking responsibility
Changing the development process
– Involving all stakeholders
– Changing the design/development process
5 Human Memory
Limited capacity
Decays over time (items cannot be recalled at all, or not 100% correctly)
Frequent recall improves memorability
Unaided recall is harder than recognition
Non-meaningful items are much harder to recall than meaningful ones
Similar items are easily confused
Items linger – we cannot "forget on demand"
6 Password Systems
Require unaided recall
Entry must be 100% correct
Not meaningful
Many similar items compete
– Frequent changes
– Proliferation of passwords and PINs (banking, phones, websites)
(A typical policy is sketched below.)
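To make the mismatch with human memory concrete, here is a minimal sketch of the kind of composition and re-use rules many password systems enforce. The policy parameters (length, history depth) and the function name are hypothetical, chosen only for illustration, not taken from the talk.

```python
import re

# Hypothetical policy parameters, for illustration only.
MIN_LENGTH = 10
HISTORY_DEPTH = 12   # may not reuse any of the last 12 passwords

def check_password(candidate: str, previous: list[str]) -> list[str]:
    """Return a list of policy violations; an empty list means 'accepted'."""
    problems = []
    if len(candidate) < MIN_LENGTH:
        problems.append(f"shorter than {MIN_LENGTH} characters")
    if not re.search(r"[A-Z]", candidate):
        problems.append("no upper-case letter")
    if not re.search(r"[0-9]", candidate):
        problems.append("no digit")
    if not re.search(r"[^A-Za-z0-9]", candidate):
        problems.append("no punctuation character")
    if candidate in previous[-HISTORY_DEPTH:]:
        problems.append("reuses a recent password")
    return problems

# Every rule above pushes users towards strings that are non-meaningful,
# similar to previous ones, and must be recalled unaided and 100% correctly.
print(check_password("summer2024", previous=[]))
# -> ['no upper-case letter', 'no punctuation character']
```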
7 So users write them down …
8 Consequences of unusable password systems
Cost of secure re-setting (help desks) is high (a rough cost sketch follows below)
Security undermined by
– cheap reset techniques (reminders)
– user workarounds
Organisations where everybody has password problems are vulnerable to social engineering attacks
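As a rough, back-of-the-envelope illustration of the cost point above (every figure here is an assumption made for the sketch, not data from the talk), assisted resets add up quickly across an organisation:

```python
# All figures below are illustrative assumptions, not data from the talk.
employees = 5000
resets_per_employee_per_year = 4     # assumed average number of assisted resets
cost_per_assisted_reset = 25.0       # assumed fully loaded help-desk cost per call, in pounds

annual_reset_cost = employees * resets_per_employee_per_year * cost_per_assisted_reset
print(f"Assumed annual cost of secure resets: £{annual_reset_cost:,.0f}")
# -> Assumed annual cost of secure resets: £500,000
```

This is also why organisations drift towards the cheap reset techniques the slide warns about: they remove the help-desk cost but undermine the security the resets were meant to protect.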
9 Research challenges for authentication
Different mechanisms for frequently and infrequently used passwords
Make mechanisms more forgiving
– Replace "all or nothing" entry with a more forgiving mechanism
– Provide feedback and instructions
– Cued rather than unaided recall (see the sketch below)
– Keep password changes to a minimum, while making it sufficiently secure (including spouse-proof), AND providing universal access
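The bullets on cued recall and forgiving mechanisms could look something like the following minimal sketch. The enrolment/login functions, the user-chosen cue field, and the five-attempt limit are assumptions for illustration, not a description of any deployed mechanism.

```python
import hashlib
import hmac
import os

# Minimal sketch of a "more forgiving" login flow: a user-chosen cue supports
# cued rather than unaided recall, feedback is explicit, and several attempts
# are allowed before falling back to a secure reset path.

def _hash(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def enrol(password: str, cue: str) -> dict:
    salt = os.urandom(16)
    return {"salt": salt, "hash": _hash(password, salt), "cue": cue}

def login(record: dict, max_attempts: int = 5) -> bool:
    print(f"Your cue: {record['cue']}")            # cued, not unaided, recall
    for attempt in range(1, max_attempts + 1):
        candidate = input("Password: ")
        if hmac.compare_digest(_hash(candidate, record["salt"]), record["hash"]):
            return True
        print(f"Not recognised ({max_attempts - attempt} attempts left).")
    print("Please use the secure reset procedure instead.")   # graceful fallback
    return False

# Usage (interactive):
#   record = enrol("correct horse battery staple", cue="the xkcd phrase")
#   login(record)
```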
10 Is biometrics the answer?
Potential to reduce mental workload
But:
– No single biometric provides universal access
– False rejection rates are still high (see the worked example below)
– Current equipment has a raft of usability issues
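To see why false rejection rates matter operationally, here is a small worked example; the user population, verification frequency, and the 3% FRR are assumed figures, not measurements cited in the talk.

```python
# Worked example of the operational impact of a false rejection rate (FRR).
# The population, usage frequency, and FRR below are assumed figures.
users = 2000
verifications_per_user_per_day = 3
false_rejection_rate = 0.03

expected_rejections_per_day = users * verifications_per_user_per_day * false_rejection_rate
print(f"Expected false rejections per day: {expected_rejections_per_day:.0f}")
# -> Expected false rejections per day: 180
```

Each of those rejections is a legitimate user locked out of their production task, which is exactly the usability cost the following slides illustrate.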
11 Fingerprint readers
12 Which finger was it again?
13 Height adjustment
14 Iris – difficulty focussing
15 Where do I stand?
16 Where am I supposed to look?
17 Mental Models/Metaphors
"Why Johnny Can't Encrypt" [Whitten & Tygar, Procs USENIX 1999]
UI problems:
– User tasks not represented
– Misleading labels
– Lack of feedback
The problem lies deeper: "key" cues the wrong mental model, and the usage of "public"/"private" does not match the everyday use of these words (illustrated in the sketch below)
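A minimal sketch of the point about "public" and "private" (using the third-party cryptography package, assumed to be installed): to keep a message confidential you use the key the recipient shares with everyone, which is roughly the opposite of what the everyday word "public" suggests.

```python
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The recipient generates a key pair and publishes the public half to everyone.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Counter-intuitively, the *public* key is what keeps the message confidential...
ciphertext = public_key.encrypt(b"meet at noon", oaep)

# ...and only the matching *private* key can read it back.
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"
```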
18 Research Challenges: simple concepts or metaphors
Move from "shorthand" metaphors for the security community to metaphors that work for a wider user base
Adopt/adapt conceptual design approach to make security concepts & tools more accessible
– Identify suitable metaphors
– Engineer system and discourse to communicate these
19 User Knowledge
"Users are not the Enemy" [Adams & Sasse, 1999]: employees' and managers' knowledge about security is sketchy
Replicated by Fitzgibbons et al. [Univ. of Colorado, 2003]
"Namedropping" of security concepts and unwarranted assumptions among software developers [Flechais & Sasse, 2003]
User education about security is needed
20 User knowledge of security
"You know, if you think about, who's actually going to go through all that struggle to hack a departmental computer science account of some academic at xxxx college. It's not like NASA or anything, nothing of interest."
"What would make it more likely?"
Answer: "Maybe if I was more famous, or…" [laughs]
Weirich & Sasse, Procs NSPW 2003
21 "Adams & Sasse propose that educating users in security is a solution for the problem of choosing weak passwords. They claim that if users receive specific security training and understand security models, they will select secure passwords and refrain from insecure behaviour. In our study, however, we discovered that the level of security training did not prevent users from choosing trivial passwords and refrain from engaging in insecure behaviour." [Dhamija & Perrig, 2001]
22 Education vs. Training
Education is not one-way information transfer
Aim of training: change behaviour
Form good habits and change bad ones:
– Check to establish correctness, and provide feedback
– Repeat and reinforce sufficiently often to form a habit
– Check again after a certain time to ensure desired behaviours have been established, and have the desired effect
23 Changing User Behaviour
More than increasing user knowledge: changing user behaviour beyond their interaction with the system
– Motivation
– Persuasion
– Social norms
24 Perceptions of, and attitudes to, security
Weirich & Sasse, 2001
"How would you describe a person who cares about security?"
25 "People who would want to be more secure. I don't know. That's really a question for psychologists. What sort of people keep their desks tidy. What sort of people comb their hair in the morning."
"People therefore who are obedient. People who follow the crowd."
26 “So, you could probably be changing your password every week, for no obvious reason apart from your paranoia, whereas I am not terribly paranoid about this sort of thing.”
27 Research Challenges: Perceptions & Attitudes
Ways of persuading and motivating users to be secure
– Appeal to self-interest: link security to goals that matter to people
– Economic impact
– Part of professional and ethical conduct
– Make threats believable, appear real
Changing the image of security
– Social marketing: role models, "it's cool to be secure"
– Persuasive technology (based on Fogg, 2003)
– Can we make security fun?
28 Task Factors
For most people, most of the time, security is an enabling task for one or more production tasks
Enabling tasks are perceived as "hurdles" if their relevance to the production task is not clear
It is human nature to take short-cuts, especially when workload is felt to be high
29 "We in engineering like to leave things fairly unprotected so we can go and access other people's directories, so if the people I'm working with are changing files, I can work with their latest revisions."
From Fitzgibbons et al., Univ. of Colorado, 2003
30 Example: Passfaces
Good recall rates even after long periods of non-use (90+ % after 3 months)
But: in a field trial, Passfaces users logged in only 30% as often as password users
Brostoff & Sasse, Procs HCI 2000
(A sketch of the recognition-based challenge follows below.)
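A simplified sketch of a Passfaces-style recognition challenge, as referenced above: instead of recalling a string unaided, the user recognises one of their enrolled faces in each small grid of decoys. The grid size, round structure, and placeholder face identifiers are assumptions for illustration, not the product's actual parameters.

```python
import random

# Simplified sketch of a Passfaces-style recognition challenge.

def build_challenge(enrolled: list[str], decoy_pool: list[str],
                    grid_size: int = 9) -> list[tuple[list[str], str]]:
    """One round per enrolled face: a shuffled grid of decoys plus the target."""
    rounds = []
    for target in enrolled:
        decoys = random.sample([d for d in decoy_pool if d != target], grid_size - 1)
        grid = decoys + [target]
        random.shuffle(grid)
        rounds.append((grid, target))
    return rounds

def verify(rounds: list[tuple[list[str], str]], selections: list[str]) -> bool:
    """Succeed only if the user picked their own face in every round."""
    return all(choice == target for (_grid, target), choice in zip(rounds, selections))

# Example with placeholder face identifiers:
enrolled = ["face_017", "face_042", "face_105", "face_211"]
decoy_pool = [f"face_{i:03d}" for i in range(300) if f"face_{i:03d}" not in enrolled]
challenge = build_challenge(enrolled, decoy_pool)
print(verify(challenge, [target for _grid, target in challenge]))  # -> True
```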
31 Design for production tasks
If security competes with production tasks, it will be eliminated or circumvented whenever possible
Security behaviour must fit production tasks:
– No competing demands on user resources (physical/mental workload)
– Cost of keeping out legitimate users: viable contingencies
– Performance criteria: speed, errors
32 Research Challenges: Task design
Develop security mechanisms that can be configured to match requirements of production tasks
Support individuals and organisations to identify and deal with competing goals
Resolve locally vs. globally optimal solutions
33 Physical Context
Outdoor usage is different
– Lighting, pollution, temperature, noise, etc.
– Limit performance, especially of novel complex mechanisms (such as biometrics)
Mobile and handheld systems
– Physical ease of use
Nomadic use of devices and networks
– Creates new threats
34 Research Challenges: Physical Context
Develop usable & secure mechanisms for interaction with pervasive/ubiquitous systems
– Specific?
– General?
– Multiple?
Consider implications for
– physical & mental workload
– economic viability
35 Organisational Context
Security culture
– "Do as I say, not as I do"
– Being able to violate "petty" security regulations is a badge of seniority
Link security into business goals
Design of specific (goal- and risk-based) security policies that are enforced, and are seen to be enforced
36 Research Challenges: Organisations & Security
Integrate security into the organisation & business model
– Socio-technical design approach
– Adapt safety-critical design approaches [e.g. Reason 1990]
– Apply risk analysis and economic principles to decision-making about security (see the sketch below)
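One way to read the last bullet is through a standard annualised loss expectancy (ALE) comparison. The incident rates, loss figure, and control cost below are assumed values chosen only to make the arithmetic concrete; they are not figures from the talk.

```python
# Sketch of applying risk analysis to a security decision via annualised loss
# expectancy (ALE = expected incidents per year x loss per incident).
# All figures are assumed values chosen to make the arithmetic concrete.

def ale(incidents_per_year: float, loss_per_incident: float) -> float:
    return incidents_per_year * loss_per_incident

baseline     = ale(incidents_per_year=0.5, loss_per_incident=200_000)  # £100,000 / year
with_control = ale(incidents_per_year=0.1, loss_per_incident=200_000)  # £20,000 / year
control_cost = 30_000   # assumed annual cost of the control, including its usability impact

net_benefit = (baseline - with_control) - control_cost
print(f"Net annual benefit of adopting the control: £{net_benefit:,.0f}")
# -> Net annual benefit of adopting the control: £50,000
```

Folding the usability impact (lost productivity, help-desk load, workarounds) into the control's cost is what ties this kind of economic reasoning back to the rest of the talk.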
37 Cultural Context: Trust
Social norms – key: trust
– People want to trust, and be trusted, but this can create risks [e.g. Mitnick, 2002]
– But low-trust systems are expensive to run [Handy, 1985] and do not allow the building of social capital
– May be counterproductive, since people are the only defence against novel attacks
38 Research Challenges: Social Norms
Identify social norms that may interfere with desired security behaviour
Trust is a key norm
Create a clear conceptual basis for the role of trust in security systems
39 Conclusions
Usable and effective security needs a systemic approach
Security technology, and improving user interfaces to that technology, is by itself not the answer.
40 Any Questions? contact: a.sasse@cs.ucl.ac.uk www.ucl.cs.ac.uk/staff/A.Sasse