
1 Trust – Jason Chalecki, Usable Privacy and Security, Spring 2006

2 Not much trust
- E-commerce sites: 29% trust either "just about always" or "most of the time"; 64% trust "only some of the time" or "never"
- Consumer advice sites: 33% trust; 59% report low levels of trust

3 An online problem?
- Small businesses: 68% trust
- Newspapers and television news: 58% trust
- Financial companies such as banks, insurance companies, and stockbrokers: 55% trust
- Charities and other nonprofit organizations: 54% trust
- Federal government: 47% trust at least most of the time

4 From "A Matter of Trust: What Users Want From Web Sites"

5 Lost or lacking trust
- Napster (2003): "Very long pauses between songs. I dropped the service and haven’t been back, even though, when it worked, I loved it."
- Jakob Nielsen (Alertbox, 1999): would probably love the eFax service, but didn’t sign up because he would be locked in.
- Amazon.com (1999): admitted that many favorable reviews had been paid for. The flawed policy was terminated, and the damage to the customer relationship was mended by an offer to refund any purchase that had been based on a paid recommendation.

6 Trust is fundamental to security
- Lack of trust results in systems being ill-used or not used at all
- Lack of understanding of trust results in wrong decisions or no decisions
- Too much trust can be more dangerous than too little, e.g. "I can open any file attachment because I run anti-virus software"

7 Fundamental questions
- How can trust be reliably represented in different interactions and interfaces?
- How can trust-based decisions be transformed into security decisions while preserving their meaning?
- What are the building blocks of trust?
- How is trust fallible?
- How can trust’s fallibility be addressed?

8 Definition
- "Assured reliance on the character, ability, strength, or truth of someone or something" (Merriam-Webster)
- Concerns a positive expectation regarding the behavior of somebody or something in a situation that entails risk to the trusting party (Patrick, Briggs, and Marsh)

9 Layers
- Dispositional trust: a psychological disposition or personality trait to be trusting or not
- Learned trust: a person’s general tendency to trust, or not to trust, as a result of experience
- Situational trust: basic tendencies adjusted in response to situational cues

10 Granularity
- I trust you
- I trust you this much
- I trust you this much to do this thing
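Purely as an illustration (not part of the original slides), these three granularities map naturally onto a small data record that captures who is trusted, how much, and for what. The field names and the 0.0–1.0 scale below are assumptions made for this sketch.

```python
# Illustrative sketch of the three granularities of trust as data.
# The field names and the 0.0-1.0 trust scale are assumptions made for
# this example; they are not part of the original slides.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TrustStatement:
    trustee: str                    # "I trust you"
    degree: Optional[float] = None  # "... this much" (0.0 to 1.0)
    scope: Optional[str] = None     # "... to do this thing"

statements = [
    TrustStatement("Alice"),                                    # unqualified trust
    TrustStatement("Alice", degree=0.8),                        # trust with a degree
    TrustStatement("Alice", degree=0.8, scope="sign my code"),  # degree and scope
]
for s in statements:
    print(s)
```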

11 Another axis
- Hard trust: grounded in technology
- Soft trust: grounded in social factors

12 Processing strategies
- Heuristic approach: making quick judgments from the obvious information
- Systematic approach: detailed analysis of the information

13 Credibility
- How is this different from trust?

14 Credibility
- How is this different from trust?
- Credibility is believability
- Trust is dependability

15 “Credibility and Computing Technology”
- Four types of credibility: presumed, reputed, surface, and experienced

16 Presumed credibility: belief based on general assumptions

17 Reputed credibility: belief based on third-party reports

18 Surface credibility: belief based on simple inspection

19 Experienced credibility: belief based on one’s own experience

20 “Credibility and Computing Technology”
- Four types of credibility: presumed, reputed, surface, and experienced
- How do these relate to the layers of trust?

21 Judgments of credibility
- Prominence: involvement of the user, topic of the web site, nature of the user’s task, user’s experience, individual differences
- Interpretation: assumptions in the user’s mind, skills and knowledge possessed by the user, context for the user

22 Time
- Initial trust
- Interactions
- Long-term trusted relationship

23 Trustworthiness
- Ability: capacity to keep promises
- Integrity: actually keeping promises
- Benevolence: acting in another’s best interest

24 Bhattacherjee’s Model (diagram)
- Familiarity, Trust, and Willingness to Transact, connected entirely by positive links: familiarity builds trust, which in turn increases willingness to transact

25 Lee, Kim, & Moon’s Model (diagram)
- Relates comprehensive information, shared value, communication, uncertainty, number of competitors, and specificity to trust, transaction cost, and customer loyalty through a mix of positive and negative links

26 Corritore’s Model (diagram)
- External factors shape perceptions of credibility, ease of use, and risk, which in turn determine trust

27 Egger’s Model (revised)

28 McKnight’s Model (diagram)
- A chain from Disposition to Trust, through Institution-Based Trust (perceptions of the Internet environment) and Trust Beliefs (perceptions of specific web vendor attributes), to Trusting Intentions (intention to engage in trust-related behaviors with a specific web vendor) and, finally, Trust-Related Behaviors

29 Riegelsberger’s Model (diagram)
- A truster and a trustee are separated in space and in time, each separation adding uncertainty; the trustee sends signals, the truster chooses between withdrawal (taking an outside option) and a trusting action, and the trustee then either fulfills or fails to fulfill

30 Models Comparison
- The models can be successfully operationalized, typically as questionnaires (see the sketch below)
- Common components of trust: ability, integrity, benevolence
- Many factors may affect trust
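As an illustration of what "operationalized into questionnaires" might look like in practice, here is a minimal sketch that averages Likert-scale items grouped by the three components above. The item wordings, the 7-point scale, and the equal weighting are assumptions for this example, not taken from any of the models discussed.

```python
# Illustrative sketch of operationalizing trust into a questionnaire score.
# The item wordings, the 7-point Likert scale, and the equal weighting are
# assumptions for this example, not taken from any specific model above.

from statistics import mean

# Example questionnaire items, grouped by the three components of trust.
ITEMS = {
    "ability":     ["The vendor is competent at what it does",
                    "The vendor has the skills to fulfill my order"],
    "integrity":   ["The vendor keeps its promises",
                    "The vendor is honest in its dealings"],
    "benevolence": ["The vendor acts in my best interest",
                    "The vendor cares about its customers"],
}

def trust_score(responses: dict[str, list[int]]) -> dict[str, float]:
    """Average 1-7 Likert responses per component, then overall."""
    scores = {component: mean(values) for component, values in responses.items()}
    scores["overall"] = mean(scores.values())
    return scores

# One respondent's answers, in the same order as the items above.
answers = {"ability": [6, 5], "integrity": [4, 5], "benevolence": [3, 4]}
print(trust_score(answers))
# {'ability': 5.5, 'integrity': 4.5, 'benevolence': 3.5, 'overall': 4.5}
```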

31 Trust Design Guidelines
1. Ensure good ease of use.
2. Use attractive design.
3. Create a professional image – avoid spelling mistakes and other simple errors.
4. Don’t mix advertising and content – avoid sales pitches and banner advertisements.
5. Convey a “real-world” look and feel – for example, with use of high-quality photographs of real places and people.
6. Maximize the consistency, familiarity, or predictability of an interaction, both in terms of process and visually.
7. Include seals of approval such as TRUSTe.
8. Provide explanations, justifying the advice or information given.
9. Include independent peer evaluation, such as references from past and current users and independent message boards.
10. Provide clearly stated security and privacy statements, and also rights to compensation and returns.
11. Include alternative views, including good links to independent sites within the same business area.
12. Include background information such as indicators of expertise and patterns of past performance.
13. Clearly assign responsibilities (to the vendor and the customer).
14. Ensure that communication remains open and responsive, and offer order tracking or an alternative means of getting in touch.
15. Offer a personalized service that takes account of each client’s needs and preferences and reflects its social identity.

32 Stanford Guidelines for Web Credibility
1. Make it easy to verify the accuracy of the information on your site.
2. Show that there's a real organization behind your site.
3. Highlight the expertise in your organization and in the content and services you provide.
4. Show that honest and trustworthy people stand behind your site.
5. Make it easy to contact you.
6. Design your site so it looks professional (or is appropriate for your purpose).
7. Make your site easy to use – and useful.
8. Update your site's content often (at least show it's been reviewed recently).
9. Use restraint with any promotional content (e.g., ads, offers).
10. Avoid errors of all types, no matter how small they seem.
Source: Stanford Persuasive Technology Lab, http://www.webcredibility.org/guidelines/

33 Jakob Nielsen’s Guidelines
- Design quality
- Up-front disclosure
- Comprehensive, correct, and current
- Connected to the rest of the Web
Source: "Trust or Bust: Communicating Trustworthiness in Web Design," Jakob Nielsen’s Alertbox, March 7, 1999, http://www.useit.com/alertbox/990307.html

34 Guidelines Comparison
- Professional appearance and ease of use are very important
- Be correct and verifiable
- Be part of a larger community

35 Microsoft, Users, and Trust

36 Trust Question Failings
- Often, the question being presented is a dilemma rather than a decision
- Computers behave in a purely logical way and so cannot help interpret emotional cues
- Users don’t want to deal with the trust issues presented to them
- Users don’t want to reveal personal data

37 User Behavior
- What users say they do and what they actually do often differ
- Users don’t necessarily want to think about the consequences of their behavior
- Users make one-off decisions about trust
- Users conceive of security and privacy issues differently than developers do
- Users have many superstitions about how viruses are propagated

38 Before XP SP2 (screenshot)

39 XP SP2 (screenshot)

40 Help for the “downloading” decision (screenshot)

41 Help for the “running” decision (screenshot)

42 Recommendations
- Let users make trust decisions in context
- Make the most trusted option the default selection
- Present users with choices, not dilemmas
- Always respect the user’s decision
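To make these recommendations concrete, here is a minimal, hypothetical sketch of a prompt that shows the decision in context, offers real choices rather than a yes/no dilemma, and pre-selects the safest option. The wording, option keys, and flow are illustrative assumptions, not Microsoft's actual XP SP2 design.

```python
# Hypothetical sketch of a trust prompt following the recommendations above.
# The dialog wording, option keys, and default choice are illustrative
# assumptions, not Microsoft's actual XP SP2 design.

from dataclasses import dataclass

@dataclass
class Option:
    key: str    # single-letter key the user types
    label: str  # description of the action

def prompt(context: str, options: list[Option], default: str) -> str:
    """Show a decision in context, offering choices with a safe default."""
    print(context)
    for opt in options:
        marker = " (default)" if opt.key == default else ""
        print(f"  [{opt.key}] {opt.label}{marker}")
    answer = input("Choice: ").strip().lower()
    # Respect an explicit, valid choice; otherwise fall back to the
    # safest option rather than forcing a yes/no dilemma.
    return answer if answer in {o.key for o in options} else default

choice = prompt(
    context='You clicked a link to "setup.exe" on example.com (publisher unknown).',
    options=[
        Option("s", "Save the file but do not run it"),
        Option("r", "Run the file now"),
        Option("c", "Cancel the download"),
    ],
    default="s",  # the most trusted (safest) action is pre-selected
)
print("User chose:", choice)
```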

