
Copyright © 1995-2006 Clifford Neuman - UNIVERSITY OF SOUTHERN CALIFORNIA - INFORMATION SCIENCES INSTITUTE USC CSci599 Trusted Computing Lecture Eight.



Presentation transcript:

1 USC CSci599 Trusted Computing
Lecture Eight – Negotiating Trust and Obligation
March 2, 2007
Dr. Tatyana Ryutov
University of Southern California, Information Sciences Institute

2 Policies in Trusted Computing
Three levels of policy in trusted computing:
– that enforced by the basic mechanisms
– that enforced by the outer rings / applications
– that which is determined when creating virtual systems

3 Policy of Obligation
Today we focus on the third:
– that which is used to determine which components can join a virtual system.
– In general, the policy enforced at this stage is based on whether the component can enforce the required policy of the second type (i.e., the application-specific policies).

4 How to Decide
How can we determine dynamically whether a component can meet the requirements?
– Negotiation of obligation
– Negotiation of credentials proving ability
– Based in part on trust in the components

5 Coverage of Topic
Today we discuss techniques for trust negotiation. We also discuss how trust can be assessed as we form virtual relationships. Next week we will step up a level and put some of these techniques into the context of the TVSA.

6 Current Trends
Most Internet interactions involve risk and uncertainty:
– lack of prior interactions
– insufficient information about participants
Distribution of risk between parties is often symmetric.
There is a shift from attempting to mitigate all potential risks to accepting threats as an intrinsic part of any open system, minimizing the risks by building trust.
– Trust is an antidote to perceived risk: it enables cooperation in the face of uncertainty.

7 Current Trends
Traditional identity-based trust relationships:
– do not scale: they require pre-registration and maintaining local accounts
– do not provide adequate information for trust decision making
▪ most interactions are between strangers
▪ decisions must be based on attributes other than identity
Attributes/credentials and policies may contain sensitive information and should be treated as protected resources.
– We need a way to decide when sensitive information should be disclosed to other parties.

8 Defining Trust: Some Questions
Trust is an individual's opinion (belief) about another entity that can evolve based on available evidence [Josang].
Trust is a decision to accept risk, faced with positive or negative outcomes of an interaction that depend on the actions of the other party.
– Complex
– Subjective
– Dynamic
– Multi-dimensional
– A computationally usable notion of trust requires a narrower definition.
What are the essential conditions for the existence of trust?
– the existence of both benevolent and malicious behavior
– the existence of risks involved in trusting the other party
Initial vs. continuing trust: is there any difference?

9 Defining Trust: Some Questions
How can we represent the core beliefs associated with trust (benevolence, honesty, competence, and predictability) [McKnight et al.]?
– Knowledge based: past experience, recommendations, knowledge about the entity's nature
– Behavioral aspect
▪ signs of good will (not self-serving or opportunistic)
– Predictions of future behavior
How can we establish trust?
– behavioral interdependency, a common objective
– negotiation to find out and agree on what constitutes benevolent/malicious behavior
– sanctions: enforcement of benevolent behavior
– reputation mechanisms

10 Trust and Reputation
Trust represents the trustor's subjective view of an entity's trustworthiness, whereas reputation represents an entity's reputation score as seen by the community.
Reputation (as well as trust) is context-specific.
Reputation systems:
– eBay feedback forum, Yahoo! Auctions, Amazon
– TRELLIS (ISI)
– based on social networks
Issues:
– low incentive for providing ratings
– change of identities
– dissemination

11 Modeling Trust in Virtual Organizations
Trust reflects the intention of a to accept risks based on positive expectations of the intentions of b to fulfill commitments (obligations).
Obligations represent a participant's commitment to provide a service under certain terms and conditions to other participants.
An agreement is an explicit declaration of the expected behavior defined in the participants' obligations, and of sanctions in the case of non-conformance to the obligations.
Suspicion level (distrust) specifies a means of revoking previously agreed trust based on observed behavior.
expectations: a → b
obligations: a ← b
expectations = obligations: a ↔ b
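The obligation/agreement/suspicion model above can be sketched in code. This is a minimal illustration, not an implementation from the lecture; the class and field names, the additive suspicion update, and the revocation threshold are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Obligation:
    """A participant's commitment to provide a service under stated terms
    and conditions to another participant."""
    provider: str
    consumer: str
    service: str
    terms: dict

@dataclass
class Agreement:
    """Explicit declaration of expected behavior (the obligations) plus
    sanctions in the case of non-conformance."""
    obligations: list
    sanctions: dict                                   # service -> sanction description
    suspicion: dict = field(default_factory=dict)     # participant -> distrust level

    def observe_violation(self, participant: str, weight: float = 1.0) -> None:
        # Observed misbehavior raises the suspicion (distrust) level;
        # the additive update is a hypothetical choice.
        self.suspicion[participant] = self.suspicion.get(participant, 0.0) + weight

    def trusted(self, participant: str, threshold: float = 1.0) -> bool:
        # Previously agreed trust is revoked once suspicion reaches the threshold.
        return self.suspicion.get(participant, 0.0) < threshold
```

The key design point from the slide is that trust is not static: the suspicion level gives the agreement a built-in mechanism for revoking trust as behavior is observed.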

12 Example: Modeling Trust in a Cyber Security Testbed Environment
Main question: whether a particular experiment should be admitted, and what the protection level should be.
A testbed is a risky and uncertain environment.
– Risks: malicious code may hurt the testbed, interfere with other experiments, or escape into the Internet.
– Sources of uncertainty:
▪ testing virulent code with unknown characteristics
▪ incomplete knowledge about the "maliciousness" of the code
▪ the ability and reliability of the investigators to provide accurate threat assessments
▪ subjectivity of judgment
We need to reason about uncertainty and risk to apply adequate protection.
– In the security-testbed context, trust reflects beliefs that an experiment will behave in a certain way (e.g., as specified in the experiment threat description).
– Without trust, an experiment must run in the highest protection mode, which increases the cost of the experiment and decreases testbed sharing.
– How can one determine the right balance between trust and required protection?

13 Consider as a TC Issue
We just discussed how to allow different testbed components to join an experiment. This is very similar to the kinds of issues in allowing a computing component to join a virtual system.

14 Approach: Trust Threshold
We employ a fine-grained risk/trust balancing approach.
To admit an experiment, a Trust Threshold (TT) must be reached. TT predicts an acceptable outcome. It consists of subjective (TS) and objective (TO) trust components: TT = TS / TO.
– Subjective (or perceived) trust encompasses two types of trusting attitudes:
– indirect trust
▪ trusting the investigator's ability to correctly predict code behavior due to perceived qualities of the investigator (e.g., reputation, skills, and profiles), or based on the history of prior interactions with the investigator;
▪ trusting code certifications by trusted third parties, e.g., a red team, or code vetted by other testbeds;

15 Approach: Trust Threshold (2)
– direct trust
▪ trusting the code: the belief that the code will behave as expected because, for example, this is a repeated experiment and the code has been tested before, or one is the creator of the code and is familiar with it.
– Objective trust: an intention to trust (run an experiment) formed due to the mechanisms and preventive measures that mitigate expected vulnerabilities introduced by the code, as well as unexpected threats caused by misbehaving code.

16 Balancing Trust & Risk
We associate risk with three dimensions:
– type of event: what can happen?
– severity of the potential harm: if it does happen, what are the consequences?
– likelihood of the event: how likely is it that the event will happen?
For each interaction, associate a set of risks of different types.
– For each of the defined risk types, we specify the corresponding likelihood and severity dimensions.

17 Subjective/Objective Trust Levels Balance Risk of Type i
The combination of the TS and TO trust levels for risk of type i attempts to predict an acceptable event e_i = ⟨l_i, s_i⟩ for which l_i ≤ L_i and s_i ≤ S_i.
There may be a choice of several objective/subjective trust levels that predict an acceptable event; the purpose of the trust negotiation is to reach a trust threshold TT_i.
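The acceptability condition above (likelihood and severity within the bounds L_i and S_i) can be sketched as a small check. The slides do not specify how TS and TO combine numerically with likelihood and severity, so the linear discounting below is purely an illustrative assumption, as are the [0, 1] scales and all names.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Risk:
    """A risk of type i: what can happen, how likely, how bad."""
    rtype: str
    likelihood: float   # l_i, estimated probability in [0, 1]
    severity: float     # s_i, estimated harm in [0, 1]

def acceptable(risk: Risk, max_likelihood: float, max_severity: float,
               ts: float, to: float) -> bool:
    """Predict whether e_i = <l_i, s_i> is acceptable: subjective trust (TS)
    is assumed to discount the estimated likelihood, objective trust (TO)
    (mechanisms and preventive measures) to mitigate severity, and the
    residual values must satisfy l_i <= L_i and s_i <= S_i."""
    residual_l = risk.likelihood * (1.0 - ts)   # hypothetical discounting model
    residual_s = risk.severity * (1.0 - to)
    return residual_l <= max_likelihood and residual_s <= max_severity
```

This also makes the slide's last point concrete: several (TS, TO) pairs can make the same event acceptable, and negotiation is about reaching any one of them.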

18 Trust Negotiation (TN)
Trust negotiation: the process of establishing trust between strangers in open systems based on the attributes of the participants.
TN supports access control on credentials and policies.
– It allows the requester of a service to decide whether to present sensitive credentials/policies to the service provider.
TN is iterative/recursive:
– the service provider might not want to divulge which credentials are needed until it has confidence in the requester;
– the service provider might be cautious about presenting certain credentials of its own.
Trust is established incrementally through a sequence of credential disclosures.

19 TN Example
Gradual, iterative, and mutual disclosure of credentials and policies:
– begin with credentials that are less sensitive;
– build up trust so that more sensitive credentials can be disclosed.
A wine collector and Mom's & Pop's wine store:
Collector: I want to buy the "Very Rare" bottle of wine.
Store: Show me your driver's license along with your credit card number, or your VIP member card.
Collector: Here is my driver's license certificate. I have a credit card, but prove you are a member of the Better Business Bureau first.
Store: Here is my Better Business Bureau certificate.
Collector: Here is my credit card number.
Store: Your order is placed.
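The wine-store exchange can be sketched as a round-based disclosure loop. This is an illustrative simplification, not any system's actual protocol: release policies are modeled as plain conjunctive lists (the "or your VIP member card" alternative from the dialogue is omitted), and all names are invented.

```python
# Each party's release policy: credential -> credentials it requires in
# return before disclosing it. An empty list means freely disclosed.
store_policy = {
    "very_rare_wine": ["drivers_license", "credit_card"],
    "bbb_certificate": [],               # the store discloses this freely
}
collector_policy = {
    "drivers_license": [],
    "credit_card": ["bbb_certificate"],  # released only after the BBB cert is seen
}

def negotiate(request, provider_policy, requester_policy, max_rounds=10):
    """Each round, every party releases any credential whose own release
    policy is satisfied by what it has already seen from the other side;
    trust is built incrementally until the request's policy is met."""
    shown_to_provider, shown_to_requester = set(), set()
    for _ in range(max_rounds):
        for cred, policy in provider_policy.items():
            if cred != request and set(policy) <= shown_to_provider:
                shown_to_requester.add(cred)
        for cred, policy in requester_policy.items():
            if set(policy) <= shown_to_requester:
                shown_to_provider.add(cred)
        if set(provider_policy[request]) <= shown_to_provider:
            return True   # the request's policy is satisfied: order placed
    return False          # all avenues exhausted within the round budget
```

Tracing the example: the store first discloses its BBB certificate (its policy is empty), which unlocks the collector's credit card; together with the driver's license this satisfies the policy on the bottle of wine.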

20 Trust Establishment Systems
– KeyNote, PolicyMaker (AT&T)
– REFEREE (MIT/AT&T)
– Trust Establishment Framework (IBM)
– TrustBuilder (BYU)
– Trust-X (Univ. of Milan)
– SECURE (Cahill et al.)

21 TrustBuilder
Modules:
– credential verification, policy compliance checker, and a negotiation strategy
Negotiation strategies:
– Naïve: disclose all credentials with each request for service.
– Trial and error: disclose all credentials that are not sensitive; disclose sensitive credentials after the required trust is established.
– Informed: disclose the relevant policy first, then disclose only the credentials necessary for a successful TN, based on the trust requirements within the policy.
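The difference between the three strategies can be made concrete with a small sketch of what each would disclose at one step. This is a simplified illustration of the strategy descriptions above, not TrustBuilder's API; every name and parameter here is hypothetical.

```python
def disclose(strategy, credentials, sensitive, policy_relevant, trust_established):
    """Return the set of credentials a requester would send under each
    TrustBuilder-style negotiation strategy (simplified sketch)."""
    if strategy == "naive":
        # Everything, on every request, regardless of sensitivity.
        return set(credentials)
    if strategy == "trial_and_error":
        # Non-sensitive credentials up front; sensitive ones only once
        # the required trust has been established.
        out = set(credentials) - set(sensitive)
        if trust_established:
            out |= set(sensitive)
        return out
    if strategy == "informed":
        # The policy was disclosed first, so only the credentials the
        # policy actually requires are sent.
        return set(credentials) & set(policy_relevant)
    raise ValueError(f"unknown strategy: {strategy}")
```

The informed strategy trades an extra round (fetching the policy) for disclosing the fewest credentials, which is why it tends to be the cheapest in sensitive-information terms.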

22 Limitations of Trust Negotiation
– Static, coarse-grained policies
– Vulnerability to denial of service (DoS)
– TN can leak sensitive information
– Expensive in terms of computation, network bandwidth, and number of rounds
▪ TN repeats until successful, or until all possible avenues are exhausted

23 Adaptive Trust Negotiation and Access Control (ATNAC)
ATNAC is based on:
– GAA-API, which provides adaptive access control that captures dynamically changing system security requirements
– TrustBuilder, which regulates when and how sensitive information is disclosed to other parties
ATNAC allows us to:
– detect and thwart certain attacks
▪ DoS attacks on the TN mechanism itself
▪ leakage of information through probing for policies
– support cost-effective trust negotiation
– dynamically adapt information disclosure and resource access policies according to the suspicion level and the general system threat level

24 GAA-API/TrustBuilder Integration
GAA-API manages:
– access control policies
– the suspicion level (SL), based on feedback from TrustBuilder
GAA-API asks TrustBuilder for the necessary credentials and passes the SL.
TrustBuilder manages:
– retrieval and verification of client and server credentials
– client and server release policies
– adjusting internal parameters based on the SL, e.g., time-outs (min-turnaround, size-of-message)

25 Suspicion Level (SL)
A separate SL is maintained for each requester (host or user). The SL is a vector that consists of:
– S_DoS: indicates suspicion of a DoS attack.
Example: CREDENTIAL_LIMIT_EXCEEDED is reported by TrustBuilder when the user sends a large number of credentials.
▪ TrustBuilder adjusts the timeout for each requester.
▪ On the GAA-Apache server, restrict the minimum time between consecutive requests from the same user.
– S_IL: attributed to sensitive-information leakage attempts.
Example: ROLE_EXPRESSION_UNSATISFIABLE occurs when the requester cannot satisfy the TrustBuilder policy. If the alert appears a few times in a row, it could indicate an attempt to probe the server policy. ATNAC responds by increasing S_IL.
– S_O: indicates other suspicious behavior.
Example: CERTIFICATE_OWNERSHIP_ERROR occurs when a requester cannot prove ownership of the corresponding private key; possibly this is an attempt to use a stolen certificate.
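The per-requester suspicion vector and its feedback into negotiation parameters can be sketched as follows. The alert names come from the slide; the numeric weights, the additive updates, and the timeout formula are illustrative assumptions, not ATNAC's actual rules.

```python
from collections import defaultdict

# Map TrustBuilder-style alerts to the suspicion-vector component they raise.
# Weights are hypothetical.
ALERT_COMPONENT = {
    "CREDENTIAL_LIMIT_EXCEEDED": ("S_DoS", 1.0),      # suspected DoS
    "ROLE_EXPRESSION_UNSATISFIABLE": ("S_IL", 0.5),   # suspected policy probing
    "CERTIFICATE_OWNERSHIP_ERROR": ("S_O", 1.0),      # other suspicious behavior
}

class SuspicionTracker:
    """Maintains a separate SL = (S_DoS, S_IL, S_O) per requester."""

    def __init__(self):
        self.sl = defaultdict(lambda: {"S_DoS": 0.0, "S_IL": 0.0, "S_O": 0.0})

    def report(self, requester, alert):
        # Feedback from the negotiation layer raises one SL component.
        component, weight = ALERT_COMPONENT[alert]
        self.sl[requester][component] += weight

    def min_request_interval(self, requester, base=1.0):
        # Illustrative adaptation: DoS suspicion lengthens the minimum
        # time between consecutive requests from the same requester.
        return base * (1.0 + self.sl[requester]["S_DoS"])
```

The point of keeping SL a vector rather than a single score is that each component can drive a different countermeasure: S_DoS feeds rate limiting, while S_IL can tighten the policy-disclosure rules instead.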

