
Slide 1: Trust and Privacy in Authorization
Bharat Bhargava, Yuhui Zhong, Leszek Lilien
CERIAS Security Center, CWSA Wireless Center, Department of CS and ECE, Purdue University
Supported by NSF IIS 0209059 and NSF IIS 0242840

Slide 2: Applications / Broad Impacts
Guidelines for the design and deployment of security-sensitive applications in next-generation networks:
– Data sharing for medical research and treatment
– Collaboration among government agencies for homeland security
– Transportation systems (travel security checks, hazardous material disposal)
– Collaboration among government officials, law enforcement, security personnel, and health care facilities during bio-terrorism and other emergencies

Slide 3: Trust-Based Authorization
Authorization based on:
– Role-Based Access Control model
– Uncertain evidence
– Dynamic trust
Authorization process considering:
– Tradeoff between privacy and trust

Slide 4: A. Trust-Based Authorization
Problem
– Dynamically establish and maintain trust among entities in an open environment
Research directions
– Handling uncertain evidence
– Modeling dynamic trust
Challenges
– Uncertain information complicates inference
– Subjectivity leads to varying interpretations of the same information
– Trust is multi-faceted and context-dependent; trust modeling therefore requires a tradeoff between representational comprehensiveness and computational simplicity

Slide 5: Uncertain Evidence
Evaluating the uncertainty of a role-assignment policy given a set of uncertain evidence
Probability-based approach:
– Atomic formulas: Bayesian network + causal inference + conditional-probability interpretation of opinions
– AND/OR expressions: rules [Jøsang '01]
– Subjectivity handled by the discounting operator [Shafer '76]
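The discounting operator mentioned above can be illustrated with subjective-logic opinions in the style of Jøsang, where an opinion is a triple (belief, disbelief, uncertainty) summing to 1. This is a minimal sketch, not the slide's actual implementation; the triple representation and the specific operand values are assumptions for illustration.

```python
def discount(referrer, referred):
    """Discount a second-hand opinion by the trust placed in its source.

    Each opinion is a (belief, disbelief, uncertainty) triple with
    b + d + u = 1.  The referred opinion is weighted by the referrer's
    belief; the referrer's own disbelief and uncertainty flow into the
    uncertainty of the result, so the triple still sums to 1.
    """
    b1, d1, u1 = referrer
    b2, d2, u2 = referred
    return (b1 * b2, b1 * d2, d1 + u1 + b1 * u2)

# Example: we trust the referrer strongly, the referrer trusts the target.
combined = discount((0.8, 0.1, 0.1), (0.9, 0.05, 0.05))
```

Note how a weakly trusted referrer (small belief component) pushes most of the mass into uncertainty rather than into belief or disbelief, which is the intended effect of discounting subjective evidence.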

Slide 6: Dynamic Trust
Trust established through direct interaction:
– Identify behavior patterns and their characteristic features
– Determine which pattern best matches the current interaction sequence
– Develop algorithms for establishing trust
Unique feature: we consider behavior patterns
Reputation evaluation:
– Choose reputation information providers
– Scale reputation ratings (e.g., Bob's 0.7 may mean 0.5 to Alice but 0.8 to Carol)
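One way to scale another party's ratings onto a local scale, as in the Bob/Alice example above, is to fit a linear map from pairs of ratings both parties have issued for commonly rated peers. This is a hedged sketch; the least-squares approach and the calibration pairs are assumptions, not the project's published algorithm.

```python
def fit_linear_scale(pairs):
    """Fit y = a*x + b by least squares over (provider_rating, local_rating)
    pairs collected for peers that both parties have rated, and return a
    function mapping the provider's scale onto the local one."""
    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda r: a * r + b

# Hypothetical calibration: how Alice's ratings relate to Bob's for
# three peers both of them have rated.
to_alice_scale = fit_linear_scale([(0.2, 0.1), (0.7, 0.5), (0.95, 0.7)])
```

With this calibration, Bob's 0.7 maps to roughly 0.5 on Alice's scale, matching the slide's example; Carol would fit her own map from her own calibration pairs.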

Slide 7: TERA Architecture (diagram)

Slide 8: Trust-Enhanced Role Assignment (TERA) Prototype
Trust-enhanced role mapping (TERM) server assigns roles to users based on:
– Uncertain and subjective evidence
– Dynamic trust
Reputation server:
– Repository for dynamic trust information
– Evaluates reputation from trust information using algorithms specified by the TERM server
Prototype and demo are available at http://www.cs.purdue.edu/homes/bb/NSFtrust/

Slide 9: B. Trading Privacy for Trust
Problems
– Minimize the loss of privacy necessary to gain the required level of trust
– Control dissemination of "traded" private data
Research directions
– Measuring privacy
– Modeling the privacy-trust tradeoff
– Controlling private data dissemination
Challenges
– Specify policies through metadata and establish guards as procedures
– Efficient implementation of self-descriptiveness, apoptosis, and evaporation
– Define context-dependent privacy disclosure policies that depend on who will receive the information, its possible uses, information disclosed in the past, etc.
– Propose more universal privacy metrics (existing ones are usually ad hoc and customized)
Details at: http://www.cs.purdue.edu/homes/bb/priv_trust_cerias.ppt

Slide 10: Privacy Metrics
Determine the degree of data privacy:
– Size-of-anonymity-set metrics
– Entropy-based metrics
Privacy metrics should account for:
– Dynamics of legitimate users
– Dynamics of violators
– Associated costs
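The two metric families above can be illustrated together: the size of the anonymity set bounds privacy, while entropy over the attacker's probability distribution refines it. A minimal sketch, assuming the attacker's beliefs are given as a probability distribution over the anonymity-set members:

```python
import math

def privacy_entropy(probs):
    """Entropy-based privacy metric: Shannon entropy (in bits) of the
    attacker's probability distribution over the anonymity set.
    A uniform distribution over n subjects gives log2(n) bits, the
    same value the size-of-anonymity-set metric would suggest;
    skewed beliefs yield less privacy than set size alone implies."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely suspects: log2(8) = 3 bits of privacy.
uniform = privacy_entropy([1 / 8] * 8)
# Same set size, but the attacker strongly suspects one subject.
skewed = privacy_entropy([0.93] + [0.01] * 7)
```

The comparison shows why entropy-based metrics are more informative than set size alone: both examples have an anonymity set of eight, but the skewed case offers far less privacy.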

Slide 11: Privacy-Trust Tradeoff
Goal: gain the required level of trust with minimal privacy loss
– Build trust based on users' digital credentials that contain private information
– Formulate the privacy-trust tradeoff problem
– Estimate the privacy loss due to disclosing a set of credentials
– Estimate the trust gain due to disclosing credentials
– Develop algorithms that minimize privacy loss for a required trust gain
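The minimization step above can be sketched as a small search problem. Assuming each credential carries an estimated trust gain and privacy loss (the additive model, the exhaustive search, and the credential names below are illustrative assumptions, not the project's algorithm):

```python
from itertools import combinations

def min_loss_credentials(creds, required_gain):
    """Pick the credential subset whose total trust gain meets the
    requirement at the smallest total privacy loss.

    creds: dict mapping credential name -> (trust_gain, privacy_loss).
    Exhaustive search; fine for the handful of credentials a user
    typically holds, though it is exponential in general.
    Returns (subset, loss) or None if no subset reaches the gain."""
    best = None
    for r in range(len(creds) + 1):
        for subset in combinations(creds, r):
            gain = sum(creds[c][0] for c in subset)
            loss = sum(creds[c][1] for c in subset)
            if gain >= required_gain and (best is None or loss < best[1]):
                best = (subset, loss)
    return best

# Hypothetical credentials: disclosing an SSN buys a lot of trust but
# costs a lot of privacy; two cheaper credentials may suffice instead.
credentials = {
    "ssn": (0.9, 0.9),
    "email": (0.3, 0.1),
    "employer": (0.4, 0.2),
}
choice = min_loss_credentials(credentials, required_gain=0.6)
```

Here the cheaper pair (email + employer) reaches the required trust at a third of the privacy cost of disclosing the SSN, which is exactly the tradeoff the slide describes.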

Slide 12: Controlling Private Data Dissemination
– Design self-descriptive private objects
– Construct a mechanism for apoptosis of private objects (apoptosis = clean self-destruction)
– Develop proximity-based evaporation of private objects
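The three mechanisms above can be combined in one object: it describes its own dissemination policy, self-destructs when the policy is violated, and degrades in precision as it travels away from its guardian. This is a toy sketch under assumed semantics (integer-valued data, "evaporation" modeled as dropping decimal digits per unit of distance); the real mechanisms are the subject of the research, not this code.

```python
class PrivateObject:
    """Self-descriptive private object (hypothetical sketch).

    Carries its own policy: a maximum allowed distance from the
    original guardian.  A read beyond that distance triggers apoptosis
    (clean self-destruction); a read within it returns a value whose
    precision evaporates with distance."""

    def __init__(self, value, max_distance):
        self.value = value
        self.max_distance = max_distance
        self.destroyed = False

    def apoptose(self):
        # Clean self-destruction: the private value is erased for good.
        self.value = None
        self.destroyed = True

    def read(self, distance):
        if self.destroyed:
            return None
        if distance > self.max_distance:
            self.apoptose()
            return None
        # Evaporation: one decimal digit of precision lost per unit of
        # distance from the original guardian.
        return round(self.value, -distance) if distance else self.value

salary = PrivateObject(83217, max_distance=2)
```

At distance 0 the guardian sees the exact value; a nearby bank sees a rounded one; a distant used-car dealer's read destroys the object entirely, and even the guardian cannot recover it afterwards.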

Slide 13: Examples of Proximity Metrics
One-dimensional distance metrics:
– Distance ~ business type
– Distance ~ distrust level: more trusted entities are "closer"
Multi-dimensional distance metrics:
– Security/reliability as one of the dimensions
Example (diagram): if Bank I is the original guardian, then any other bank (Bank II, Bank III) is "closer" than any insurance company (Insurance Companies A, B, C), and any insurance company is "closer" than any used-car dealer (Used Car Dealers 1-3)
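The business-type distance metric from the diagram can be written as a simple lookup table. The specific distance values (1 for other banks, 2 for insurance companies, 5 for used-car dealers) are reconstructed from the diagram's edge labels and should be read as illustrative:

```python
# Distance-by-business-type metric with a bank as original guardian:
# other banks are closest, insurance companies next, used-car dealers
# farthest.  Smaller distance = "closer" = slower evaporation.
DISTANCE_FROM_BANK = {
    "bank": 1,
    "insurance_company": 2,
    "used_car_dealer": 5,
}

def proximity(requester_type, table=DISTANCE_FROM_BANK):
    """One-dimensional proximity: distance by business type."""
    return table[requester_type]
```

A multi-dimensional variant would combine several such tables (e.g., business type plus a security/reliability rating) into one distance, as the slide suggests.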

Slide 14: Private and Trusted System (PRETTY) Prototype (architecture diagram)
TERA = Trust-Enhanced Role Assignment

Slide 15: Information Flow in PRETTY
1) The user application sends a query to the server application.
2) The server application sends user information to the TERA server for trust evaluation and role assignment.
 a) If a higher trust level is required for the query, the TERA server sends a request for more of the user's credentials to the privacy negotiator.
 b) Based on the server's privacy policies and the credential requirements, the server's privacy negotiator interacts with the user's privacy negotiator to build a higher level of trust.
 c) The trust gain and privacy loss evaluator selects credentials that will increase trust to the required level with the least privacy loss; the calculation considers credential requirements and credentials disclosed in previous interactions.
 d) According to its privacy policies and the calculated privacy loss, the user's privacy negotiator decides whether or not to supply the credentials to the server.
3) Once the trust level meets the minimum requirement, appropriate roles are assigned to the user for execution of the query.
4) Based on the query results, the user's trust level, and the privacy policies, the data disseminator determines: (i) whether to distort the data and, if so, to what degree; and (ii) what privacy enforcement metadata should be associated with it.
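Step 4 above, the data disseminator's distortion decision, can be sketched as a trust-thresholded policy. The thresholds and the rounding-based distortion below are illustrative assumptions, not PRETTY's actual policy:

```python
def disseminate(value, trust_level):
    """Sketch of step 4: decide how much to distort an integer query
    result based on the requester's trust level.

    Hypothetical policy: high trust gets the exact value, moderate
    trust gets a coarsened (rounded) value, low trust gets nothing."""
    if trust_level >= 0.8:
        return value            # full accuracy for highly trusted users
    if trust_level >= 0.5:
        return round(value, -2) # coarsen to the nearest hundred
    return None                 # withhold the result entirely

result = disseminate(83217, trust_level=0.6)
```

A fuller implementation would also attach privacy-enforcement metadata (e.g., the evaporation policy of slide 12) to whatever value is released, as item (ii) of step 4 requires.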

