
Slide 1: Framework for Managing the Assured Information Sharing Lifecycle (February 2009)
2008 MURI project with UMBC, Purdue, U. Texas Dallas, U. Illinois, U. Texas San Antonio, and U. Michigan
Objectives:
- Create a new framework for assured information sharing, recognizing that sharable information has a lifecycle of production, release, advertising, discovery, acquisition and use
- Develop techniques grounded in this model to promote information sharing while maintaining appropriate security, privacy and accountability
- Evaluate, adapt and improve the AIS concepts and algorithms in relevant demonstration systems and test beds
See http://aisl.umbc.edu/ for papers and more information

Slide 2: AIS Lifecycle Approach
- Design a service-oriented architecture to support the assured information sharing lifecycle
- Create new policy models & languages to express and enforce AIS rules & constraints
- Develop new data mining techniques and algorithms to track provenance, increase quality and preserve privacy
- Model underlying organizational social networks to estimate trust and information novelty
- Design incentive structures to motivate sharing in organizations and coalitions
Information value chain:
- Information has a lifecycle involving a web of producers and consumers
- All aspects of the lifecycle are shaped by distributed information sharing policies
- Integration and mining create new information that may be shared; access may involve negotiating policy-defined obligations

Slide 3: Selected AISL Recent Results
① Progress on models, architectures, languages and mechanisms for trustworthiness-centric assured information sharing (UTSA, Purdue)
② Techniques for resolving conflicting facts extracted from different resources (UIUC)
③ Study of information sharing motivation and quality in online forums (Michigan)
④ Modeling incentives & trust in information sharing (UTD)
⑤ Learning statistically sound trust metrics (UTD)
⑥ Inferring access policies from logs (UMBC)
⑦ Policies for privacy in mobile information systems (UMBC, Purdue)

Slide 4: Trustworthiness-centric AIS Framework (Topic 1)
Objective: create a trustworthiness-centric assured information sharing framework
Approach: design models, architectures, languages and mechanisms to realize it
Key challenges:
- Trustworthiness and risk management for end-user decision making
- Usage management that extends access control
- Attack management, including trustworthiness of infrastructure services
- Identity management extending the current generation
- Provenance management for managing trustworthiness of data, software, and requests

Slide 5: Trustworthiness-centric Assured Information Sharing Framework (Topic 1)
Framework components:
- Trustworthiness management
- Risk management
- Usage management (of authorized activities)
- Identity management (of people, organizations, and devices)
- Attack management (of unauthorized activities)
- Provenance management (of data, software, and requests)
Note: "trustworthiness ≠ risk" in general

Slide 6: Progress on Trustworthiness-centric AIS (Topic 1)
- Initial framework to be published as: S. Xu, R. Sandhu & E. Bertino, "Trustworthiness-centric Assured Information Sharing" (invited paper), 3rd IFIP Int. Conf. on Trust Management, 2009
- Design for identity & provenance management underway
- Group-centric information sharing model extends the traditional dissemination model with new intuitive metaphors: a secure meeting room and a subscription service
- Developed a family of security models for the semantics of basic group operations (join, leave, add, remove) and proved security properties about them; results published in recent conference papers
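The group operations are only named on this slide. As a purely illustrative reading of the "secure meeting room" metaphor, the Python sketch below derives read authorization from the relative timing of join/leave and add/remove. The class, method names, and the specific authorization rule are assumptions for illustration, not the project's published security models.

```python
# Illustrative sketch (not the published models): group-centric sharing where
# authorization depends on whether both the membership and the object are
# current at the time of access.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Interval:
    start: int
    end: Optional[int] = None          # None means still current

    def active(self, t: int) -> bool:
        return self.start <= t and (self.end is None or t < self.end)

@dataclass
class Group:
    members: Dict[str, Interval] = field(default_factory=dict)
    objects: Dict[str, Interval] = field(default_factory=dict)

    def join(self, user: str, t: int) -> None:
        self.members[user] = Interval(t)

    def leave(self, user: str, t: int) -> None:
        self.members[user].end = t

    def add(self, obj: str, t: int) -> None:
        self.objects[obj] = Interval(t)

    def remove(self, obj: str, t: int) -> None:
        self.objects[obj].end = t

    def can_read(self, user: str, obj: str, t: int) -> bool:
        # "Meeting room" style rule (one of several possible variants):
        # both the membership and the object must be current at time t.
        m = self.members.get(user)
        o = self.objects.get(obj)
        return bool(m and o and m.active(t) and o.active(t))

# Example: Alice joins at t=1, a report is added at t=3, she leaves at t=5.
room = Group()
room.join("alice", 1)
room.add("report", 3)
print(room.can_read("alice", "report", 4))   # True
room.leave("alice", 5)
print(room.can_read("alice", "report", 6))   # False
```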

Slide 7: Truth Discovery with Multiple Conflicting Information Providers [TKDE'08] (Topic 2)
Problem: multiple information providers may supply conflicting facts about the same object, e.g., different author names for a book. Which is the true fact?
Heuristic Rule 1: false facts on different web sites are less likely to be the same or similar, since false facts are often introduced by random factors
Heuristic Rule 2: a web site that provides mostly true facts for many objects will likely provide true facts for other objects
[Figure: bipartite graph linking web sites w1-w4 to facts f1-f5 about objects o1 and o2]
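As a rough illustration of Heuristic Rule 2, the sketch below iterates between web-site trustworthiness and fact confidence and then picks the highest-confidence fact per object. It is a simplified stand-in that omits the fact-similarity handling and exact formulation of the TKDE'08 algorithm; the initial trust value and the combination formula are assumptions.

```python
# Simplified truth-discovery iteration: a site's trustworthiness is the average
# confidence of the facts it provides, and a fact's confidence grows with the
# trustworthiness of the sites providing it.
import math
from collections import defaultdict

def truth_discovery(claims, iterations=20):
    """claims: list of (site, object, fact) triples."""
    sites = {s for s, _, _ in claims}
    trust = {s: 0.9 for s in sites}                 # assumed initial trust
    facts_by_obj = defaultdict(set)
    providers = defaultdict(set)                    # (object, fact) -> sites
    for s, o, f in claims:
        facts_by_obj[o].add(f)
        providers[(o, f)].add(s)

    confidence = {}
    for _ in range(iterations):
        # Fact confidence: combine the trust of supporting sites
        for key, supp in providers.items():
            score = -sum(math.log(1.0 - trust[s] + 1e-9) for s in supp)
            confidence[key] = 1.0 - math.exp(-score)
        # Site trustworthiness: average confidence of the facts it provides
        per_site = defaultdict(list)
        for s, o, f in claims:
            per_site[s].append(confidence[(o, f)])
        trust = {s: sum(v) / len(v) for s, v in per_site.items()}

    # Pick the highest-confidence fact for each object
    return {o: max(fs, key=lambda f: confidence[(o, f)])
            for o, fs in facts_by_obj.items()}

claims = [("w1", "book1", "J. Smith"), ("w2", "book1", "J. Smith"),
          ("w3", "book1", "John Smyth"), ("w4", "book2", "A. Lee")]
print(truth_discovery(claims))   # {'book1': 'J. Smith', 'book2': 'A. Lee'}
```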

Slide 8: Truth Discovery: Framework Extension (Topic 2)
- Multiple versions of the truth: Democrats vs. Republicans may hold different views
- Truth may change with time: a player may win first but then lose
- Truth is a relative, dynamically changing judgment
- Incremental updates with recent data in data streams
Method: Veracity-Stream, dynamic information network mining for veracity analysis in multiple data streams
Current testing data set: Google News, a dynamic news feed that provides functions and facilities to search and browse 4,500 news sources updated continuously
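Veracity-Stream is only named here. One purely illustrative way to let fact confidence track a changing truth is to weight each claim by recency; the exponential-decay weighting and half-life below are assumptions, not the project's method, and the weight would replace the uniform site contribution in the previous sketch.

```python
# Illustrative only: time-decayed support for streaming truth discovery.
# Older claims contribute less weight to a fact's confidence.
def decayed_weight(claim_time, now, half_life=7.0):
    """Weight in (0, 1] that halves every `half_life` time units."""
    return 0.5 ** ((now - claim_time) / half_life)

# e.g. in the previous sketch, replace the per-site sum with
#   score = -sum(decayed_weight(t, now) * math.log(1 - trust[s] + 1e-9)
#                for s, t in supporters_with_times)
print(decayed_weight(0, 14))   # 0.25 -- a claim two half-lives old
```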

Slide 9: Motivation & Quality in Information Sharing (Topic 3)
- Analyzed the Naver Knowledge iN online Q&A forum: 2.6M questions, 4.6M answers, and interviews with 26 top answerers
- Motivations to contribute include altruism, learning, competition (via a point system), and treating it as a hobby
- Users who contribute more often and less intermittently contribute higher-quality information
- Users prefer to answer unanswered questions and to respond to incorrect answers
- See "Questions in, Knowledge iN? A Study of Naver's Question Answering Community", Nam, Ackerman, Adamic, CHI 2009

Slide 10: Incentives & Trust in Assured Information Sharing (Topic 4)
Goal: create means of encouraging desirable behavior within an environment that lacks or cannot support a central governing agent
Approach: combining intelligence through a loose alliance
- Bridges gaps due to sovereign boundaries
- Maximizes yield of resources
- Discovers new information through correlation and analysis of the 'big picture'
- Information is exchanged privately between two participants
Drawbacks to sharing include misinformation and freeloading

Slide 11: Our Model (Topic 4)
- Players are assumed to be rational
- The game of information trading:
  - Strategies: be truthful, lie, or refuse to participate
  - One game is played for each possible pair of players; all games are played simultaneously in a single round, and the game is repeated 'infinitely'
  - Players may verify the information they received, at some cost; when to verify becomes an aspect of the game
  - Always verifying works poorly in light of honest equilibrium behavior, but never verifying may yield the game to lying opponents
- Add EigenTrust to the game:
  - A distributed trust metric where each player asks others for their opinion of a third
  - Based on known perfect information
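EigenTrust is referenced here by name only. The sketch below shows the commonly described core of the metric (damped power iteration over row-normalized local trust opinions), not the project's specific integration into the game; the local trust values, pre-trust vector, and damping factor are made-up examples.

```python
# EigenTrust core idea: each player's global trust is the stationary
# distribution of a walk over normalized local trust opinions.
import numpy as np

def eigentrust(local_trust, pretrust=None, alpha=0.1, iters=50):
    """local_trust[i][j] >= 0: how much player i trusts player j locally."""
    C = np.asarray(local_trust, dtype=float)
    row_sums = C.sum(axis=1, keepdims=True)
    C = np.divide(C, row_sums, out=np.full_like(C, 1.0 / C.shape[1]),
                  where=row_sums > 0)               # normalize each row
    n = C.shape[0]
    p = np.full(n, 1.0 / n) if pretrust is None else np.asarray(pretrust, float)
    t = p.copy()
    for _ in range(iters):
        t = (1 - alpha) * C.T @ t + alpha * p       # damped power iteration
    return t

# Three players; player 2 is distrusted by the others.
local = [[0, 8, 1],
         [9, 0, 1],
         [5, 5, 0]]
print(eigentrust(local).round(3))   # roughly [0.44, 0.44, 0.12]: player 2 lowest
```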

Slide 12: Behaviors Analyzed in Data Sharing Simulations (Topic 4)
Name | Strategy | Verification? | Punishment? | Comments
Honest | Truth | No | | Optimistic, maximizes returns
Dishonest | Lie | No | | Takes advantage of other players; trumps Honest in 1-on-1
Random | Truth, Lie | No | | Chaotic, chooses either with equal probability
Tit-for-Tat | Truth, Lie | Always | Special | Mirrors other players' actions, starts by selecting Truth
LivingAgent | Truth | Trust-based | No trading | Verifies activity according to trust ratings; will cease activity for a number of rounds with a player caught lying
Liar | Truth, Lie | Trust-based | No trading | Identical to LivingAgent but lies with small probability
SubtleLie | Truth, Lie | Trust-based | No trading | Identical to Liar, except lies whenever information value reaches a certain threshold

Slide 13: Game Matrix (Topic 4)
[Table: payoff matrix for the pairwise game. Agent i chooses Truth, Lie, or Do Not Play against agent j's Truth, Lie, or Do Not Play; the entries are expressed in terms of the value of information, the minimal verification probability, the cost of verification, the trust value, and the agent type.]

Slide 14: Simulation Results (Topic 4)
- Parameters: δ_min = 3, δ_max = 7, C_V = 2; lie threshold set to 6.9
- Honest behavior wins 97% of the time when all behaviors are present
- Experiments show that without the LivingAgent behavior, honest behavior cannot flourish
- "Incentive and Trust Issues in Assured Information Sharing", Ryan Layfield, Murat Kantarcioglu, and Bhavani Thuraisingham, International Conference on Collaborative Computing, 2008
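The slides give the parameters and outcomes but not the detailed mechanics. The sketch below is a minimal round-robin simulation loosely following slides 11-14; the payoff structure (information value drawn uniformly from [δ_min, δ_max], an assumed benefit for undetected lying, boycotting a caught liar for a few rounds) is invented for illustration and is not expected to reproduce the reported 97% result.

```python
# Minimal sketch of the repeated pairwise information-trading game.
# Assumed, not from the paper: uniform information values, a benefit for
# undetected lying, and LivingAgent boycotting anyone it catches lying.
import random

DELTA_MIN, DELTA_MAX, C_V, LIE_THRESHOLD = 3.0, 7.0, 2.0, 6.9

class Agent:
    def __init__(self, name, behavior):
        self.name, self.behavior = name, behavior
        self.payoff = 0.0
        self.boycott = {}                # opponent -> rounds left of no trading

    def tells_truth(self, value):
        if self.behavior in ("honest", "living"):
            return True
        if self.behavior == "dishonest":
            return False
        if self.behavior == "subtle":    # lies only on valuable information
            return value < LIE_THRESHOLD
        return random.random() < 0.5     # "random" behavior

def play_round(agents, punish_rounds=5):
    for i, a in enumerate(agents):
        for b in agents[i + 1:]:
            if a.boycott.get(b.name, 0) > 0 or b.boycott.get(a.name, 0) > 0:
                continue                 # one side refuses to trade
            for giver, taker in ((a, b), (b, a)):
                value = random.uniform(DELTA_MIN, DELTA_MAX)
                honest = giver.tells_truth(value)
                verify = taker.behavior == "living"   # LivingAgent always verifies here
                if verify:
                    taker.payoff -= C_V
                if honest:
                    taker.payoff += value
                elif verify:                          # lie caught: punish
                    taker.boycott[giver.name] = punish_rounds
                else:                                 # lie goes undetected
                    giver.payoff += value / 2         # assumed benefit of deceiving
        for k in a.boycott:
            a.boycott[k] = max(0, a.boycott[k] - 1)

agents = [Agent("honest", "honest"), Agent("dishonest", "dishonest"),
          Agent("subtle", "subtle"), Agent("living", "living")]
for _ in range(1000):
    play_round(agents)
print({a.name: round(a.payoff) for a in agents})
```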

Slide 15: Learning Statistically Sound Trust Scores (Topic 5)
Goal: build a statistically sound trust-based scoring system for effective access control, applying ideas from credit scoring systems
Approach:
- Find appropriate predictive variables by applying concepts and methodologies used in credit scoring systems
- Incorporate a utility function into the scoring system to set up score-related access policies
Trust-based access control process: Phase 1 Access Request → Phase 2 Trust Calculator → Phase 3 Trust Policies → Phase 4 Access Privilege → Phase 5 Interaction Follow-Up
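The slide describes the approach abstractly. Below is a minimal credit-scoring-style sketch in which a logistic model over interaction-history features produces a trust score and a utility function decides whether to grant access; the features, weights, benefit, and damage values are assumptions for illustration.

```python
# Illustrative credit-scoring-style trust score with a utility-based threshold.
import math

WEIGHTS = {"bias": -1.0, "past_violations": -2.0,
           "successful_interactions": 0.05, "peer_rating": 1.5}

def trust_score(features):
    """Probability-like score in (0, 1) that the requester will behave well."""
    z = WEIGHTS["bias"] + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def grant(features, benefit=10.0, damage=50.0):
    """Grant access if expected utility is positive: p*benefit - (1-p)*damage > 0."""
    p = trust_score(features)
    return p * benefit - (1.0 - p) * damage > 0.0

requester = {"past_violations": 0, "successful_interactions": 40, "peer_rating": 0.8}
print(trust_score(requester), grant(requester))   # high score, access granted
```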

Slide 16: Inferring RBAC Policies (Topic 6)
Problem: a system whose access policy is known is more vulnerable to attacks and insider threats; attackers may infer likely policies from access observations, partial knowledge of subject attributes, and background knowledge
Objective: strengthen policies against discovery
Approach: explore techniques, such as inductive logic programming (ILP), that propose policy theories via machine learning
Results: promising initial results for simple Role-Based Access Control policies
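ILP itself is not shown here. As a much simpler stand-in for how an attacker might propose candidate policy rules from observations, the sketch below generalizes an access log by the attributes shared among all observed accessors of each resource; the log, attributes, and rule form are made up.

```python
# Simplified stand-in for policy inference (not ILP): propose the rule
# "users with attribute A may access resource R" whenever every observed
# access to R came from users sharing attribute A.
from collections import defaultdict

user_attrs = {"alice": {"dept:finance", "role:manager"},
              "bob":   {"dept:finance", "role:analyst"},
              "carol": {"dept:hr", "role:manager"}}

access_log = [("alice", "budget.xls"), ("bob", "budget.xls"),
              ("alice", "forecast.xls"), ("carol", "salaries.xls")]

def propose_rules(log, attrs):
    users_per_resource = defaultdict(set)
    for user, resource in log:
        users_per_resource[resource].add(user)
    rules = []
    for resource, users in users_per_resource.items():
        shared = set.intersection(*(attrs[u] for u in users))
        for attribute in sorted(shared):
            rules.append((attribute, resource))
    return rules

for attribute, resource in propose_rules(access_log, user_attrs):
    print(f"permit({attribute}) -> {resource}")
# e.g. permit(dept:finance) -> budget.xls  is a candidate policy an observer
# could hypothesize from the log alone
```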

Slide 17: Privacy Policies for Mobile Computing (Topic 7)
Problem: mobile devices collect and integrate sensitive private data about their users, which users would like to selectively share with others
Objective: develop a policy-based system for information sharing with an interface enabling end users to write & adapt privacy policies
Approach: prototype a component for iConnect on an iPhone and evaluate it in a university environment
Example policy rules: share my exact location with my family; share my current activity with my close friends, …
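The example rules suggest a simple structure. The sketch below models them as (attribute, audience) pairs evaluated against group membership; the rule representation and groups are assumptions, not the project's actual policy language or the iConnect interface.

```python
# Minimal sketch of user-written privacy rules like those on the slide.
from dataclasses import dataclass

@dataclass
class Rule:
    attribute: str        # e.g. "exact_location", "current_activity"
    audience: str         # e.g. "family", "close_friends"

GROUPS = {"family": {"mom", "dad"}, "close_friends": {"pat", "sam"}}

POLICY = [Rule("exact_location", "family"),
          Rule("current_activity", "close_friends")]

def may_share(attribute: str, requester: str) -> bool:
    """True if some rule shares `attribute` with a group containing requester."""
    return any(attribute == r.attribute and requester in GROUPS[r.audience]
               for r in POLICY)

print(may_share("exact_location", "mom"))   # True
print(may_share("exact_location", "pat"))   # False: not in 'family'
```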

