
Slide 1: CS 6910: Advanced Computer and Information Security
Lecture on 11/2/06
Trust in P2P Systems
Ahmet Burak Can and Bharat Bhargava
Center for Education and Research in Information Assurance and Security (CERIAS) and Department of Computer Sciences, Purdue University

Slide 2: Trust in P2P Systems - Outline
1) Introduction
   1.1) Mitigating Attacks in P2P Systems
   1.2) Assumptions for Peer Interactions
2) Contexts of Trust in P2P Systems
3) Definitions for the Proposed Solution
4) Trust Metrics
5) Trust-based Decisions
6) Interaction Evaluation by Peers
7) Recommendation Evaluation by Peers
8) Simulation Experiments
   8.1) Attacker Models for Simulation: Individual Attackers / Collaborators / Pseudospoofers
   8.2) Experimental Results

Slide 3: 1) Introduction
1.1) Mitigating Attacks in P2P Systems
- Goal: use trust relationships among peers to mitigate attacks in a malicious P2P environment.
- Algorithms are needed to establish trust among peers.
- Research tasks:
  - Propose trust metrics that reflect all aspects of trust.
  - Develop distributed algorithms to manage trust relationships among peers and help them make decisions using trust metrics.
  - Define methods to evaluate interactions and the trust information exchanged among peers (recommendations).

Slide 4: 1.2) Assumptions for Peer Interactions
- Peers use no a priori information to establish trust: there are no pre-existing trust relationships among peers.
- A peer must contribute and behave well to gain and preserve the trust of another peer.
- Malicious behavior of Peer 1 against Peer 2 can easily destroy the trust of Peer 2 in Peer 1.
- Trust metrics should have sufficient precision, so that peers can be ranked accurately according to their trustworthiness.

Slide 5: 2) Contexts of Trust in P2P Systems
- Two contexts of trust, corresponding to two different tasks:
  1) Providing services to other peers
  2) Giving recommendations to other peers
- These contexts are considered separately.
- A peer might simultaneously be a good service provider and a bad recommender (or vice versa).

Slide 6: 3) Definitions for the Proposed Solution
- A peer becomes an acquaintance of another peer after providing it a service (e.g., uploading a file).
- Using a service from a peer is called a service interaction.
- All peers are strangers to each other at the start; a peer expands its set of acquaintances by using services from strangers.
- A recommendation represents an acquaintance's trust information about a stranger.
- A peer requests recommendations about a stranger only from its acquaintances.
- Receiving a recommendation from an acquaintance is a recommendation interaction.
- (A minimal data model capturing these definitions is sketched below.)
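
To make the definitions concrete, here is a minimal Python sketch of the data model they imply. It is not the authors' implementation; the names (`InteractionRecord`, `service_history`, `recommendation_history`) and the choice of fields are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class InteractionRecord:
    """Outcome of one service or recommendation interaction."""
    partner_id: str
    satisfaction: float   # in [0, 1]: how satisfied the evaluating peer was
    weight: float = 1.0   # importance of the interaction (e.g., file size)

@dataclass
class Peer:
    peer_id: str
    # Histories keyed by the other peer's id. A peer with at least one
    # service interaction recorded here is an acquaintance; otherwise a stranger.
    service_history: dict[str, list[InteractionRecord]] = field(default_factory=dict)
    recommendation_history: dict[str, list[InteractionRecord]] = field(default_factory=dict)

    def is_acquaintance(self, other_id: str) -> bool:
        return other_id in self.service_history
```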

Slide 7: 4) Trust Metrics (1)
- Reputation is the primary metric when deciding about strangers in the service context.
- Recommendations from acquaintances are used to calculate the reputation metric.
- Service trust is a metric that measures the trustworthiness of a peer in the service context.
- A service provider is selected according to the service trust and reputation metrics.
- The service trust metric of a peer is calculated from its past service interactions and its reputation (a sketch follows below).
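
A hedged sketch of how these two metrics could be combined, using the `InteractionRecord` sketch above: reputation as a weighted average of recommended values, and service trust as a blend of first-hand experience with reputation. The weighting scheme and the `alpha` parameter are illustrative assumptions, not the authors' exact formulas.

```python
def reputation(recommendations):
    """recommendations: list of (recommended_value, recommendation_trust_of_recommender).
    Each recommendation is weighted by how much we trust its recommender."""
    total = sum(value * rec_trust for value, rec_trust in recommendations)
    norm = sum(rec_trust for _, rec_trust in recommendations)
    return total / norm if norm > 0 else 0.0

def service_trust(service_records, reputation_value, alpha=0.5):
    """Blend of first-hand service experience and reputation, all values in [0, 1]."""
    if not service_records:
        return reputation_value   # a stranger: reputation is all we have
    own = (sum(r.satisfaction * r.weight for r in service_records)
           / sum(r.weight for r in service_records))
    return alpha * own + (1 - alpha) * reputation_value
```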

Slide 8: 4) Trust Metrics (2)
- Recommendation trust is the primary metric for measuring the trustworthiness of a peer in the recommendation context, i.e., when selecting recommenders and evaluating recommendations.
- The recommendation trust metric of a peer is calculated from past recommendation interactions and its reputation, analogously to the service trust metric (sketched below).
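
Recommendation trust can be sketched the same way, over recommendation interactions instead of service interactions; here `satisfaction` stands for how accurate a past recommendation turned out to be. Again an illustrative assumption, not the paper's exact formula.

```python
def recommendation_trust(rec_records, reputation_value, alpha=0.5):
    """Analogue of service_trust, computed over recommendation interactions."""
    if not rec_records:
        return reputation_value
    accuracy = (sum(r.satisfaction * r.weight for r in rec_records)
                / sum(r.weight for r in rec_records))
    return alpha * accuracy + (1 - alpha) * reputation_value
```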

Slide 9: 5) Trust-based Decisions (1)
- When making trust decisions, interactions and reputation are considered separately; this helps to distinguish between two trustworthy peers.
- Trust decisions about a stranger are based on reputation.
- Trust decisions about an acquaintance are based on its past interactions and its reputation.
- As more interactions happen with an acquaintance, the experience derived from those interactions becomes more important than its reputation (see the weighting sketch below).
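
One simple way to realize the last point, that experience outweighs reputation as interactions accumulate, is to let the blending weight grow with the number of interactions. The saturating form and the `half_point` parameter below are only an illustrative choice.

```python
def experience_weight(num_interactions: int, half_point: int = 10) -> float:
    """Weight of first-hand experience vs. reputation:
    0 interactions -> 0 (pure reputation); many interactions -> close to 1."""
    return num_interactions / (num_interactions + half_point)

# Example: after 2 interactions experience counts for ~17% of the trust value,
# after 50 interactions for ~83%.
```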

Slide 10: 5) Trust-based Decisions (2)
- How a peer uses its available acquaintances:
  - If it has no acquaintances, it simply trusts any stranger providing the requested service.
  - If it has some acquaintances, it calculates the reputation of strangers from the acquaintances' recommendations. It may select one of the strangers, or it may choose not to trust strangers at all if acquaintances can deliver the needed service.
  - As more acquaintances become available, the peer can become more selective.
- A selection sketch follows below.
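
A hedged sketch of the selection policy just described. `service_trust_of` and `reputation_of` are assumed callables (for example, built from the earlier sketches); the preference order is illustrative rather than the authors' exact rule.

```python
def select_provider(candidates, acquaintances, service_trust_of, reputation_of):
    """Pick a provider from `candidates` (ids of peers offering the service)."""
    if not acquaintances:
        # No acquaintances yet: simply trust any stranger providing the service.
        return candidates[0]
    known = [p for p in candidates if p in acquaintances]
    if known:
        # Prefer an acquaintance we already trust for services.
        return max(known, key=service_trust_of)
    # Only strangers offer the service: rank them by reputation
    # computed from the acquaintances' recommendations.
    return max(candidates, key=reputation_of)
```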

Slide 11: 6) Interaction Evaluation by Peers
- Using all available information about interactions helps to calculate trust metrics more precisely.
- A peer should be able to express its level of satisfaction about an interaction by considering several parameters, e.g., the uploader's online/offline periods, bandwidth, and delay in a file download operation.
- Service interactions may have varying importance, e.g., downloading a large file is more important than downloading a small file.
- The effect of an interaction on trust calculation fades as new interactions occur.
- A satisfaction and fading sketch follows below.
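
A sketch of how a satisfaction score and fading could look, reusing the `InteractionRecord` fields from the earlier sketch. The specific constants (expected bandwidth, delay cap, the 0.7/0.3 split, the fade factor) are placeholders, not values from the paper.

```python
def satisfaction(bandwidth_kbps: float, delay_s: float,
                 expected_bandwidth_kbps: float = 1024.0, max_delay_s: float = 10.0) -> float:
    """Satisfaction of a file download in [0, 1], from observed bandwidth and delay."""
    bw = min(bandwidth_kbps / expected_bandwidth_kbps, 1.0)
    dl = max(1.0 - delay_s / max_delay_s, 0.0)
    return 0.7 * bw + 0.3 * dl

def faded_average(records, fade: float = 0.9) -> float:
    """Weighted average of satisfaction values in which older interactions fade:
    records are ordered oldest -> newest; each step back multiplies the weight by `fade`."""
    total = norm = 0.0
    for age, rec in enumerate(reversed(records)):
        w = (fade ** age) * rec.weight
        total += w * rec.satisfaction
        norm += w
    return total / norm if norm > 0 else 0.0
```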

Slide 12: 7) Recommendation Evaluation by Peers
- A recommendation makes a clear distinction between the recommender's own experience and second-hand information collected from its acquaintances; this distinction enables a more precise calculation of reputation.
- A recommendation contains the recommender's level of confidence in the information provided. If the recommender has low confidence, the recommendation is weak, and a weak recommendation's effect on the calculated reputation value is smaller than a strong one's.
- A recommending peer is no more liable than its confidence in its recommendation.
- A recommendation from Peer 2 (the recommender) is evaluated by Peer 1 based on the recommendation trust value that Peer 1 holds for Peer 2 (sketched below).
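
A sketch of the evaluation just described: a recommendation carries first-hand and second-hand components plus a confidence, and the evaluator weights it by its recommendation trust in the recommender. The field names and the 0.7/0.3 split favoring first-hand information are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    own_experience: float   # recommender's first-hand trust value for the stranger
    second_hand: float      # what the recommender heard from its own acquaintances
    confidence: float       # in [0, 1]; low confidence => weak recommendation

def evaluate_recommendations(recs) -> float:
    """recs: list of (Recommendation, recommendation_trust_in_recommender).
    Returns the stranger's reputation as seen by the evaluating peer."""
    total = norm = 0.0
    for rec, rec_trust in recs:
        value = 0.7 * rec.own_experience + 0.3 * rec.second_hand  # favor first-hand info
        w = rec_trust * rec.confidence   # weak or untrusted recommendations count less
        total += w * value
        norm += w
    return total / norm if norm > 0 else 0.0
```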

Slide 13: 8) Simulation Experiments
- A file sharing application was simulated to study the proposed algorithms for mitigating attacks related to services and recommendations.
- The results of several empirical studies were used to simulate peer, resource, and network parameters.
- Some of the simulation parameters:
  - Peer capabilities: bandwidth, number of shared files
  - Peer behavior: online/offline periods, waiting time for sessions
  - Resource distribution: file sizes, popularity of files
- Considered attack scenarios: individual, collaborative, and pseudonym-changing attacks; nine different malicious behaviors were simulated.
- (An illustrative parameter container is sketched below.)
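
For illustration only, a parameter container one might use for such a simulation. Every numeric value below is a placeholder assumption, not a figure from the authors' experiments; only the list of parameter kinds comes from the slide.

```python
from dataclasses import dataclass

@dataclass
class SimulationConfig:
    num_peers: int = 1000                   # placeholder
    bandwidth_kbps: tuple = (128, 4096)     # peer capability range (placeholder)
    shared_files: tuple = (10, 500)         # files per peer (placeholder)
    online_minutes: tuple = (5, 120)        # online period range (placeholder)
    file_size_kb: tuple = (100, 700_000)    # resource size range (placeholder)
    popularity_zipf_s: float = 1.0          # file popularity skew (placeholder)
    num_malicious_behaviors: int = 9        # from the slide above
```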

Slide 14: 8.1) Attacker Models for Simulation
- Two types of attacks:
  1) Service-based attack: uploading a virus-infected or inauthentic file
  2) Recommendation-based attack: giving misleading recommendations
- Two subtypes of misleading recommendations (a biasing sketch follows below):
  - Unfairly high recommendation: giving a positively biased trust value about the recommended peer
  - Unfairly low recommendation: giving a negatively biased trust value about the recommended peer
- Three types of attackers:
  a) Individual attackers
  b) Collaborators
  c) Pseudospoofers
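
A minimal sketch of the two misleading-recommendation subtypes; the bias magnitude of 0.5 is an arbitrary illustrative choice.

```python
def misleading_recommendation(true_value: float, mode: str) -> float:
    """Return an unfairly biased trust value in [0, 1]; mode is 'high' or 'low'."""
    if mode == "high":
        return min(true_value + 0.5, 1.0)   # unfairly high: inflate the value
    return max(true_value - 0.5, 0.0)       # unfairly low: deflate the value
```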

Slide 15: a) Model of Individual Attackers
- Individual attackers perform attacks independently (they do not cooperate with other attackers).
- Three individual attacker behaviors:
  - Naïve attacker: always uploads infected/inauthentic files and gives unfairly low recommendations to others.
  - Discriminatory attacker: attacks a selected group of victims. It always uploads infected/inauthentic files to them and gives unfairly low recommendations about them, while treating all other peers fairly.
  - Hypocritical [LL: better: "probabilistic"] attacker: uploads infected/inauthentic files and gives unfairly low recommendations with x% probability.
- (A behavior-selection sketch follows below.)
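
A sketch of how the three individual attacker behaviors could be encoded in a simulation; the default attack probability 0.3 stands in for the slide's x% and is a placeholder.

```python
import random

def attacks_on(behavior: str, target_id: str,
               victims: frozenset = frozenset(), x: float = 0.3) -> bool:
    """Does an individual attacker attack `target_id` in this interaction?
    `x` is the hypocritical attack probability (the slide's x%, here in [0, 1])."""
    if behavior == "naive":
        return True                      # naive: always attacks
    if behavior == "discriminatory":
        return target_id in victims      # attacks only its chosen victims
    if behavior == "hypocritical":
        return random.random() < x       # attacks probabilistically
    return False                         # a good peer never attacks
```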

Slide 16: b) Model of Collaborators
- Collaborators are malicious peers that coordinate attacks with other peers.
- Collaborators never attack each other: they always upload authentic files to each other and always give fair recommendations to other collaborators.
- Collaborators always give unfairly high recommendations about each other to non-collaborating peers, trying to convince good peers to download files from any one of the collaborators.
- Three collaborator behaviors (analogous to those of individual attackers): naïve, hypocritical, discriminatory.
- (A recommendation-biasing sketch follows below.)
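
A sketch of what a collaborator reports to a non-collaborating peer, using the same arbitrary bias magnitude as the earlier misleading-recommendation sketch. As written it models a naïve collaborator; hypocritical or discriminatory variants would deflate only probabilistically or only about chosen victims.

```python
def recommendation_to_good_peer(about_id: str, collaborators: set,
                                true_value: float, bias: float = 0.5) -> float:
    """Trust value a collaborator reports to a non-collaborating peer about `about_id`."""
    if about_id in collaborators:
        return min(true_value + bias, 1.0)   # unfairly high: promote a fellow collaborator
    return max(true_value - bias, 0.0)       # unfairly low: disparage a good peer
```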

Slide 17: c) Model of Pseudospoofers
- A pseudospoofer [LL: "pseudonym changer"] is a malicious peer that changes its pseudonym periodically to escape identification.
- Pseudospoofer behaviors: naïve / discriminatory / hypocritical, analogous to the individual attacker behaviors.

Slide 18: 8.2) Experimental Results
- In a non-malicious network, the reputation of a peer is proportional to its capabilities, such as network bandwidth, average online period, and number of shared resources.
- In a malicious network, service-based and recommendation-based attacks affect the reputation of a peer.

Slide 19: a) Results for Individual Attackers
- All attacks of individual attackers are mitigated easily.
- Hypocritical (probabilistic) attackers take more time to detect than other individual attackers.

Slide 20: b) Results for Collaborators (1)
- Detection of collaborators usually takes longer than detection of an individual attacker.
- Unfairly high recommendations provide an advantage to all collaborators except naïve ones.
- Naïve collaborators do not benefit from collaboration: they have zero reputation, since they cannot complete any service interaction, and hence they are never asked for recommendations.
- Collaboration is partially successful for hypocritical and discriminatory behaviors.

Slide 21: b) Results for Collaborators (2)
- Hypocritical (probabilistic) collaborators managed to launch more service-based attacks at the start of the experiments.
- At the start, good peers do not have many acquaintances, so collaborators deceive them easily by distributing unfairly high recommendations about each other.
- The collaborators are then able to exploit their unfairly heightened reputations to attract good peers to their "services" (i.e., attacks).
- As good peers gain more good acquaintances, hypocritical collaborators are identified and their attacks are mitigated.

Slide 22: b) Results for Collaborators (3)
- Service-based attacks of discriminatory collaborators are mitigated more easily than those of hypocritical ones: victims of discriminatory collaborators quickly identify them.
- However, discriminatory collaborators gained a high recommendation trust value and were able to continue distributing misleading recommendations:
  - Collaborators do not attack most good peers, so good peers believe their recommendations.
  - Victims give low recommendations about discriminatory collaborators, but good peers think the victims are the ones giving misleading recommendations.
  - Thus, discriminatory collaborators are able to continue distributing misleading recommendations.

Slide 23: c) Results for Pseudospoofers
- Attacks of pseudospoofers (pseudonym changers) are mitigated as easily as those of individual attackers.
- Over time, peers gain more acquaintances and have less tendency to select strangers; thus pseudospoofers become more isolated from good peers after each pseudonym change.
- [Figure: experimental results for pseudospoofers]

Slide 24: d) Experimental Results: General Remarks
- Defining a context of trust increases a peer's ability to identify and mitigate attacks on the context-related tasks.
- Recall the two trust contexts: 1) trust w.r.t. providing services to other peers; 2) trust w.r.t. giving recommendations to other peers.
- Contexts of trust can also be used to increase a peer's reasoning ability for other tasks, such as routing, integrity checking, and protecting privacy.

Slide 25: THE END

