1 DDoS Experiments with Third Party Security Mechanisms

Carla Brodley, Sonia Fahmy, Cristina Nita-Rotaru, Catherine Rosenberg
Current Students: Roman Chertov, Yu-Chun Mao, Kevin Robbins
Undergraduate Student: Christopher Kanich
June 9th, 2004

2 Year 1 Objectives

• Understand the testing requirements of different types of detection and defense mechanisms:
  – We focus on network-based third-party mechanisms
• Design, integrate, and deploy a methodology for performing realistic and reproducible DDoS experiments:
  – Tools to configure traffic and attacks
  – Tools to automate experiments and measurements and to visualize results effectively
  – Integration of multiple software components built by others
• Gain insight into the phenomenology of attacks, including their first-order and second-order effects and their impact on detection mechanisms

3 Year 1 Accomplishments

• Designed and implemented experimental tools (to be demoed):
  – Automated measurement tools, routing/security-mechanism log processing tools, and graph plotting tools
  – Automated configuration of interactive and replayed background traffic, routing, attacks, and measurements
  – Scriptable event system to control and synchronize events at multiple nodes
• Installed and configured the following software: Quagga/Zebra, WebStone, ManHunt, Sentivist
• Performed experiments and obtained preliminary results
• Generated requirements for DETER to easily support the testing of third-party products

4 Why Third Party Products?

1. No insider information: we do not control or understand the internals of the mechanisms, so we cannot customize tests to them.
2. Vendor neutrality: we have no incentive to design experiments for either success or failure.
3. Requirements for DETER: third-party tools were not designed for DETER, so we can uncover setup and implementation challenges for DETER.
4. User perspective: understanding how effectively popular tools defend against attacks will benefit many user communities.

Selected mechanisms: Symantec ManHunt v3.0 and Network Flight Recorder (NFR) Sentivist.

5 Why ManHunt and Sentivist?

• Provide DDoS detection and response
• Use coordinated distributed detection sensors
  – We currently test only the single-sensor configuration
• Available in a software-only form that runs on Red Hat Linux
  – In contrast, many commercial solutions are available only as hardware boxes (e.g., Mazu Networks Enforcer), and some require Microsoft Windows XP, which makes remote experimentation difficult on the current DETER testbed
• Obtained both ManHunt and Sentivist at no cost
• The mechanisms serve as a proof of concept for:
  – Our experimental methodology and tools
  – Identifying DETER testbed requirements for testing third-party commercial mechanisms

6 Symantec ManHunt Claims

• "Use protocol anomaly detection and traffic monitoring to detect DDoS attacks, including zero-day attacks."
• "Provide session termination, traceback capabilities using 'FlowChaser,' QoS filters, and handoff responses across domains for DDoS protection."
• "Provide the ability to coordinate distributed detection sensors."
• "Detection at up to 2 gigabits per second traffic."
• "Identifies unknown attacks via analysis engine."

Currently, we focus only on ManHunt's detection capabilities.
http://enterprisesecurity.symantec.com/products/products.cfm?ProductID=156&EID=0

7 Attacks Studied

• Tools like Stacheldraht, TFN, and Trinoo must first be sanitized to ensure that they will not attempt to contact daemons outside the testbed
• We experiment with a few recently published attacks:
  – Tunable randomization of source and destination addresses [A. Hussain, J. Heidemann, and C. Papadopoulos. A framework for classifying denial of service attacks. SIGCOMM 2003]
  – UDP constant/square-wave flooding [A. Kuzmanovic and E. W. Knightly. Low-rate targeted denial of service attacks. SIGCOMM 2003]
  – RST reflection (responses to unsolicited ACKs)
  – ICMP echo request reflection
  – ICMP echo flooding
  – SYN flooding at variable rates

8 Experimental Goals

1. Identify challenges associated with testing third-party products on DETER
2. Identify the impact of different attack parameters on application-level and network-level metrics
3. Identify the impact of the traffic selected to train an anomaly detection mechanism on false alarms

How? Our experiments vary:
• The mix of attacks
• Attack parameters, e.g., on and off periods
• Background traffic during the training and testing phases
• Security mechanisms: ManHunt and Sentivist

Our current victim is an Apache web server and a subset of its clients.

9 Experimental Setup

• Topology: generated by GT-ITM [Calvert/Zegura, 1996] and adapted to DETER, observing that:
  – Router degree is limited to 4
  – Power-law (c·d^−α) and small-world topologies cannot be employed
  – Emulating delays and bandwidths consumes additional nodes
• Quagga/Zebra [http://www.quagga.net/]: introduces BGP routers that generate dynamic routing traffic (a sample configuration sketch appears below)
• WebStone [http://mindcraft.com/webstone]: creates interactive WWW traffic with 40 clients at 5 sites
  – File sizes: 500 B, 5 kB, 50 kB, 500 kB, and 5 MB, with decreasing request frequency
• Replayed NZIX traffic from 2 hosts, mapped to all hosts [http://pma.nlanr.net/Traces/long/nzix2.html]
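For concreteness, here is a minimal sketch of the kind of per-router Quagga bgpd.conf that introduces dynamic BGP traffic; the AS numbers, router ID, prefix, and neighbor address are hypothetical placeholders, not the actual experiment values:

! Hypothetical Quagga bgpd.conf for one emulated router
! (all values are placeholders, not the experiment's configuration).
hostname bgpd
password zebra
!
router bgp 65001
 bgp router-id 10.1.1.1
 ! Advertise the locally attached stub network.
 network 10.0.1.0/24
 ! Peer with a neighboring emulated router in another AS.
 neighbor 10.1.1.2 remote-as 65002
!
log file /var/log/quagga/bgpd.log

Each emulated router runs zebra plus a bgpd with a configuration along these lines; the periodic KEEPALIVE and UPDATE messages shown in the BGP log on slide 13 come from such sessions.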

10 Topology

11 Square Wave Experiment

• Varies: square-wave attack burst length l
  – The number and location of attackers, the attack period T, and the rate R were also varied, but those results are not reported here
• Objectives:
  – Understand attack effectiveness
  – Identify attack effects on routing
  – Identify attack effects on application-level and network-level metrics at multiple nodes
  – Identify when a mechanism starts identifying attacks

[Figure: square-wave attack pattern, rate R during each burst of length l, idle for T−l, repeating with period T. A traffic-generator sketch follows below.]
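To make the l/T/R pattern concrete, below is a minimal Python sketch of a square-wave UDP sender for an isolated testbed. It is an illustration only, not the tool used in our experiments; the target address, port, payload size, and parameter values are hypothetical placeholders.

import socket
import time

# Hypothetical placeholders -- use isolated-testbed values only.
TARGET = ("10.0.254.10", 9)   # victim address and port (testbed only)
PKT = b"x" * 1024             # payload size in bytes
RATE_PPS = 1000               # rate R during a burst, in packets/second
BURST_LEN = 0.060             # burst length l in seconds (placeholder)
PERIOD = 0.200                # attack period T in seconds (placeholder)

def square_wave(duration):
    """Send UDP at RATE_PPS for BURST_LEN out of every PERIOD seconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    gap = 1.0 / RATE_PPS      # inter-packet spacing during the "on" phase
    start = time.time()
    while time.time() - start < duration:
        cycle_start = time.time()
        while time.time() - cycle_start < BURST_LEN:   # "on" phase
            sock.sendto(PKT, TARGET)
            # Note: time.sleep() granularity limits the achievable rate;
            # real traffic generators use busy-waiting or kernel pacing.
            time.sleep(gap)
        # "off" phase: stay silent for the rest of the period (T - l).
        time.sleep(max(0.0, PERIOD - (time.time() - cycle_start)))

if __name__ == "__main__":
    square_wave(duration=150)  # e.g., the 150 s square-wave window in the demo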

12 Impact on Throughput

13 Impact on Routing

2004/06/05 14:24:26 BGP: 10.1.39.3 [Error] bgp_read_packet error: Connection reset by peer
2004/06/05 14:24:43 BGP: 10.1.44.3 sending KEEPALIVE
2004/06/05 14:24:43 BGP: 10.1.44.3 KEEPALIVE rcvd
2004/06/05 14:25:43 BGP: 10.1.44.3 sending KEEPALIVE
2004/06/05 14:25:43 BGP: 10.1.44.3 KEEPALIVE rcvd
2004/06/05 14:25:50 BGP: 10.1.44.3 rcvd UPDATE w/ attr: nexthop 0.0.0.0
2004/06/05 14:25:50 BGP: 10.1.44.3 rcvd UPDATE about 10.0.254.0/24 -- withdrawn
2004/06/05 14:26:29 BGP: 10.1.44.3 rcvd UPDATE w/ attr: nexthop 10.1.24.2
2004/06/05 14:26:29 BGP: 10.1.44.3 rcvd 10.0.254.0/24

14 Aggregate Packet Statistics

15 Aggregate Application-level Metrics

16 Demo

• RST reflection and a tuned square-wave attack (60 ms–200 ms)
• Objectives:
  – Illustrate the ease of experimental setup with our tool on DETER
  – Identify attack effects on application-level and network-level metrics at multiple nodes
  – Identify attack effects on ManHunt
• Experiment timeline (in seconds; an illustrative event-schedule sketch follows below):
  – 0: Quagga/Zebra router setup
  – 220: host setup
  – 223/224: start WebStone and traffic replay
  – 274: RST reflection begins
  – 474: RST reflection ends
  – 524: square wave begins
  – 674: square wave ends
  – 900: end of demo
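As a flavor of the scriptable event system, here is a minimal Python sketch that fires shell commands at the timeline offsets above. It is an illustration only, not the actual EMIST tool (which dispatches events to multiple remote nodes), and the command names are hypothetical placeholders.

import subprocess
import threading

# (offset in seconds, shell command) -- the commands are hypothetical
# placeholders mirroring the demo timeline above.
SCHEDULE = [
    (0,   "sh setup_routers.sh"),      # quagga/zebra router setup
    (220, "sh setup_hosts.sh"),        # host setup
    (223, "sh start_webstone.sh"),     # interactive web traffic
    (224, "sh start_replay.sh"),       # NZIX trace replay
    (274, "sh start_rst_reflect.sh"),  # RST reflection begins
    (474, "sh stop_rst_reflect.sh"),   # RST reflection ends
    (524, "sh start_square_wave.sh"),  # square wave begins
    (674, "sh stop_square_wave.sh"),   # square wave ends
    (900, "sh teardown.sh"),           # end of demo
]

def run_schedule(schedule):
    """Launch each command at its offset, relative to a common start time."""
    timers = [threading.Timer(t, subprocess.run, args=(cmd,),
                              kwargs={"shell": True})
              for t, cmd in schedule]
    for timer in timers:
        timer.start()
    for timer in timers:
        timer.join()  # wait until every scheduled event has fired

if __name__ == "__main__":
    run_schedule(SCHEDULE)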

17 Lessons Learned

• Insights into sensitivity to the emulation environment:
  – Some effects we observe may not appear on actual routers, and vice versa (architecture and buffer sizes)
  – Emulab and DETER results differ significantly for the same test scenario (CPU speed)
  – Limits on router node degree, delays, and bandwidths
• Difficulties in testing third-party products:
  – Products (hardware or software) connect to hubs, switches, or routers
  – Layer-2/layer-3 emulation and automatic discovery/allocation can simplify DETER use for testing third-party mechanisms
  – Due to licenses, we need to control machine selection in DETER
  – Windows XP is required to test some products, e.g., the Sentivist administration interface
  – It is difficult to evaluate performance when the mechanism is a black box: e.g., we cannot mark attack traffic and must rely solely on our knowledge of the attack

18 Plans for Years 2 and 3

• Formulate a testing methodology for DETER:
  – Design increasingly high-fidelity experiments and better tools, to be made available to the DETER/EMIST teams
  – Identify simulation/emulation artifacts
  – Understand the impact of scale, including topology and the statistical properties of traffic
• Gain better insight into the phenomenology of attacks and defenses, including their second-order effects and how each is affected by experimental parameters
• Develop a taxonomy of the testable claims that security mechanisms make, and map each class of claims onto realistic experiments and metrics to validate those claims

19 Summary

• Identified challenges in testing third-party mechanisms, providing feedback on requirements to the DETER testbed design team
• Understood the design of high-fidelity experiments (e.g., topology, dynamic routing, interactive traffic)
• Contributed to the collection of EMIST/DETER tools: experimental setup, attack mix, and measurement tools
• Demonstrated the power of the DETER testbed by presenting a subset of representative experiments

