ecs236 Winter 2006: Intrusion Detection #3: Anomaly Detection
Dr. S. Felix Wu, Computer Science Department, University of California, Davis

Presentation transcript:

Slide 1: ecs236 Winter 2006: Intrusion Detection #3: Anomaly Detection. Dr. S. Felix Wu, Computer Science Department, University of California, Davis. http://www.cs.ucdavis.edu/~wu/ sfelixwu@gmail.com

Slide 2: Intrusion Detection Model. (diagram) An input event sequence goes through pattern matching, which produces the results.

Slide 3: Scalability of Detection
- Number of signatures, amount of analysis
- Unknown exploits/vulnerabilities

Slide 4: Anomaly vs. Signature
- Signature intrusion (bad things happen!)
  - Misuse produces an observable bad effect
  - Specify and look for bad behaviors
- Anomaly intrusion (good things did not happen!)
  - We know what our normal behavior is
  - Look for a deviation from that normal behavior and raise an early warning

Slide 5: Reasons for "AND" (ANomaly Detection)
- Unknown attacks (insider threat)
- Better scalability
  - AND scales with the targets/vulnerabilities
  - SD (signature detection) scales with the exploits

Slide 6: Another definition
- Signature-based detection
  - Predefine the signatures of anomalies
  - Pattern matching
  - (Convert our limited/partial understanding/modeling of the target system or protocol into detection heuristics, e.g., BUTTERCUP signatures)
- Statistics-based detection
  - Build a statistical profile of expected behavior
  - Compare observed behavior with expected behavior
  - Flag significant deviation
  - (Based on our experience, select a set of "features" that are likely to distinguish expected from unexpected behavior)

Slide 7: What is "vulnerability"?

Slide 8: What is "vulnerability"?
- Signature detection: create "effective/strong/scalable" signatures
- Anomaly detection: detect/discover "unknown vulnerabilities"

Slide 9: AND (ANomaly Detection)
- Unknown vulnerabilities/exploits
- Insider attacks
- Understand how and why these things happened
- Understand the limits of AND from both sides

Slide 10: What is an anomaly?

Slide 11: (diagram) Input events feed SAND. For each sample of the statistic measure X:
  (0, 1]: 40%
  (1, 3]: 30%
  (3, 15]: 20%
  (15, +∞): 10%

Slide 12: (diagram) Quantify the anomalies: raw events, a long-term profile, a function F, threshold control, and alarm generation. "But, which feature(s) to profile??"

Slide 13: What is an anomaly? (diagram) Events and an expected-behavior model feed anomaly detection.

Slide 14: What is an anomaly? (diagram) Events and an expected-behavior model feed anomaly detection; the model is built from knowledge about the target.

Slide 15: Model vs. Observation. (diagram) The model is compared against anomaly detection; conflicts become anomalies. It could be an attack, but it might well be a misunderstanding!

Slide 16: Statistics-based ANomaly Detection (SAND)
- Choose a parameter (a random variable, ideally without any assumption about its probability distribution)
- Record its statistical "long-term" profile
- Check how much, quantitatively, its short-term behavior deviates from its long-term profile
- Set the right threshold on the deviation to raise alarms
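
A minimal end-to-end sketch of these four steps in Python. The bin edges, the chi-square-like deviation score, and all names are illustrative assumptions, not taken from the slides:

import numpy as np

def long_term_profile(training_samples, bin_edges):
    # Step 2: record the "long-term" distribution of the chosen measure over fixed bins.
    counts, _ = np.histogram(training_samples, bins=bin_edges)
    return counts / counts.sum()

def deviation(short_term_samples, expected_p, bin_edges):
    # Step 3: quantify how far the short-term behavior is from the long-term profile
    # (a chi-square-like score; the exact formula here is an assumption).
    y, _ = np.histogram(short_term_samples, bins=bin_edges)
    n = y.sum()
    return float(np.sum((y - n * expected_p) ** 2 / np.maximum(n * expected_p, 1e-9)))

# Step 1: the chosen measure is whatever X we decide to profile; bins as on Slide 11.
bin_edges = [0, 1, 3, 15, np.inf]
# Step 4: raise an alarm when the deviation exceeds a threshold learned from training data.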

Slide 17: (diagram) SAND components: raw events, long-term profile (decay, update, clean under timer control), compute the deviation, threshold control, alarm generation.

Slide 18: Statistical Profiling
- Long-term profile:
  - captures the long-term behavior of a particular statistic measure
  - e.g., updated once per day
  - half-life: 30 updates
    - the most recent 30 updates contribute 50%, updates 31-60 contribute 25%
    - the newer contributes more
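
A quick check of the half-life arithmetic, assuming the profile decays geometrically with a fading factor chosen so that 30 updates halve a contribution (the exponential form is an assumption, but it reproduces the 50%/25% figures above):

# Fading factor giving a half-life of 30 updates: after 30 updates a contribution
# is worth half of a fresh one, after 60 updates a quarter.
beta = 2 ** (-1 / 30)

weights = [beta ** age for age in range(365)]   # one update per day for a year
total = sum(weights)
recent_30 = sum(weights[:30]) / total           # most recent 30 updates
days_31_60 = sum(weights[30:60]) / total        # updates 31-60
print(round(recent_30, 2), round(days_31_60, 2))   # 0.5 0.25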

Slide 19: Statistical Pros and Cons
- Slower to detect (averaging window)
- Very good for unknown attacks, as long as "relevant measures" are chosen
- Environment (protocol, user, etc.) dependency
  - Need good choices of statistical measures
  - Statistical profiles might be hard to build
  - Thresholds might be hard to set

Slide 20: Long-term Profile
- Category, C-Training
  - learn the aggregate distribution of a statistic measure
- Q Statistics, Q-Training
  - learn how much deviation is considered normal
- Threshold

Slide 21: Long-term Profile: C-Training
For each sample of the statistic measure X (example):
  (0, 50]: 20%
  (50, 75]: 30%
  (75, 90]: 40%
  (90, +∞): 10%
- k bins
- Expected distribution P1, P2, ..., Pk, where P1 + P2 + ... + Pk = 1
- Training time: months
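
A sketch of C-training under the assumption that the profile is simply a normalized histogram over the k bins (the bin edges copy this slide's example; the training samples are synthetic stand-ins):

import numpy as np

def c_training(samples, bin_edges):
    # Learn the aggregate long-term distribution P1..Pk of the statistic measure.
    counts, _ = np.histogram(samples, bins=bin_edges)
    return counts / counts.sum()            # P1 + ... + Pk = 1

# Bins from this slide's example: (0,50], (50,75], (75,90], (90,+inf).
bin_edges = [0, 50, 75, 90, np.inf]
months_of_samples = np.random.default_rng(0).uniform(0, 120, size=100_000)
p = c_training(months_of_samples, bin_edges)
print(p, p.sum())                           # the learned distribution sums to 1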

Slide 22: Long-term Profile: Q-Training (1)
For each sample of the statistic measure X (example):
  (0, 50]: 20%
  (50, 75]: 40%
  (75, 90]: 20%
  (90, +∞): 20%
- k bins; Y_i samples fall into bin i
- N samples in total (N = Y_1 + ... + Y_k)
- Weighted sum scheme with the fading factor β

Slide 23: Threshold
- Predefined threshold, α
- If Prob(Q > q) < α, raise an alarm

Slide 24: Long-term Profile: Q-Training (2)
- Deviation Q, with an example
- Q_max: the largest value among all Q values
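
One common form for this deviation in detectors of this family (e.g., NIDES-style measures) is a chi-square-like statistic over the k bins. The sketch below uses that form as an assumption, with the Y_i, N, P_i notation from the neighboring slides:

import numpy as np

def q_statistic(y, p_expected):
    # y: observed counts Y_1..Y_k per bin; p_expected: long-term distribution P_1..P_k.
    y = np.asarray(y, dtype=float)
    p = np.asarray(p_expected, dtype=float)
    n = y.sum()
    # Assumed chi-square-like form: a large Q means the observed bin counts
    # differ strongly from what the long-term profile predicts.
    return float(np.sum((y - n * p) ** 2 / np.maximum(n * p, 1e-9)))

# Example using the two distributions from the C-training and Q-training slides:
p_expected = [0.20, 0.30, 0.40, 0.10]
y_observed = [20, 40, 20, 20]                 # N = 100 samples, split 20/40/20/20 %
print(q_statistic(y_observed, p_expected))    # ~23.3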

Slide 25: Long-term Profile: Q-Training (3)
- Q distribution
  - [0, Q_max) is equally divided into 31 bins, and the last bin is [Q_max, +∞)
  - distribute all Q values into the 32 bins
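
A sketch of this 31+1 binning together with the threshold test from the neighboring threshold slides, assuming Prob(Q > q) is estimated from the coarse empirical histogram (the alpha value is illustrative):

import numpy as np

def q_distribution(training_q_values):
    # [0, Qmax) equally divided into 31 bins, plus a last bin [Qmax, +inf): 32 in total.
    q = np.asarray(training_q_values, dtype=float)
    edges = np.append(np.linspace(0.0, q.max(), 32), np.inf)
    counts, _ = np.histogram(q, bins=edges)
    return edges, counts / counts.sum()

def prob_q_greater(q_value, edges, probs):
    # Coarse estimate of Prob(Q > q): the mass of the bin holding q plus everything above it.
    first_bin = max(np.searchsorted(edges, q_value, side="right") - 1, 0)
    return float(probs[first_bin:].sum())

def is_alarm(q_value, edges, probs, alpha=0.01):
    # Predefined threshold alpha: alarm when such a large deviation was rare in training.
    return prob_q_greater(q_value, edges, probs) < alpha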

Slide 26: Weighted Sum Scheme
- Problems of the sliding window scheme
  - keep the most recent N audit records
  - required storage and computing time are O(N)
- Assume
  - K: number of bins
  - Y_i: count of audit records falling into the i-th bin
  - N: total number of audit records
  - β: fading factor
- When E_i occurs, update (see the sketch below)
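
A standard exponentially faded count update consistent with the variables K, Y_i, N, and β defined above would look like this; treat the exact form as an assumption:

def on_event(y, n, i, beta):
    # When audit record E_i occurs: decay every bin count by the fading factor beta,
    # then credit bin i. Cost is O(K) per event, versus O(N) for a sliding window.
    y = [beta * count for count in y]
    y[i] += 1.0
    n = beta * n + 1.0          # faded total count of audit records
    return y, n

A β just below 1 makes recent records dominate, which matches the "the newer contributes more" idea from the long-term profile slide.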

Slide 27: FTP Servers and Clients. (diagram) FTP client: SHANG. FTP servers: Heidelberg, NCU, SingNet, UIUC.

Slide 28: Q-Measure
- Deviation Q, with an example
- Q_max: the largest value among all Q values

Slide 30: Threshold
- Predefined threshold, α
- If Prob(Q > q) < α, raise an alarm
- False positive
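
As a worked consequence of this rule: if normal traffic really does follow the long-term Q distribution, then Prob(Q > q) < α holds for roughly a fraction α of normal samples, so setting α = 0.01 means on the order of 1% of normal audit records will still raise (false) alarms; the threshold and the false-positive rate are two sides of the same parameter.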

Slide 32: Mathematics
- Many other techniques:
  - training/learning
  - detection

Slide 33: (diagram) SAND components: raw events, long-term profile (decay, update, clean under timer control), compute the deviation, threshold control, alarm generation.

Slide 34: Dropper Attacks. (diagram) Drop P% of the packets: Per(K, I, S), Ret(K, S), Ran(K). Intentional or unintentional?

Slide 35: Periodical Packet Dropping (PerPD)
- Parameters (K, I, S)
  - K: the total number of dropped packets in a connection
  - I: the interval between two consecutive dropped packets
  - S: the position of the first dropped packet
- Example (5, 10, 4)
  - 5 packets dropped in total, 1 every 10 packets, starting from the 4th packet
  - the 4th, 14th, 24th, 34th and 44th packets will be dropped (see the sketch below)
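
A two-line check of the PerPD example (the helper name is made up; K, I, S are as defined above, with 1-based packet positions):

def perpd_positions(k, i, s):
    # Positions of the K dropped packets: start at S, then one every I packets.
    return [s + j * i for j in range(k)]

print(perpd_positions(5, 10, 4))    # [4, 14, 24, 34, 44]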

Slide 36: Retransmission Packet Dropping (RetPD)
- Parameters (K, S)
  - K: the number of times the packet's retransmissions are dropped
  - S: the position of the dropped packet
- Example (5, 10)
  - first, drop the 10th packet
  - then, drop the retransmissions of the 10th packet 5 times (sketch below)
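
A comparable sketch for RetPD (a hypothetical helper, assuming the attack agent can tell the original transmission from its retransmissions):

def retpd_should_drop(position, attempt, k, s):
    # Drop the S-th packet itself (attempt 0) and its first K retransmissions.
    return position == s and attempt <= k

# RetPD(5, 10): the 10th packet is dropped, then its retransmissions 5 more times.
print([retpd_should_drop(10, a, k=5, s=10) for a in range(7)])   # 6x True, then False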

Slide 37: Random Packet Dropping (RanPD)
- Parameter (K)
  - K: the total number of packets to be dropped in a connection
- Example (5)
  - randomly drop 5 packets in a connection

Slide 38: Experiment Setting. (diagram) FTP client and FTP server connected over the Internet; an attack agent uses a divert socket to intercept the data packets of a 5.5 MB file, xyz.zip.

Slide 39: Impacts of Packet Dropping on Session Delay

Slide 40: Compare Impacts of Dropping Patterns (PerPD: I=4, S=5; RetPD: S=5)

Slide 41: (diagram) Testbed topology: hosts bone, fire, redwing, light, and air across subnets 152.1.75.0, 192.168.1.0, and 172.16.0.0; a TFN master and TFN agents send a UDP flood at the TFN target, creating congestion on the FTP data path between the FTP client and FTP server.

Slide 43: TDSAM Experiment Setting. (diagram) The same FTP setup as Slide 38, with TDSAM observing the data packets; the example arrival order p1, p2, p3, p5, p4 illustrates the max-reordering counting measure.
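
The "max reordering counting" measure is only hinted at by the p1, p2, p3, p5, p4 example; one plausible reading, counting packets that arrive after a higher-numbered packet has already been seen, is sketched below as an assumption:

def count_reordered(arrival_order):
    # Count packets whose sequence number is lower than the maximum seen so far.
    max_seen = 0
    reordered = 0
    for seq in arrival_order:
        if seq < max_seen:
            reordered += 1
        else:
            max_seen = seq
    return reordered

print(count_reordered([1, 2, 3, 5, 4]))   # 1: p4 arrives after p5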

Slide 46: Results: Position Measure

Slide 47: Results: Delay Measure

Slide 48: Results: NPR Measure

Slide 49: Results (good and bad)
- False alarm rate
  - less than 10% in most cases; the highest is 17.4%
- Detection rate
  - Position: good on RetPD and most PerPD
    - at NCU, 98.7% for PerPD(20,4,5), but 0% for PerPD(100,40,5), in which the dropped packets are evenly distributed
  - Delay: good on attacks that significantly change session delay, e.g., RetPD and PerPD with a large K
    - at SingNet, 100% for RetPD(5,5), but 67.9% for RanPD(10)
  - NPR: good on attacks that drop many packets
    - at Heidelberg, 0% for RanPD(10), but 100% for RanPD(40)

Slide 50: Performance Analysis
- Good sites correspond to a high detection rate
  - stable and small session delay or packet reordering
  - e.g., using the Delay measure for RanPD(10): UIUC (99.5%) > Heidelberg (74.5%) > SingNet (67.9%) > NCU (26.8%)
- The choice of nbin is site-specific
  - e.g., using the Position measure, the lowest false alarm rate occurs at nbin = 5 for Heidelberg (4.0%) and NCU (5.4%), 10 for UIUC (4.5%), and 20 for SingNet (1.6%)

Slide 51: (diagram) SAND components: raw events, long-term profile (decay, update, clean under timer control), compute the deviation, threshold control, alarm generation.

Slide 52: (diagram) A cognitive variant: raw events feed a cognitive profile (decay, update, clean); an Information Visualization Toolkit helps a human cognitively identify the deviation and perform alarm identification.

Slide 53: What is an anomaly?

Slide 54: What is an anomaly?
- The observation of a target system is somewhat inconsistent with the expected conceptual model of the same system

Slide 55: What is an anomaly?
- The observation of a target system is somewhat inconsistent with the expected conceptual model of the same system
- And this conceptual model can be ANYTHING
  - statistical, logical, or something else

Slide 56: Model vs. Observation. (diagram) The model is compared against anomaly detection; conflicts become anomalies. It could be an attack, but it might well be a misunderstanding!

Slide 57: The Challenge. (diagram) Events and an expected-behavior model, built from knowledge about the target, feed anomaly detection, which produces both false positives and false negatives.

Slide 58: Challenge
- We know that detected anomalies can be either true positives or false positives
- We try our best to resolve the puzzle by examining all information available to us
- But the "ground truth" of these anomalies is very hard to obtain
  - even with human intelligence

Slide 59: Problems with AND
- We are not sure about what we want to detect...
- Nor are we sure when something is caught...
- We are still in the dark... at least in many cases...

Slide 60: Anomaly Explanation
- How would a human resolve the conflict?
- The power of reasoning and explanation
  - We detected something we really want to detect: reducing false negatives
  - Our model can be improved: reducing false positives

Slide 61: Without Explanation
- AND is not as useful??
- Knowledge is the power to utilize information!
  - unknown vulnerabilities
  - root cause analysis
  - event correlation

Slide 62: Anomaly Explanation. (diagram) The model and anomaly detection feed anomaly analysis and explanation (EBL, explanation-based learning), explaining both the attack and the normal behavior.

Slide 63: Explanation. (diagram) Simulation, experiments, or observation; conflicts become anomalies.

Slide 64: (diagram) Components: the model, model-based event analysis, observed system events, SBL-based anomaly detection, analysis reports, example selection, explanation-based learning, model update.

Slide 65: AND → EXPAND
- Anomaly detection
  - detect
  - analysis and explanation
  - application

