1
Learning Rules from Data
Olcay Taner Yıldız, Ethem Alpaydın
Department of Computer Engineering, Boğaziçi University

2
Rule Induction?
Derive meaningful rules from data
Mainly used in classification problems
Attribute types: continuous, discrete

Name  Income    Owns a house?  Marital status  Default
Ali   25,000 $  Yes            Married         No
Veli  18,000 $  No             Married         Yes

3
Rules
Disjunction: conjunctions are bound via ORs
Conjunction: propositions are bound via ANDs
Proposition: a relation on a single attribute
  Continuous attribute: defines a subinterval
  Discrete attribute: equality with one of the attribute's values
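The structure above can be sketched directly as code: a ruleset is a disjunction of conjunctions of propositions. This is an illustrative representation, not the paper's implementation; the attribute names and example rule are made up.

```python
def prop_continuous(attr, lo, hi):
    """Proposition on a continuous attribute: defines the subinterval lo < x[attr] <= hi."""
    return lambda x: lo < x[attr] <= hi

def prop_discrete(attr, value):
    """Proposition on a discrete attribute: x[attr] equals one of its values."""
    return lambda x: x[attr] == value

def conjunction(props):
    """All propositions must hold (AND)."""
    return lambda x: all(p(x) for p in props)

def ruleset(conjunctions):
    """Any conjunction may fire (OR)."""
    return lambda x: any(c(x) for c in conjunctions)

# Hypothetical example: (income > 20000) OR (owns a house AND married)
rule = ruleset([
    conjunction([prop_continuous("income", 20000, float("inf"))]),
    conjunction([prop_discrete("house", "Yes"), prop_discrete("marital", "Married")]),
])
```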

4
How to generate rules? Rule induction techniques:
Via trees: C4.5Rules
Directly from data: Ripper

5
C4.5Rules (Quinlan, 93)
Create a decision tree using C4.5
Convert the decision tree to a ruleset by writing each path from the root to a leaf as a rule
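The conversion step can be sketched as a walk over every root-to-leaf path, emitting one rule per path. The node encoding below is an assumption for illustration, not C4.5's actual data structure.

```python
def tree_to_rules(node, path=()):
    """node is either ('leaf', label) or ('split', test, yes_subtree, no_subtree)."""
    if node[0] == "leaf":
        return [(list(path), node[1])]  # (conjunction of tests along the path, class label)
    _, test, yes, no = node
    rules = tree_to_rules(yes, path + (test,))
    rules += tree_to_rules(no, path + ("NOT " + test,))
    return rules

# A toy tree matching the slide's running example (thresholds t1, t2 are placeholders)
tree = ("split", "x1 > t1",
        ("split", "x2 > t2", ("leaf", "OK"), ("leaf", "DEFAULT")),
        ("leaf", "DEFAULT"))

for conds, label in tree_to_rules(tree):
    print("IF", " AND ".join(conds), "THEN", label)
```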

6
C4.5Rules
[Decision tree figure: the root tests x1 (yearly-income) > θ1; its "yes" branch tests x2 (savings) > θ2, whose "yes" leaf is OK (y = 1) and whose "no" leaf is DEFAULT (y = 0); the root's "no" branch leads to a DEFAULT leaf (y = 0).]
Rules:
IF (x1 > θ1) AND (x2 > θ2) THEN Y = OK
IF (x1 > θ1) AND (x2 ≤ θ2) THEN Y = DEFAULT
IF (x1 ≤ θ1) THEN Y = DEFAULT

7
RIPPER (Cohen, 95)
Learns rules for each class: the objective class is positive, all other classes are negative
Two steps:
  Initialization: learn rules one by one; immediately prune each learned rule
  Optimization: since the search is greedy, pass K times over the rules to optimize them

8
RIPPER (Initialization)
Split (pos, neg) into a growset and a pruneset
Rule := grow a conjunction on the growset, adding propositions one by one
  IF (x1 > θ1) AND (x2 > θ2) AND (x2 < θ3) THEN Y = OK
Rule := prune the conjunction on the pruneset, removing propositions
  IF (x1 > θ1) AND (x2 < θ3) THEN Y = OK
If MDL < Best MDL + 64, add the conjunction; else stop
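The grow/prune step for a single rule can be sketched as below. This is a simplification: the scoring function here is plain accuracy on covered examples, standing in for the FOIL-gain growing and pruning metrics used by RIPPER, and the MDL stopping test is omitted.

```python
def covers(rule, x):
    """A rule (conjunction) covers x when every proposition holds."""
    return all(p(x) for p in rule)

def accuracy(rule, pos, neg):
    """Fraction of covered examples that are positive (empty rule covers everything)."""
    tp = sum(covers(rule, x) for x in pos)
    fp = sum(covers(rule, x) for x in neg)
    covered = tp + fp
    return tp / covered if covered else 0.0

def grow_rule(candidates, grow_pos, grow_neg):
    """Greedily add the proposition that most improves growset accuracy."""
    rule = []
    while True:
        best = max(candidates, key=lambda p: accuracy(rule + [p], grow_pos, grow_neg))
        if accuracy(rule + [best], grow_pos, grow_neg) <= accuracy(rule, grow_pos, grow_neg):
            break
        rule.append(best)
        if accuracy(rule, grow_pos, grow_neg) == 1.0:
            break
    return rule

def prune_rule(rule, prune_pos, prune_neg):
    """Drop trailing propositions while pruneset accuracy does not degrade."""
    while len(rule) > 1 and accuracy(rule[:-1], prune_pos, prune_neg) >= accuracy(rule, prune_pos, prune_neg):
        rule = rule[:-1]
    return rule
```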

9
Ripper (Optimization)
Repeat K times; for each rule, e.g.
  IF (x1 > θ1) AND (x2 < θ3) THEN Y = OK
Generate a revision rule by adding propositions:
  IF (x1 > θ1) AND (x2 > θ2) AND (x2 < θ3) THEN Y = OK
Generate a replacement rule by regrowing from scratch:
  IF (x1 > θ4) THEN Y = OK
Compare the current rule with the revision and replacement rules; take the best according to MDL
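The optimization pass reduces to a small loop: for each rule, build the two candidates and keep whichever scores best. In this sketch `revise`, `regrow`, and `mdl` are hypothetical callables standing in for the paper's revision, regrowing, and MDL-scoring procedures.

```python
def optimize(rules, revise, regrow, mdl, k=2):
    """Pass k times over the ruleset, replacing each rule by the best of
    {current, revision, replacement} according to the mdl score (lower is better)."""
    for _ in range(k):
        for i, rule in enumerate(rules):
            candidates = [rule, revise(rule), regrow(rule)]
            rules[i] = min(candidates, key=mdl)
    return rules
```

For example, with toy integer "rules", `revise` decrementing, `regrow` always proposing 7, and `abs` as the score, two passes over `[5, 9]` settle on `[3, 6]`.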

10
Minimum Description Length
Description length of a ruleset = description length of the rules + description length of the exceptions
Description length of a rule (choosing k of n possible propositions; ||k|| is the number of bits needed to send k):
  S = ||k|| + k log2(n / k) + (n − k) log2(n / (n − k))
Description length of the exceptions (C covered examples with fp false positives, U uncovered examples with fn false negatives, e the total number of errors):
  S = log2(|D| + 1) + fp(−log2(e / 2C)) + (C − fp)(−log2(1 − e / 2C)) + fn(−log2(fn / 2U)) + (U − fn)(−log2(1 − fn / 2U))
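A minimal sketch of the rule description length: sending which k of n possible propositions appear costs roughly n·H(k/n) bits plus the bits for k itself. Here ||k|| is approximated as log2(k + 1), an assumption; the paper's exact encoding of k may differ.

```python
from math import log2

def rule_description_length(n, k):
    """Bits to transmit a rule built from k of n candidate propositions:
    ||k|| + k*log2(n/k) + (n-k)*log2(n/(n-k)), with ||k|| ~ log2(k+1)."""
    if k == 0 or k == n:
        return log2(k + 1)  # the entropy terms vanish at the boundaries
    return log2(k + 1) + k * log2(n / k) + (n - k) * log2(n / (n - k))
```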

11
Ripper*
Finding the best condition by trying all possible split points is time-consuming
Shortcut: Linear Discriminant Analysis; the split point is calculated analytically
To be more robust:
  Instances farther than 3 standard deviations are removed
  If the number of examples < 20, the shortcut is not used
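A hedged sketch of the shortcut: fit a one-dimensional discriminant to the positive and negative values of a feature and take its analytic threshold instead of scanning every split point. Assuming equal class priors and a pooled variance, the threshold falls at the midpoint of the two class means; instances beyond 3 standard deviations are dropped first, as the slide suggests. The paper's exact derivation may differ.

```python
from statistics import mean, stdev

def robust(values):
    """Drop values farther than 3 standard deviations from the mean
    (fall back to the original list if everything would be dropped)."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) <= 3 * s] or values

def lda_split_point(pos_values, neg_values):
    """Analytic split point: midpoint of the (outlier-trimmed) class means,
    under the equal-prior, pooled-variance assumption."""
    p, n = robust(pos_values), robust(neg_values)
    return (mean(p) + mean(n)) / 2
```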

12
Experiments
29 datasets from the UCI repository
10-fold cross-validation
Comparisons use a one-sided t test
Three algorithms compared: C4.5Rules, Ripper, Ripper*
Comparison criteria: error rate, ruleset complexity, learning time
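The comparison protocol can be sketched as a paired t statistic over the 10 per-fold error rates. The fold errors below are made-up placeholders, not results from the paper; with 10 folds there are 9 degrees of freedom, so a one-sided test at the 5% level rejects when t < −1.833.

```python
from statistics import mean, stdev

def paired_t_statistic(errors_a, errors_b):
    """t statistic for the one-sided hypothesis that algorithm A has lower
    error than B, from paired per-fold error rates."""
    d = [a - b for a, b in zip(errors_a, errors_b)]
    return mean(d) / (stdev(d) / len(d) ** 0.5)

# Hypothetical 10-fold error rates for two algorithms
a = [0.10, 0.12, 0.11, 0.09, 0.10, 0.13, 0.11, 0.10, 0.12, 0.11]
b = [0.14, 0.15, 0.13, 0.14, 0.16, 0.15, 0.14, 0.13, 0.15, 0.14]
t = paired_t_statistic(a, b)
```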

13
Error Rate (I)
Ripper and its variant perform better than C4.5Rules
Ripper* performs similarly to Ripper
C4.5Rules has an advantage when the number of rules is small (exhaustive search)

14
Error Rate (II)
(Cell (row, column): number of datasets on which the row algorithm is significantly better than the column algorithm)

            C4.5Rules  Ripper  Ripper*
C4.5Rules       -         4       7
Ripper         12         -       2
Ripper*        10         1       -
Total          22         5       9

15
Ruleset Complexity (I)
Ripper and Ripper* produce significantly fewer rules than C4.5Rules
C4.5Rules starts from an unpruned tree, which yields a large number of rules to start with

16
Ruleset Complexity (II)

            C4.5Rules  Ripper  Ripper*
C4.5Rules       -         1       1
Ripper
Ripper*        27         2       -
Total          27         3      11

17
Learning Time (I)
Ripper* is faster than Ripper, which is faster than C4.5Rules
C4.5Rules: O(N³)
Ripper: O(N log² N)
Ripper*: O(N log N)

18
Learning Time (II)

            C4.5Rules  Ripper  Ripper*
C4.5Rules       -         2       0
Ripper         23         -       0
Ripper*         2        11       -
Total          25        13       0

19
Conclusion
Compared two rule induction algorithms: C4.5Rules and Ripper
Proposed a shortcut for learning conditions using LDA (Ripper*)
Ripper is better than C4.5Rules
Ripper* improves Ripper's learning time without decreasing its performance
