Rule Induction with Extension Matrices Dr. Xindong Wu Journal of the American Society for Information Science VOL. 49, NO. 5, 1998 Presented by Peter Duval.


1 Rule Induction with Extension Matrices Dr. Xindong Wu Journal of the American Society for Information Science VOL. 49, NO. 5, 1998 Presented by Peter Duval [Title-slide term diagram: PE, NE, NEM, EM, EMD, path, inevitable / eliminable / redundant selectors, least-frequency selector, persistent intersecting group, MFL, MCV, HFL, HCV, and the four strategies S1 Fast, S2 Precedence, S3 Elimination, S4 Least Frequency]

2 Context HFL/HCV presents a rule-induction alternative to decision trees. HCV can be used as a benchmark for rule induction.

3 Context This paper condenses Dr. Wu’s Ph.D. dissertation on the extension matrix approach and the HFL/HCV algorithm. Look to the University of Illinois, J. Hong, and R.S. Michalski for work leading to HFL/HCV. HFL/HCV appears to be underrepresented in literature citations.

4 Overview 1. Represent the negative training data as row vectors in a matrix. 2. Process positive examples as they come to eliminate uninformative attributes in the negative examples. 3. Read conjunctive rules from the resulting matrix. 4. Simplify and clean up the rules.

5 A positive example (PE): Positive and Negative Examples A negative example (NE):

6 Negative example matrix (NEM) Gather the negative examples as row vectors in the NEM:

7 Positive Example (PE) Against NEM A positive example written as a row vector:

8 Extension Matrices Delete any matching elements in the NEM
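The dead-element substitution can be sketched in a few lines of Python. This is a toy illustration with made-up attribute values, not the HCV implementation:

```python
DEAD = "*"

def extension_matrix(nem, pe):
    """Build the extension matrix for one positive example: any NEM
    element equal to the corresponding positive-example value cannot
    distinguish the two classes, so it becomes a dead element '*'."""
    return [[DEAD if v == pe[j] else v for j, v in enumerate(row)]
            for row in nem]

# Hypothetical data: three negative examples over attributes X1..X3.
NEM = [[1, 2, 1],
       [2, 1, 2],
       [1, 1, 2]]
PE = [2, 2, 1]

print(extension_matrix(NEM, PE))
# Row 1 keeps only its X1 value; the other rows keep whatever differs from PE.
```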

9 Extension Matrices We construct one Extension Matrix per Positive Example

10 Extension Matrices Let’s make a second extension matrix:

11 Extension Matrices The second extension matrix:

12 Extension Matrices Finally let’s make a third extension matrix:

13 Extension Matrices The third extension matrix:

14 Dead Elements Dead Elements, *, take the place of attributes that fail to distinguish the negative example from the corresponding positive example.

15 Matrix Disjunction (EMD) If there exists a dead element in any position of the extension matrices, the EMD will have a dead element there, too. “OR” the dead elements
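A minimal sketch of the disjunction, assuming extension matrices are lists of rows with '*' as the dead element (hypothetical data, not the HCV implementation). Because every extension matrix is derived from the same NEM, a non-dead EMD entry simply keeps the shared negative-example value:

```python
DEAD = "*"

def matrix_disjunction(ems):
    """EMD: an entry is dead if it is dead in ANY extension matrix;
    otherwise it keeps the negative-example value, which all extension
    matrices share because they come from the same NEM."""
    n_cols = len(ems[0][0])
    emd = []
    for rows in zip(*ems):  # corresponding rows across all EMs
        emd.append([DEAD if any(r[j] == DEAD for r in rows) else rows[0][j]
                    for j in range(n_cols)])
    return emd

# Two hypothetical extension matrices over the same two negative examples.
em1 = [[1, DEAD, 3],
       [DEAD, 2, 3]]
em2 = [[1, 4, DEAD],
       [DEAD, 2, 3]]
print(matrix_disjunction([em1, em2]))
```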

16 Partitions When OR-ing in another extension matrix would create a dead row (a row whose elements are all dead), close the current EMD and start a new one. Partition 1, Partition 2, …
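One way to sketch the partitioning rule (helper names are hypothetical, and the real HCV grouping is more refined, using intersecting groups of positive examples):

```python
DEAD = "*"

def or_rows(a, b):
    """OR the dead elements of two corresponding rows."""
    return [DEAD if (x == DEAD or y == DEAD) else x for x, y in zip(a, b)]

def partition_emds(ems):
    """Fold extension matrices into the current EMD until the next OR
    would create an all-dead row; then close the partition and start a
    new EMD from the offending extension matrix."""
    partitions, current = [], [row[:] for row in ems[0]]
    for em in ems[1:]:
        merged = [or_rows(r, s) for r, s in zip(current, em)]
        if any(all(v == DEAD for v in row) for row in merged):
            partitions.append(current)   # dead row: close this partition
            current = [row[:] for row in em]
        else:
            current = merged
    partitions.append(current)
    return partitions
```

With two extension matrices whose disjunction would kill a row, this yields two single-matrix partitions; otherwise it yields one merged EMD.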

17 Matrix Disjunction (EMD) Let’s construct the EMD using just the first two Extension Matrices. “OR” the dead elements

18 Matrix Disjunction (EMD) The EMD has dramatically reduced the amount of superfluous information. “OR” the dead elements

19 Paths Choose one non-dead element from each row. This is called a path. We can create paths in EMs and EMDs.

20 Path = Cover ≡ Conjunctive Formula The path corresponds to a conjunctive formula expressed in variable-valued logic.

21 Path = Cover ≡ Conjunctive Formula
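Reading a rule off a path can be sketched like this (toy data; the selector syntax mimics the variable-valued logic on the slides, and the path is given as one chosen column index per row):

```python
DEAD = "*"

def path_to_rule(em, path):
    """A path picks one non-dead element per row; each pick (row i,
    column j) yields a selector [Xj != v] that excludes that negative
    example. Duplicate selectors collapse into one."""
    selectors = sorted({(j, em[i][j]) for i, j in enumerate(path)})
    return " ^ ".join(f"[X{j + 1} != {v}]" for j, v in selectors)

# Hypothetical extension matrix and a path choosing columns 1, 2, 2.
em = [[1, DEAD, DEAD],
      [DEAD, 1, 2],
      [1, 1, 2]]
print(path_to_rule(em, [0, 1, 1]))  # -> [X1 != 1] ^ [X2 != 1]
```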

22 HFL Wu developed HFL to find good rules. An algorithm with 4 strategies, it finds a compact disjunction of conjunctions: 1. Fast 2. Precedence 3. Elimination 4. Least Frequency

23 HFL Strategies: Fast X3≠1 covers all negative examples. X3≠1 => positive class. We can stop processing.

24 HFL Strategies: Precedence [X1≠1] and [X3≠1] are inevitable selectors. Record conjunction and label the rows as covered. Below, a path is formed. All rows are covered. We are done.

25 HFL Strategies: Elimination Redundant selectors in attribute X2 can be eliminated because non-dead X3 values cover all of the rows covered by X2. All elements in column X2 become dead elements. 

26 HFL Strategies: Least Frequency Attribute X1 selectors are least frequent and can be eliminated. Other strategies must be applied before applying Least Frequency again. 
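The four strategies can be put together in a simplified greedy sketch. The tie-breaking and the selector-level elimination test here are assumptions for illustration; Wu’s HFL applies the strategies with more care, so treat this only as a picture of how the strategies interact:

```python
DEAD = "*"

def hfl(emd):
    """Greedy sketch of the four HFL strategies on one EMD.
    Returns a list of selectors (column, value) whose conjunction
    [Xj != v] ^ ... excludes every negative-example row."""
    rows = [list(r) for r in emd]
    chosen = set()
    while rows:
        # Selector (j, v) -> indices of the rows it can cover.
        cover = {}
        for i, row in enumerate(rows):
            for j, v in enumerate(row):
                if v != DEAD:
                    cover.setdefault((j, v), set()).add(i)
        # S1 Fast: one selector covers every remaining row -> done.
        fast = next((s for s, c in cover.items() if len(c) == len(rows)), None)
        if fast is not None:
            chosen.add(fast)
            break
        # S2 Precedence: a row with a single non-dead element makes
        # that selector inevitable; record it and drop covered rows.
        inevitable = {next((j, v) for j, v in enumerate(row) if v != DEAD)
                      for row in rows if sum(v != DEAD for v in row) == 1}
        if inevitable:
            chosen |= inevitable
            covered = set().union(*(cover[s] for s in inevitable))
            rows = [r for i, r in enumerate(rows) if i not in covered]
            continue
        # S3 Elimination: a selector whose rows are a subset of another
        # selector's rows is redundant; dead-out its elements.
        redundant = next((s for s in cover for t in cover
                          if s != t and cover[s] <= cover[t]), None)
        if redundant is not None:
            j, v = redundant
            for row in rows:
                if row[j] == v:
                    row[j] = DEAD
            continue
        # S4 Least Frequency: dead-out the live column covering the
        # fewest rows, then loop back to the other strategies.
        live = [j for j in range(len(rows[0]))
                if any(row[j] != DEAD for row in rows)]
        least = min(live, key=lambda j: sum(row[j] != DEAD for row in rows))
        for row in rows:
            row[least] = DEAD
    return sorted(chosen)
```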

27 HCV Algorithm HCV improves HFL: 1. Partition the positive examples into intersecting groups. 2. Apply HFL on each partition. 3. OR the conjunctive formulae from each partition. Well described in: http://www.cs.uvm.edu/~xwu/Publication/JASIS.ps See Wu’s 1993 Ph.D. dissertation for more background: http://www.era.lib.ed.ac.uk/bitstream/1842/581/3/1993-xindongw.pdf

28 HCV Software Features many refinements and switches. Works with C4.5 data. Can be run through a web interface: HCV Online Interface. Is described in Appendix A of Wu’s textbook, and online: HCV Manual.

29 Golf
Rules for the 'Play' class (Covering 3 examples):
The 1st conjunctive rule:
[ temperature != { cool } ] ^ [ outlook != { sunny } ] --> the 'Play' class (Positive examples covered: 3)
Rules for the 'Don't_Play' class (Covering 4 examples):
The 2nd conjunctive rule:
[ outlook != { overcast } ] ^ [ wind = { windy } ] --> the 'Don't_Play' class (Positive examples covered: 4)
The total number of conjunctive rules is: 2
The default class is: 'Don't_Play' (Examples in class: 4)
Time taken for induction (seconds): 0.0 (real), 0.0 (user), 0.0 (system)
Rule file or preprocessed test file not found. Skipping deduction

30 HCV HCV is competitive with other decision tree and rule producing algorithms. HCV generally produces more compact rules. HCV outputs variable-valued logic. HCV handles noise and discretization. HCV guarantees a “conjunctive rule for a concept”.

31 Ideas Can HFL/HCV be applied to chess? Bratko did this with ID3. [Crevier 1993, 177] How can HCV be parallelized? How does the extension matrix approach work in closed-world situations? Is HCV 2.0 a good candidate for automated parameter tuning by genetic algorithm or other evolutionary technique?

32 The End. Presentation based on slides by Leslie Damon. Questions?

33 Exam Questions Definitions: Extension Matrix: a matrix of negative examples as row vectors, where, for a given positive example, elements that match the positive example are replaced with dead elements, denoted as ‘*’. Dead Element: an element of a negative example which cannot be used to distinguish a given positive example from the negative example. Path: a set of non-dead elements, one each from all of the rows of an extension matrix.

34 Exam Questions Four stages of HFL: 1. Fast: A single attribute value that covers all rows. 2. Precedence: Favor attributes that are the only non-dead element of a row. 3. Elimination: Get rid of redundant selectors. 4. Least Frequency: Get rid of the column whose non-dead values cover the fewest rows. See slides labeled “HFL Strategies”

35 Exam Questions The Pneumonia/Tuberculosis problem is worked through in the paper and Leslie Damon’s slides. Here is the EMD:

36 Attribute-based induction algorithms Concentrate on heuristic symbolic computations. Don’t require built-in knowledge. ID3-like algorithms are the most famous. Run in low-order polynomial time and space.

