
1 Fuzzy Decision Trees Professor J. F. Baldwin

2 Classification and Prediction. For classification the universe of the target attribute is a discrete set; for prediction it is continuous. For prediction, use a fuzzy partition f1, ..., f5 of the target universe T over points a, b, c, d, e, arranging the fuzzy sets so that an equal number of training data points falls in each of the intervals [a, b], [b, c], [c, d], [d, e].
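
A minimal sketch of this equal-frequency arrangement. The slide only fixes the interval points a..e; the triangular shape and the names below are assumptions:

```python
import numpy as np

def equal_frequency_points(values, n_sets=5):
    """Partition points a, b, c, d, e chosen as quantiles of the
    training data, so each interval [a,b], [b,c], [c,d], [d,e]
    holds an equal number of data points."""
    return np.quantile(np.asarray(values), np.linspace(0.0, 1.0, n_sets))

def triangular_partition(points):
    """Fuzzy sets f1..fn as (left, peak, right) triangles peaking at
    the partition points; neighbouring memberships sum to 1."""
    pts = list(points)
    return [(pts[max(i - 1, 0)], p, pts[min(i + 1, len(pts) - 1)])
            for i, p in enumerate(pts)]

def membership(x, fuzzy_set):
    """Piecewise-linear membership chi_f(x) of a triangular fuzzy set."""
    left, peak, right = fuzzy_set
    if x == peak:
        return 1.0
    if left < x < peak:
        return (x - left) / (peak - left)
    if peak < x < right:
        return (right - x) / (right - peak)
    return 0.0
```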

3 Target Translation for Prediction. The training set Tr has attribute columns A1, A2, ..., An, a target column T and a probability column Pr; a typical row is (a11, a12, ..., a1n, t1, p1). Each such row is translated into one row per target fuzzy set with non-zero membership at t1:

(a11, a12, ..., a1n, fi, p1 · χ_fi(t1))
(a11, a12, ..., a1n, fi+1, p1 · χ_fi+1(t1))

Repeat for each row, collecting equivalent rows and adding their probabilities. The translated training set Tr' is now a classification set.
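
A sketch of this translation step, assuming rows are (attribute-tuple, t, p) triples and reusing membership and the target partition from the sketch above:

```python
from collections import defaultdict

def translate_training_set(rows, target_sets):
    """Tr -> Tr': each row (attrs, t1, p1) becomes one row
    (attrs, f_i, p1 * chi_fi(t1)) per target fuzzy set with non-zero
    membership at t1 (at most two for a triangular partition);
    equivalent rows are collected and their probabilities added."""
    merged = defaultdict(float)
    for attrs, t, p in rows:           # attrs must be hashable (a tuple)
        for i, f in enumerate(target_sets):
            chi = membership(t, f)
            if chi > 0.0:
                merged[(attrs, i)] += p * chi
    # rows of Tr': (attributes, index of target fuzzy set, probability)
    return [(attrs, i, pr) for (attrs, i), pr in merged.items()]
```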

4 Preparing a One-Attribute Reduced Database for a Continuous Attribute. Take the columns (Ai, T, Pr) from Tr' for prediction, or from Tr for classification. Choose a number of fuzzy sets (here 5) and partition the continuous attribute Ai with fuzzy sets g1, ..., g5 over points a, b, c, d, e, again arranged so that an equal number of data points falls in each interval. Replacing each value of Ai by the fuzzy sets gi it belongs to gives the reduced database with columns (gi, T, Pr).
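
The corresponding fuzzification of a single attribute value, also used below when evaluating new cases (a sketch; with the triangular partition above, at most two memberships are non-zero and they sum to 1):

```python
def fuzzify(x, attribute_sets):
    """Probability distribution of a crisp value x over the fuzzy
    sets g1..gn of an attribute's partition."""
    dist = {}
    for i, g in enumerate(attribute_sets):
        m = membership(x, g)
        if m > 0.0:
            dist[i] = m
    return dist
```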

5 Fuzzy ID3. Using the training set Tr' and a one-attribute reduced database for each continuous attribute, we can use the method of ID3 previously given to determine the decision tree for predicting or classifying the target, with post pruning as before. We modify the stopping condition: do not expand node N if the entropy for that node is < some value v, where node N carries a probability distribution over the target values. You can also limit the depth of the tree to some value, for example expanding the tree to depth 4.
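
A sketch of the modified stopping test. The garbled slide does not name the quantity compared with v; entropy of the node's target distribution is the standard ID3 measure and is assumed here:

```python
import math

def entropy(distribution):
    """Shannon entropy of a node's probability distribution
    {target: probability}."""
    return -sum(p * math.log2(p) for p in distribution.values() if p > 0.0)

def should_stop(node_distribution, depth, v=0.3, max_depth=4):
    """Do not expand node N if its entropy is < v (v = 0.3 is an
    arbitrary illustrative value), or, as on the slide, once the
    tree has been expanded to the depth limit (e.g. 4)."""
    return entropy(node_distribution) < v or depth >= max_depth
```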

6 Evaluating a New Case for Classification. The value of a continuous attribute will have a probability distribution over its fuzzy sets {g1, ..., gn}, with only 2 non-zero probabilities. A new case will therefore propagate through many branches of the tree, arriving at node Nj with probability πj, determined by multiplying the probabilities of all branches on the path to Nj. Let the distributions for the leaf nodes be Nj : {ti : βij}. The overall distribution is {ti : Σj πj βij}. Decision: choose the tk for which Σj πj βkj is maximal.
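
A sketch of the propagation and decision. The Node layout is an assumption (internal nodes test one attribute and hold one child per fuzzy set of its partition; leaves hold {ti : βij}); fuzzify is from the earlier sketch:

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Node:
    distribution: dict = None      # leaves: {t_i: beta_ij}
    attribute: int = None          # internal: index of tested attribute
    fuzzy_sets: list = None        # internal: that attribute's partition
    children: dict = field(default_factory=dict)  # fuzzy-set index -> child

    @property
    def is_leaf(self):
        return self.distribution is not None

def propagate(node, case, pi=1.0):
    """Yield (leaf N_j, pi_j): the case reaches N_j with probability
    pi_j, the product of the branch probabilities along the path."""
    if node.is_leaf:
        yield node, pi
        return
    for i, prob in fuzzify(case[node.attribute], node.fuzzy_sets).items():
        yield from propagate(node.children[i], case, pi * prob)

def classify(root, case):
    """Overall distribution {t_i : sum_j pi_j * beta_ij}; choose the
    t_k with the largest overall probability."""
    overall = defaultdict(float)
    for leaf, pi in propagate(root, case):
        for t, beta in leaf.distribution.items():
            overall[t] += pi * beta
    return max(overall, key=overall.get)
```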

7 Evaluating a New Case for Prediction. As for classification, the value of a continuous attribute will have a probability distribution over {g1, ..., gn} with only 2 non-zero probabilities, and a new case will propagate through many branches of the tree, arriving at node Nj with probability πj determined by multiplying the probabilities of all branches on the path to Nj. Let the distributions for the leaf nodes be Nj : {fi : βij}. The overall distribution is {fi : Σj πj βij}.
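
Prediction aggregates in the same way, but over the target fuzzy sets fi. The slide stops at the overall distribution, so the defuzzification below (the expectation of the peaks of the triangular target sets) is an assumption; propagate and Node are reused from the classification sketch:

```python
def predict(root, case, target_sets):
    """Overall distribution {f_i : sum_j pi_j * beta_ij}, then an
    assumed defuzzification: the expectation of the target sets'
    peak points, with target_sets[i] = (left, peak, right)."""
    overall = defaultdict(float)              # from collections, as above
    for leaf, pi in propagate(root, case):
        for i, beta in leaf.distribution.items():
            overall[i] += pi * beta
    return sum(p * target_sets[i][1] for i, p in overall.items())
```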

8 Fuzzy Sets Are Important for Data Mining. Example: predicting profit from income and outgoings, each measured on the universe [0, 1]. Partitioning each universe with the two fuzzy sets {small, large} gives 94.14% correct. Two crisp sets on each universe can give at most 50% accuracy; we would require 16 crisp sets on each universe to give the same accuracy as a two-fuzzy-set partition.

9 Ellipse Example. Points inside an ellipse on [-1.5, 1.5]² are legal and points outside are illegal. The X and Y universes are each partitioned into 5 fuzzy sets:

about_-1.5 = [-1.5:1, -0.75:0]
about_-0.75 = [-1.5:0, -0.75:1, 0:0]
about_0 = [-0.75:0, 0:1, 0.75:0]
about_0.75 = [0:0, 0.75:1, 1.5:0]
about_1.5 = [0.75:0, 1.5:1]

The tree was learnt on 126 random points from [-1.5, 1.5]².

10 Tree for Ellipse example

11 General Fril Rule

((Classification = legal) if (
((X is about_-1.5))
((X is about_-0.75) & (Y is about_-1.5))
((X is about_-0.75) & (Y is about_-0.75))
((X is about_-0.75) & (Y is about_0))
((X is about_-0.75) & (Y is about_0.75))
((X is about_-0.75) & (Y is about_1.5))
((X is about_0) & (Y is about_-1.5))
((X is about_0) & (Y is about_-0.75))
((X is about_0) & (Y is about_0))
((X is about_0) & (Y is about_0.75))
((X is about_0) & (Y is about_1.5))
((X is about_0.75) & (Y is about_-1.5))
((X is about_0.75) & (Y is about_-0.75))
((X is about_0.75) & (Y is about_0))
((X is about_0.75) & (Y is about_0.75))
((X is about_0.75) & (Y is about_1.5))
((X is about_1.5)))) :
((0 0) (0.0092 0.0092) (0.3506 0.3506) (0.5090 0.5090) (0.3455 0.3455) (0.0131 0.0131)
(0.1352 0.1352) (0.8131 0.8131) (1 1) (0.8178 0.8178) (0.1327 0.1327) (0.0109 0.0109)
(0.3629 0.3629) (0.5090 0.5090) (0.3455 0.3455) (0.0131 0.0131) (0 0))

12 Results. The above tree was tested on 960 points forming a regular grid on [-1.5, 1.5]², giving 99.168% correct classification. (The accompanying figure shows the control surface for the positive quadrant.)

13 Iris Classification Data. 3 classes (Iris-Setosa, Iris-Versicolor and Iris-Virginica), 50 instances of each class. Attributes:

1. sepal length in cm, universe [4.3, 7.9]
2. sepal width in cm, universe [2, 4.4]
3. petal length in cm, universe [1, 6.9]
4. petal width in cm, universe [0.1, 2.5]

A fuzzy partition of 5 fuzzy sets is used on each universe.

14 Iris Decision Tree. Gives 98.667% accuracy on test data.

15 Diabetes in Pima Indians. Data: 768 females over 21 years, from the study of diabetes mellitus in the Pima Indian population living near Phoenix, Arizona; 384 training cases, 384 test cases; 2 classes (diabetic, not diabetic). Attributes:

1. Number of times pregnant
2. Plasma glucose concentration
3. Diastolic blood pressure
4. Triceps skin fold thickness
5. 2-hour serum insulin
6. Body mass index
7. Diabetes pedigree function
8. Age

5 fuzzy sets were used for each attribute. The decision tree was generated to a maximum depth of 4, giving a tree of 161 branches with an accuracy of 81.25% on the training set and 79.9% on the test set. With the forward pruning algorithm the tree complexity is halved to 80 branches; this reduced tree gives an accuracy of 80.46% on the training set and 78.38% on the test set. Post pruning reduces the complexity to 28 branches, giving 78.125% on the training set and 78.9% on the test set.

16 Diabetes Tree

17 SIN XY Prediction Example. The database consists of 528 triples (X, Y, sin XY), where the pairs (X, Y) form a regular grid on [0, 3]². The X and Y universes are partitioned with 10 fuzzy sets:

about_0 = [0:1, 0.333333:0]
about_0.333333 = [0:0, 0.333333:1, 0.666667:0]
about_0.666667 = [0.333333:0, 0.666667:1, 1:0]
about_1 = [0.666667:0, 1:1, 1.33333:0]
about_1.33333 = [1:0, 1.33333:1, 1.66667:0]
about_1.66667 = [1.33333:0, 1.66667:1, 2:0]
about_2 = [1.66667:0, 2:1, 2.33333:0]
about_2.33333 = [2:0, 2.33333:1, 2.66667:0]
about_2.66667 = [2.33333:0, 2.66667:1, 3:0]
about_3 = [2.66667:0, 3:1]

The target sin XY is partitioned with 5 fuzzy sets:

class_1 = [-1:1, 0:0]
class_2 = [-1:0, 0:1, 0.380647:0]
class_3 = [0:0, 0.380647:1, 0.822602:0]
class_4 = [0.380647:0, 0.822602:1, 1:0]
class_5 = [0.822602:0, 1:1]
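
A sketch of how such a database might be generated. The slide gives 528 triples but not the grid shape, so the 22 × 24 arrangement below (22 · 24 = 528) is an assumption:

```python
import numpy as np

def sin_xy_database(n_x, n_y):
    """Triples (X, Y, sin XY) with (X, Y) on a regular grid over [0, 3]^2."""
    xs = np.linspace(0.0, 3.0, n_x)
    ys = np.linspace(0.0, 3.0, n_y)
    return [(x, y, float(np.sin(x * y))) for x in xs for y in ys]

triples = sin_xy_database(22, 24)   # 528 triples, as on the slide
```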

18 Fuzzy ID3 decision tree with 100 branches. Percentage error of 4.22% on a regular test set of 1023 points.

