Decision Trees


Decision Trees

ID     Hair    Height   Weight   Lotion  Result
Sarah  Blonde  Average  Light    No      Sunburn
Dana   Blonde  Tall     Average  Yes     None
Alex   Brown   Tall     Average  Yes     None
Annie  Blonde  Short    Average  No      Sunburn
Emily  Red     Average  Heavy    No      Sunburn
Pete   Brown   Tall     Heavy    No      None
John   Brown   Average  Heavy    No      None
Katie  Blonde  Short    Light    Yes     None

Example

Example 2

Examples: which one is better?

Good when
- Samples are described as attribute-value pairs
- The target function has discrete output values
- Disjunctive hypotheses may be required
- The training data may be missing values or noisy

Construction
Top-down construction:
1. Which attribute should be tested to form the root of the tree?
2. Create a branch for each value of that attribute and sort the samples into the branches.
3. At each branch node, repeat from step 1 on the samples that reach it.
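A minimal sketch of this loop (not from the slides), assuming each sample is a dict of attribute values plus a "Result" label; choose_attribute is the scoring function developed on the following slides:

    from collections import Counter

    def build_tree(samples, attributes, choose_attribute):
        """Top-down construction: pick a test, branch on its values, recurse.
        Assumes `samples` is non-empty at the initial call."""
        labels = [s["Result"] for s in samples]
        if len(set(labels)) == 1 or not attributes:
            # Leaf: branch is homogeneous, or nothing left to test -> majority class.
            return Counter(labels).most_common(1)[0][0]
        best = choose_attribute(samples, attributes)          # step 1
        branches = {}
        for value in {s[best] for s in samples}:              # step 2
            subset = [s for s in samples if s[best] == value]
            remaining = [a for a in attributes if a != best]
            branches[value] = build_tree(subset, remaining, choose_attribute)  # step 3
        return {best: branches}

Trees come out as nested dicts, e.g. {"Hair": {"Red": "Sunburn", ...}}, with leaves holding class labels.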

So how do we choose an attribute?
Prefer smaller trees: Occam's razor for decision trees. The world is inherently simple, so the smallest decision tree that is consistent with the samples is the one most likely to classify unknown objects correctly.

How can you construct the smallest tree?
Maximize homogeneity in each branch.

After choosing hair color
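(Reconstructing the slide's figure from the table above, splitting on hair color partitions the samples as:)
Blonde: Sarah (Sunburn), Dana (None), Annie (Sunburn), Katie (None)  - still mixed
Red: Emily (Sunburn)  - homogeneous
Brown: Alex (None), Pete (None), John (None)  - homogeneous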

Formally
Maximizing homogeneity = minimizing disorder. A formula for disorder can be taken from information theory: entropy.

Entropy
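The formula itself is missing from the transcript; the standard definition (as in Mitchell, Machine Learning, ch. 3) for a set S whose classes occur with proportions p_1, ..., p_k is

    Entropy(S) = -sum_i p_i * log2(p_i), with 0 * log2(0) taken as 0.

A direct Python translation, usable inside choose_attribute in the sketch above:

    import math
    from collections import Counter

    def entropy(labels):
        """Entropy(S) = -sum_i p_i * log2(p_i) over the class proportions in labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())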

Entropy intuition
An attribute can have two values. If equal numbers of both values are present, entropy is at its maximum:
Entropy = -(1/2) log2(1/2) - (1/2) log2(1/2) = 1 bit

Entropy intuition (2)
An attribute can have two values. If ONLY one value is present, entropy is at its minimum:
Entropy = -1 * log2(1) = 0 (the absent value contributes 0 * log2(0) = 0)

Entropy intuition (3)

Entropy intuition (4)

Decision Trees (sunburn dataset repeated; see the table above)

Worked example: hair color
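The slide's computation is not in the transcript; reconstructing it from the table above (3 Sunburn, 5 None):

Entropy(S) = -(3/8) log2(3/8) - (5/8) log2(5/8) ≈ 0.954
Hair color splits S into:
  Blonde (4 samples): 2 Sunburn, 2 None -> entropy 1.0
  Red    (1 sample):  1 Sunburn         -> entropy 0
  Brown  (3 samples): 3 None            -> entropy 0
Expected entropy after the split = (4/8)(1.0) + (1/8)(0) + (3/8)(0) = 0.5
Gain(S, Hair) = 0.954 - 0.5 ≈ 0.454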

Other tests
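The same computation for the remaining attributes (reconstructed from the table; values rounded):

Height: Average {2 Sunburn, 1 None}, Tall {3 None}, Short {1 Sunburn, 1 None}
  expected entropy ≈ (3/8)(0.918) + (3/8)(0) + (2/8)(1.0) ≈ 0.594 -> Gain ≈ 0.360
Weight: Light {1 Sunburn, 1 None}, Average {1 Sunburn, 2 None}, Heavy {1 Sunburn, 2 None}
  expected entropy ≈ (2/8)(1.0) + (3/8)(0.918) + (3/8)(0.918) ≈ 0.938 -> Gain ≈ 0.016
Lotion: No {3 Sunburn, 2 None}, Yes {3 None}
  expected entropy ≈ (5/8)(0.971) + (3/8)(0) ≈ 0.607 -> Gain ≈ 0.347

Hair color has the highest gain, so it becomes the root test.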

Issues in DT learning
Overfitting the data: a learned tree t with training error e overfits if there is an alternative tree t' with training error e' > e (so t' fits the training data less well), yet t' has smaller error than t over the entire distribution of samples.

Overfitting

Dealing with overfitting
Two strategies:
- Stop growing the tree early, or
- Post-prune the tree after it overfits.
How do we determine the correct final tree size?
- Use a validation set (e.g. hold out one third of the data).
- Use a statistical (chi-square) test to decide whether growing the tree further is warranted.
- Minimize MDL, a measure of complexity: size(tree) + size(misclassifications(tree)).

Reduced-error pruning
- Remove the subtree rooted at a node and replace it with a leaf.
- Assign the leaf the most common class among the training samples at that node.
- Accept the removal only if the pruned tree's error on the validation set is <= the original tree's error.
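A sketch of this procedure (not from the slides) on the nested-dict trees built by the earlier skeleton. It prunes bottom-up and passes down only the samples that reach each node, so the local error comparison is equivalent to comparing full-tree validation error:

    from collections import Counter

    def classify(tree, sample):
        """Follow the tree's tests; returns None for a value unseen during training."""
        while isinstance(tree, dict):
            attr = next(iter(tree))
            tree = tree[attr].get(sample[attr])
            if tree is None:
                return None
        return tree

    def reduced_error_prune(tree, train, val):
        """train/val: the samples that reach this node. Assumes the tree was
        built from `train`, so every branch's training subset is non-empty."""
        if not isinstance(tree, dict):
            return tree  # already a leaf
        attr = next(iter(tree))
        for value, child in tree[attr].items():   # prune children first (bottom-up)
            tree[attr][value] = reduced_error_prune(
                child,
                [s for s in train if s[attr] == value],
                [s for s in val if s[attr] == value])
        # Candidate: replace this whole subtree with its majority-class leaf.
        leaf = Counter(s["Result"] for s in train).most_common(1)[0][0]
        subtree_err = sum(classify(tree, s) != s["Result"] for s in val)
        leaf_err = sum(leaf != s["Result"] for s in val)
        return leaf if leaf_err <= subtree_err else tree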

Effect of Reduced-Error Pruning

Rule post-pruning
- Convert the tree to an equivalent set of rules, one per root-to-leaf path.
- Prune each rule independently of the others: remove a precondition and test whether the rule's estimated accuracy improves.
- Sort the final rules by estimated accuracy and consider them in that sequence when classifying.
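For example, the tree for the sunburn data (hair color at the root, with the mixed Blonde branch resolved by Lotion used) converts to rules such as:

IF Hair = Blonde AND Lotion = No  THEN Sunburn
IF Hair = Blonde AND Lotion = Yes THEN None
IF Hair = Red  THEN Sunburn
IF Hair = Brown THEN None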

Why rules, then pruning?
- Each root-to-leaf path through a node produces a different rule, so a test that appears in many rules can be pruned from some and kept in others, instead of the all-or-nothing removal of a node (and its subtree).
- In rule form, tests near the root carry no more weight than tests near the leaves.
- Rules are often easier to read and understand.

Continuous-valued attributes

Continuous to discrete
We want a threshold (a derived binary attribute) that produces the greatest information gain:
1. Sort the samples by the attribute's value.
2. Identify adjacent samples that differ in class.
3. Candidate thresholds lie midway between the attribute values of those samples.
4. Evaluate each candidate threshold's information gain and choose the one that maximizes gain (or, equivalently, minimizes expected entropy).
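A sketch of this procedure (not from the slides), reusing the entropy helper defined after the Entropy slide; it assumes the attribute is numeric and that at least one pair of adjacent samples differs in class:

    def candidate_thresholds(samples, attr):
        """Steps 1-3: sort by attr; midpoints between class-changing neighbours."""
        ordered = sorted(samples, key=lambda s: s[attr])
        return [(a[attr] + b[attr]) / 2
                for a, b in zip(ordered, ordered[1:])
                if a["Result"] != b["Result"] and a[attr] != b[attr]]

    def best_threshold(samples, attr):
        """Step 4: the candidate with the lowest expected entropy (highest gain)."""
        n = len(samples)
        def expected_entropy(t):
            below = [s["Result"] for s in samples if s[attr] <= t]
            above = [s["Result"] for s in samples if s[attr] > t]
            return len(below) / n * entropy(below) + len(above) / n * entropy(above)
        return min(candidate_thresholds(samples, attr), key=expected_entropy)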

Many-valued attributes are favored
- ID3's information gain prefers attributes with many values.
- Consider Name: it classifies the training set perfectly (one sample per branch) but cannot generalize.
- Fix: also include how well (broadly and uniformly) an attribute splits the data.
- Name's branches hold one sample each, not broad at all; Lotion used is much better.
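The standard correction (the gain ratio of C4.5, also in Mitchell ch. 3) divides the gain by the split information, which measures how broadly and uniformly attribute A partitions S into subsets S_1, ..., S_c:

SplitInformation(S, A) = -sum_{i=1..c} (|S_i|/|S|) log2(|S_i|/|S|)
GainRatio(S, A) = Gain(S, A) / SplitInformation(S, A)

For Name on the 8-sample table, SplitInformation = log2(8) = 3, so its perfect gain is heavily discounted; Lotion used splits 5/3 and has SplitInformation ≈ 0.954.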

Attributes with Costs
We want lower-cost attributes to be tested earlier (nearer the root).
Multiply by cost? That would favor expensive attributes; instead, weigh the gain against the cost.
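Two cost-sensitive selection measures from the literature (as cited in Mitchell, Machine Learning, ch. 3; given here as examples, since the slide's own content is missing):

Tan and Schlimmer: Gain(S, A)^2 / Cost(A)
Nunez: (2^Gain(S, A) - 1) / (Cost(A) + 1)^w, where w in [0, 1] controls the weight given to cost.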