Case-Based Reasoning Lecture 4: CBR Tutorial on Decision Trees


Exercise 1 – Predicting the Risk for Loan Applications

     Income        Credit Rating   Debt   Collateral   Risk
 1   $0 to $15k    bad             high   none         high
 2   $15 to $35k   unknown         high   none         high
 3   $15 to $35k   unknown         low    none         moderate
 4   $0 to $15k    unknown         low    none         high
 5   over $35k     unknown         low    none         low
 6   over $35k     unknown         low    adequate     low
 7   $0 to $15k    bad             low    none         high
 8   over $35k     bad             low    adequate     moderate
 9   over $35k     good            low    none         low
10   over $35k     good            high   adequate     low
11   $0 to $15k    good            high   none         high
12   $15 to $35k   good            high   none         moderate
13   over $35k     good            high   none         low
14   $15 to $35k   bad             high   none         high

Create the decision tree index.

Exercise 1 – Solution (first part)

Of the 14 cases, 6 are high risk, 3 are moderate risk, and 5 are low risk:

Entropy(S) = -6/14*log2(6/14) - 3/14*log2(3/14) - 5/14*log2(5/14) = 1.531

If Income is the root of the decision tree, then:

Entropy($0 to $15k)  = -4/4*log2(4/4) = 0
Entropy($15 to $35k) = -2/4*log2(2/4) - 2/4*log2(2/4) = 1
Entropy(over $35k)   = -5/6*log2(5/6) - 1/6*log2(1/6) = 0.65

Expectation(Income) = 4/14*Entropy($0 to $15k) + 4/14*Entropy($15 to $35k) + 6/14*Entropy(over $35k) = 0.56

Gain(Income) = Entropy(S) - Expectation(Income) = 1.531 - 0.56 = 0.971
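The same computation is easy to reproduce programmatically. Below is a minimal Python sketch (not part of the lecture materials; the helper names entropy and gain are my own) that computes Entropy(S) and Gain(Income) for the Exercise 1 data. Note that the slide rounds Expectation(Income) to 0.56 before subtracting, giving 0.971; without intermediate rounding the gain comes out as roughly 0.966.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def gain(values, labels):
    """Information gain of splitting the cases on one attribute.

    `values` holds the attribute's value for each case, parallel to `labels`.
    """
    total = len(labels)
    expectation = 0.0
    for v in set(values):
        subset = [lab for val, lab in zip(values, labels) if val == v]
        expectation += (len(subset) / total) * entropy(subset)
    return entropy(labels) - expectation

# Exercise 1: the Income attribute and the Risk class for the 14 applications.
income = ["$0 to $15k", "$15 to $35k", "$15 to $35k", "$0 to $15k",
          "over $35k", "over $35k", "$0 to $15k", "over $35k", "over $35k",
          "over $35k", "$0 to $15k", "$15 to $35k", "over $35k", "$15 to $35k"]
risk = ["high", "high", "moderate", "high", "low", "low", "high",
        "moderate", "low", "low", "high", "moderate", "low", "high"]

print(round(entropy(risk), 3))       # 1.531 -> Entropy(S)
print(round(gain(income, risk), 3))  # 0.966 (the slide rounds to 0.56 first, giving 0.971)
```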

Exercise 2 – Predicting the Desirability of a Property

    Price     Location      State   Desirability
1   Average   Central       OK      Yes
2   High      Countryside   OK      No
3   Low       Central       Good    Yes
4   High      Central       Good    Yes
5   Average   Countryside   OK      Yes
6   Average   Central       Bad     No
7   Low       Countryside   Bad     Yes

Create the decision tree index.
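As a check on your answer, the gain of each candidate root can be computed with the entropy() and gain() helpers from the Exercise 1 sketch above (again an illustration, not the lecture's code). On this data Price and State tie for the highest gain at roughly 0.18 bits each, while Location contributes almost nothing.

```python
# Assumes entropy() and gain() from the Exercise 1 sketch are already defined.
price = ["Average", "High", "Low", "High", "Average", "Average", "Low"]
location = ["Central", "Countryside", "Central", "Central",
            "Countryside", "Central", "Countryside"]
state = ["OK", "OK", "Good", "Good", "OK", "Bad", "Bad"]
desirability = ["Yes", "No", "Yes", "Yes", "Yes", "No", "Yes"]

for name, values in [("Price", price), ("Location", location), ("State", state)]:
    print(name, round(gain(values, desirability), 3))
# Price ~0.184, Location ~0.006, State ~0.184: either Price or State can be the root.
```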

Exercise 3 – Predicting whether to Play Tennis or Not

Outlook   Temperature   Humidity   Windy   Play
Sunny     Hot           High       False   No
Sunny     Hot           High       True    No
Cloudy    Hot           High       False   Yes
Rainy     Mild          High       False   Yes
Rainy     Cool          Normal     False   Yes
Rainy     Cool          Normal     True    No
Cloudy    Cool          Normal     True    Yes
Sunny     Mild          High       False   No
Sunny     Cool          Normal     False   Yes
Rainy     Mild          Normal     False   Yes
Sunny     Mild          Normal     True    Yes
Cloudy    Mild          High       True    Yes
Cloudy    Hot           Normal     False   Yes
Rainy     Mild          High       True    No

Create the decision tree index.
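To see the whole procedure end to end, here is a minimal, self-contained ID3-style sketch in Python (an illustration under my own naming, not code from the lecture) that builds the full tree for Exercise 3 by recursively splitting on the attribute with the highest information gain.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def gain(values, labels):
    """Information gain of splitting on one attribute (values parallel to labels)."""
    total = len(labels)
    expectation = 0.0
    for v in set(values):
        subset = [lab for val, lab in zip(values, labels) if val == v]
        expectation += (len(subset) / total) * entropy(subset)
    return entropy(labels) - expectation

def id3(rows, labels, attributes):
    """Recursively build a decision tree as nested dicts; leaves are class labels."""
    if len(set(labels)) == 1:          # pure node: every case has the same class
        return labels[0]
    if not attributes:                 # no attributes left: fall back to majority class
        return Counter(labels).most_common(1)[0][0]
    best = max(attributes, key=lambda a: gain([r[a] for r in rows], labels))
    rest = [a for a in attributes if a != best]
    tree = {}
    for v in set(r[best] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[best] == v]
        tree[v] = id3([rows[i] for i in idx], [labels[i] for i in idx], rest)
    return {best: tree}

data = [  # Outlook, Temperature, Humidity, Windy, Play
    ("Sunny", "Hot", "High", False, "No"),     ("Sunny", "Hot", "High", True, "No"),
    ("Cloudy", "Hot", "High", False, "Yes"),   ("Rainy", "Mild", "High", False, "Yes"),
    ("Rainy", "Cool", "Normal", False, "Yes"), ("Rainy", "Cool", "Normal", True, "No"),
    ("Cloudy", "Cool", "Normal", True, "Yes"), ("Sunny", "Mild", "High", False, "No"),
    ("Sunny", "Cool", "Normal", False, "Yes"), ("Rainy", "Mild", "Normal", False, "Yes"),
    ("Sunny", "Mild", "Normal", True, "Yes"),  ("Cloudy", "Mild", "High", True, "Yes"),
    ("Cloudy", "Hot", "Normal", False, "Yes"), ("Rainy", "Mild", "High", True, "No"),
]
names = ["Outlook", "Temperature", "Humidity", "Windy"]
rows = [dict(zip(names, rec[:4])) for rec in data]
labels = [rec[4] for rec in data]
print(id3(rows, labels, names))
```

Running the sketch puts Outlook at the root, with Humidity deciding the Sunny branch, Windy deciding the Rainy branch, and Cloudy as a pure Yes leaf, which is the classic result for this data set.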