Friday’s Deliverable
As a GROUP, you need to bring 2N+1 copies of your “initial submission”
–This paper should be a complete version of your paper – something you would be willing to turn in for your final grade, even in the worst case
Also bring another timecard
–PLEASE read the instructions

When do Schafer and East play golf?

Learning Definition: A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience.

Machine Learning Models
–Classification
–Regression
–Clustering
–Time series analysis
–Association analysis
–Sequence discovery
–…

Classification example
[Figure: scatter plot of weight vs. height – x = weight-lifters, o = ballet dancers]
Features: height, weight

Classification example – Simple Model
[Figure: the same scatter plot with a simple decision boundary separating the two classes]
Features: height, weight

Classification example – Complex Model
[Figure: the same scatter plot with a complex decision boundary that fits individual points]
Features: height, weight
Note: A simple decision boundary is better than a complex one – it GENERALIZES better.
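The preference for a simple boundary can be made concrete with a sketch. The simplest possible boundary for these two classes is a single threshold on one feature – a "decision stump". All numbers below are invented for illustration; they are not from the slides.

```python
# Hypothetical data, invented for illustration only.
weights = [55, 58, 60, 62, 65, 90, 95, 100, 105, 110]         # kg
labels  = ["o", "o", "o", "o", "o", "x", "x", "x", "x", "x"]  # o = dancer, x = lifter

def best_threshold(xs, ys):
    """Try each midpoint between consecutive sorted values; keep the split
    with the fewest misclassifications ("x" predicted when x > threshold)."""
    pairs = sorted(zip(xs, ys))
    best_t, best_errors = None, len(ys) + 1
    for (a, _), (b, _) in zip(pairs, pairs[1:]):
        t = (a + b) / 2
        errors = sum((y == "x") != (x > t) for x, y in pairs)
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t, best_errors

t, errs = best_threshold(weights, labels)

def classify(weight):
    return "x" if weight > t else "o"
```

On this (cleanly separable) data the stump already classifies the training set perfectly; a wigglier boundary could only memorize noise.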

Classification example
[Figure: learning pipeline – a train set feeds the learning system, which outputs a model; the model is evaluated on a test set and then applied to new data, e.g. a Loan Yes/No decision]

Learning Paradigms
–Supervised learning (with a teacher): inputs and correct outputs are provided by the teacher
–Reinforcement learning (with reward or punishment): an action is evaluated
–Unsupervised learning (with no teacher): no hint about the correct output is given

Supervised Learning Activity

Supervised Learning This IS a “Frinkle”

Supervised Learning This IS a “Frinkle”

Supervised Learning This IS NOT a “Frinkle”

Supervised Learning This IS NOT a “Frinkle”

Supervised Learning Is this a “Frinkle”??

Supervised Learning

Machine Learning Methods
–Artificial Neural Networks
–Decision Trees
–Instance-Based Methods (CBR, k-NN)
–Bayesian Networks
–Evolutionary Strategies
–Support Vector Machines
–…

When do Schafer and East play golf?

Decision Tree for PlayGolf
Outlook?
–Sunny → Humidity? High → No; Normal → Yes
–Overcast → Yes
–Rain → Wind? Strong → No; Weak → Yes

Decision Tree for PlayGolf
[Partial tree: Outlook? Sunny → Humidity? High → No; Normal → Yes]
–Each node tests an attribute
–Each branch corresponds to an attribute value
–Each leaf node assigns a classification

Decision Tree for PlayGolf – classifying a new instance
Outlook?
–Sunny → Humidity? High → No; Normal → Yes
–Overcast → Yes
–Rain → Wind? Strong → No; Weak → Yes
Instance: Outlook=Sunny, Temperature=Hot, Humidity=High, Wind=Weak → PlayGolf = No
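The tree above can be written down directly as a small data structure. A minimal sketch (the nested-dict encoding is my own choice, not from the slides), classifying the Sunny/Hot/High/Weak instance:

```python
# The PlayGolf tree as nested dicts: an internal node maps one attribute
# name to {attribute value: subtree}; a bare string is a leaf classification.
tree = {"Outlook": {
    "Sunny":    {"Humidity": {"High": "No", "Normal": "Yes"}},
    "Overcast": "Yes",
    "Rain":     {"Wind": {"Strong": "No", "Weak": "Yes"}},
}}

def classify(node, instance):
    """Walk from the root, following the branch chosen by each tested attribute."""
    while isinstance(node, dict):
        attribute, branches = next(iter(node.items()))
        node = branches[instance[attribute]]
    return node

instance = {"Outlook": "Sunny", "Temperature": "Hot",
            "Humidity": "High", "Wind": "Weak"}
print(classify(tree, instance))  # -> No
```

Note that Temperature is carried in the instance but never tested – the tree simply ignores attributes that do not appear on the path taken.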

Decision Tree for Conjunction: Outlook=Sunny ∧ Wind=Weak
Outlook?
–Sunny → Wind? Strong → No; Weak → Yes
–Overcast → No
–Rain → No

Decision Tree for Disjunction: Outlook=Sunny ∨ Wind=Weak
Outlook?
–Sunny → Yes
–Overcast → Wind? Strong → No; Weak → Yes
–Rain → Wind? Strong → No; Weak → Yes

Decision Tree for XOR: Outlook=Sunny XOR Wind=Weak
Outlook?
–Sunny → Wind? Strong → Yes; Weak → No
–Overcast → Wind? Strong → No; Weak → Yes
–Rain → Wind? Strong → No; Weak → Yes

Decision Tree
Outlook?
–Sunny → Humidity? High → No; Normal → Yes
–Overcast → Yes
–Rain → Wind? Strong → No; Weak → Yes
Decision trees represent disjunctions of conjunctions:
(Outlook=Sunny ∧ Humidity=Normal) ∨ (Outlook=Overcast) ∨ (Outlook=Rain ∧ Wind=Weak)
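This reading of a tree as a disjunction of conjunctions can be mechanized: each root-to-"Yes"-leaf path is one conjunct. A sketch, encoding the tree as nested dicts (an encoding assumed here for illustration):

```python
# Collect the conditions along every root-to-"Yes"-leaf path: each path is
# one conjunction, and the whole tree is the disjunction of those paths.
tree = {"Outlook": {
    "Sunny":    {"Humidity": {"High": "No", "Normal": "Yes"}},
    "Overcast": "Yes",
    "Rain":     {"Wind": {"Strong": "No", "Weak": "Yes"}},
}}

def yes_paths(node, conditions=()):
    if not isinstance(node, dict):                  # leaf
        return [conditions] if node == "Yes" else []
    attribute, branches = next(iter(node.items()))
    rules = []
    for value, subtree in branches.items():
        rules += yes_paths(subtree, conditions + ((attribute, value),))
    return rules

for rule in yes_paths(tree):
    print(" AND ".join(f"{a}={v}" for a, v in rule))
# Outlook=Sunny AND Humidity=Normal
# Outlook=Overcast
# Outlook=Rain AND Wind=Weak
```

The three printed rules are exactly the three conjuncts in the formula above.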

Expressiveness of Decision Trees Decision trees can express ANY function of the input attributes. Trivially, there exists a decision tree for any consistent training set (one path to a leaf for each example).

When to consider Decision Trees
–Instances describable by attribute–value pairs
–Target function is discrete valued
–Disjunctive hypothesis may be required
–Possibly noisy training data
–Missing attribute values
Examples:
–Medical diagnosis
–Credit risk analysis
–Object classification for robot manipulator (Tan 1993)

Expressiveness of Decision Trees
Decision trees can express ANY function of the input attributes. Trivially, there exists a decision tree for any consistent training set (one path to a leaf for each example).
–But it probably won’t generalize to new examples.
–We prefer to find more compact decision trees.

Think about it… Which tree would you rather use?
[Figure: two decision trees over attributes A–F that compute the same function – one large tree with many tests and one compact tree – the smaller one is preferable]

Decision tree learning
How do you select a small tree consistent with the training examples?
Idea: (recursively) choose the “most significant” attribute as the root of the (sub)tree.
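One standard way to make “most significant” precise is ID3’s information gain: pick the attribute whose split most reduces the entropy of the labels. A minimal sketch (no pruning, no missing-value handling; the dict-of-attributes dataset shape is an assumption for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """H(S) = -sum_i p_i log2 p_i over the class proportions in `labels`."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, attribute):
    """Entropy of the parent minus the size-weighted entropy of the children."""
    n = len(labels)
    split = {}
    for example, label in zip(examples, labels):
        split.setdefault(example[attribute], []).append(label)
    remainder = sum(len(ys) / n * entropy(ys) for ys in split.values())
    return entropy(labels) - remainder

def id3(examples, labels, attributes):
    if len(set(labels)) == 1:              # pure node -> leaf
        return labels[0]
    if not attributes:                     # nothing left to test -> majority leaf
        return Counter(labels).most_common(1)[0][0]
    best = max(attributes, key=lambda a: information_gain(examples, labels, a))
    branches = {}
    for value in {example[best] for example in examples}:
        subset = [(e, y) for e, y in zip(examples, labels) if e[best] == value]
        sub_examples = [e for e, _ in subset]
        sub_labels = [y for _, y in subset]
        branches[value] = id3(sub_examples, sub_labels,
                              [a for a in attributes if a != best])
    return {best: branches}
```

A perfectly informative split (two pure children from a 50/50 parent) has gain 1 bit; a split that leaves the class mix unchanged has gain 0 – that is exactly the “most significant attribute” ordering the recursion uses.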