Presentation on theme: "Friday’s Deliverable As a GROUP, you need to bring 2N+1 copies of your “initial submission” –This paper should be a complete version of your paper – something."— Presentation transcript:

1 Friday’s Deliverable As a GROUP, you need to bring 2N+1 copies of your “initial submission” –This paper should be a complete version of your paper – something you would be willing to turn in for your final grade, even in a worst-case scenario. Also bring another timecard –PLEASE read the instructions

2 When do Schafer and East play golf?

3 Learning Definition: A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

4 Machine Learning Models Classification Regression Clustering Time series analysis Association Analysis Sequence Discovery ….

5 Classification example [Scatter plot: Weight vs. Height; x = weight-lifters, o = ballet dancers] Features: height, weight

6 Classification example - Simple Model [Scatter plot as before, with a simple decision boundary separating the x and o classes] x - weight-lifters o - ballet dancers Features: height, weight

7 Classification example - Complex model [Scatter plot as before, with a complex decision boundary winding between the classes] x - weight-lifters o - ballet dancers Features: height, weight Note: A simple decision boundary is better than a complex one - It GENERALIZES better.
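The simple-vs-complex contrast can be sketched in code. This is a minimal illustration with invented toy data (the heights, weights, and the weight threshold of 70 kg are all assumptions, not from the slides): a one-threshold "simple" boundary versus a 1-nearest-neighbor model that memorizes every training point.

```python
# Toy data: (height_cm, weight_kg, label), 'x' = weight-lifter, 'o' = ballet dancer.
# All numbers here are invented for illustration.
train = [(170, 95, 'x'), (175, 100, 'x'), (168, 90, 'x'),
         (165, 50, 'o'), (170, 55, 'o'), (172, 52, 'o')]

def simple_model(height, weight):
    """Simple decision boundary: a single threshold on weight."""
    return 'x' if weight > 70 else 'o'

def complex_model(height, weight):
    """Complex boundary: 1-nearest neighbor memorizes every training point."""
    nearest = min(train, key=lambda p: (p[0] - height)**2 + (p[1] - weight)**2)
    return nearest[2]

# Both models fit the training data, but the simple boundary is the one
# we expect to generalize to unseen lifters and dancers.
print(simple_model(169, 93))   # 'x'
print(complex_model(169, 93))  # 'x'
```

Both agree here, but the memorizing model's boundary follows every quirk of the training sample, which is exactly what hurts generalization.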

8 Classification example [Diagram: labeled data is split into a train set and a test set; the learning system builds a model from the train set, the model is evaluated on the test set, and then applied to new data, e.g. a loan application → Yes/No]
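The train/test pipeline on this slide can be sketched end to end. Everything below is a hypothetical example: the loan records, the income-only feature, and the learned-threshold "learning system" are all invented to show the flow (train → model → test → new data), not the slides' actual system.

```python
# Invented labeled loan records: (income_k, loan_granted).
records = [
    (25, 'No'), (30, 'No'), (40, 'No'), (55, 'Yes'),
    (60, 'Yes'), (75, 'Yes'), (35, 'No'), (80, 'Yes'),
]

# Split the labeled data into a train set and a test set.
train, test = records[:6], records[6:]

def learn(train_set):
    """Learning system: pick the income threshold that best fits the train set."""
    best_t, best_acc = None, -1.0
    for t, _ in train_set:
        acc = sum(('Yes' if inc >= t else 'No') == lab
                  for inc, lab in train_set) / len(train_set)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

threshold = learn(train)

def model(income):
    """The learned model: answers Yes/No for a loan application."""
    return 'Yes' if income >= threshold else 'No'

# Evaluate on the held-out test set, then apply to new data.
test_acc = sum(model(inc) == lab for inc, lab in test) / len(test)
print(f"test accuracy: {test_acc:.2f}")
print("new applicant (income 70k):", model(70))
```

The point of the split is that `test_acc` is measured on data the learning system never saw, so it estimates performance on genuinely new applicants.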

9 Learning Paradigms Supervised learning - with a teacher: inputs and correct outputs are provided by the teacher Reinforcement learning - with reward or punishment: an action is evaluated Unsupervised learning - with no teacher: no hint about the correct output is given

10 Supervised Learning Activity

11 Supervised Learning This IS a “Frinkle”

12 Supervised Learning This IS a “Frinkle”

13 Supervised Learning This IS NOT a “Frinkle”

14 Supervised Learning This IS NOT a “Frinkle”

15 Supervised Learning Is this a “Frinkle”??

16 Supervised Learning

17

18

19 Machine Learning Methods Artificial Neural Networks Decision Trees Instance Based Methods (CBR, k-NN) Bayesian Networks Evolutionary Strategies Support Vector Machines..

20 Machine Learning Methods Artificial Neural Networks Decision Trees Instance Based Methods (CBR, k-NN) Bayesian Networks Evolutionary Strategies Support Vector Machines..

21 When do Schafer and East play golf?

22 Decision Tree for PlayGolf
Outlook?
  Sunny → Humidity?
    High → No
    Normal → Yes
  Overcast → Yes
  Rain → Wind?
    Strong → No
    Weak → Yes

23 Decision Tree for PlayGolf
Outlook?
  Sunny → Humidity?
    High → No
    Normal → Yes
Each node tests an attribute. Each branch corresponds to an attribute value. Each leaf node assigns a classification.

24 Decision Tree for PlayGolf
Outlook?
  Sunny → Humidity?
    High → No
    Normal → Yes
  Overcast → Yes
  Rain → Wind?
    Strong → No
    Weak → Yes
Query: Outlook=Sunny, Temperature=Hot, Humidity=High, Wind=Weak → PlayGolf = No
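The classification walk on this slide can be made concrete in code. A minimal sketch: the PlayGolf tree from the slides as nested dicts (the dict representation itself is an assumption, not from the slides), plus a classifier that tests one attribute per node until it reaches a leaf.

```python
# The slides' PlayGolf tree as nested dicts: an inner node maps one
# attribute to its branches; a bare string is a leaf classification.
tree = {
    'Outlook': {
        'Sunny':    {'Humidity': {'High': 'No', 'Normal': 'Yes'}},
        'Overcast': 'Yes',
        'Rain':     {'Wind': {'Strong': 'No', 'Weak': 'Yes'}},
    }
}

def classify(node, example):
    """Walk the tree: test one attribute per node until a leaf is reached."""
    if isinstance(node, str):                 # leaf node: a classification
        return node
    attribute, branches = next(iter(node.items()))
    return classify(branches[example[attribute]], example)

# The slide's query: Sunny, Hot, High, Weak.
example = {'Outlook': 'Sunny', 'Temperature': 'Hot',
           'Humidity': 'High', 'Wind': 'Weak'}
print(classify(tree, example))  # 'No'
```

Note that Temperature is carried in the example but never tested: the tree only consults the attributes on the path it actually follows.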

25 Decision Tree for Conjunction: Outlook=Sunny ∧ Wind=Weak
Outlook?
  Sunny → Wind?
    Strong → No
    Weak → Yes
  Overcast → No
  Rain → No

26 Decision Tree for Disjunction: Outlook=Sunny ∨ Wind=Weak
Outlook?
  Sunny → Yes
  Overcast → Wind?
    Strong → No
    Weak → Yes
  Rain → Wind?
    Strong → No
    Weak → Yes

27 Decision Tree for XOR: Outlook=Sunny XOR Wind=Weak
Outlook?
  Sunny → Wind?
    Strong → Yes
    Weak → No
  Overcast → Wind?
    Strong → No
    Weak → Yes
  Rain → Wind?
    Strong → No
    Weak → Yes

28 Decision Tree
Outlook?
  Sunny → Humidity?
    High → No
    Normal → Yes
  Overcast → Yes
  Rain → Wind?
    Strong → No
    Weak → Yes
Decision trees represent disjunctions of conjunctions: (Outlook=Sunny ∧ Humidity=Normal) ∨ (Outlook=Overcast) ∨ (Outlook=Rain ∧ Wind=Weak)
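The equivalence claimed on this slide is easy to check exhaustively. A small sketch: encode the tree's paths and the disjunction-of-conjunctions formula as two Boolean functions and compare them on every attribute combination (both functions are straightforward transcriptions of the slide, written here for illustration).

```python
from itertools import product

def tree_says_yes(outlook, humidity, wind):
    """The PlayGolf tree, read off path by path."""
    if outlook == 'Sunny':
        return humidity == 'Normal'
    if outlook == 'Overcast':
        return True
    return wind == 'Weak'          # outlook == 'Rain'

def formula(outlook, humidity, wind):
    """(Outlook=Sunny ∧ Humidity=Normal) ∨ (Outlook=Overcast) ∨ (Outlook=Rain ∧ Wind=Weak)"""
    return ((outlook == 'Sunny' and humidity == 'Normal')
            or outlook == 'Overcast'
            or (outlook == 'Rain' and wind == 'Weak'))

# Exhaustively compare over all 3 * 2 * 2 = 12 attribute combinations.
for o, h, w in product(['Sunny', 'Overcast', 'Rain'],
                       ['High', 'Normal'],
                       ['Strong', 'Weak']):
    assert tree_says_yes(o, h, w) == formula(o, h, w)
print("tree and formula agree on all 12 cases")
```

Each root-to-Yes path contributes one conjunction to the formula, which is exactly why any tree can be written this way.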

29 Expressiveness of Decision Trees Decision trees can express ANY function of the input attributes. Trivially, there exists a decision tree for any consistent training set (one path to a leaf for each example).

30 When to consider Decision Trees
Instances describable by attribute-value pairs
Target function is discrete valued
Disjunctive hypothesis may be required
Possibly noisy training data
Missing attribute values
Examples:
–Medical diagnosis
–Credit risk analysis
–Object classification for robot manipulator (Tan 1993)

31 Expressiveness of Decision Trees Decision trees can express ANY function of the input attributes. Trivially, there exists a decision tree for any consistent training set (one path to a leaf for each example). –But it probably won’t generalize to new examples. –We prefer to find more compact decision trees.

32 Think about it… Which tree would you rather use… [Figure: two equivalent decision trees over attributes A–F with Y/N leaves – one large and deep, one compact]

33 Decision tree learning How do you select a small tree consistent with the training examples? Idea: (recursively) choose the “most significant” attribute as root of (sub)tree.
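One common way to make "most significant attribute" concrete is information gain, as in ID3; the slides do not name a criterion, so treating it as information gain is an assumption. The sketch below computes entropy and gain on a few invented PlayGolf-style rows and picks the best root attribute.

```python
# Assumed criterion: information gain (ID3-style); the slides don't name one.
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(rows, attribute, labels):
    """Entropy reduction obtained by splitting the rows on one attribute."""
    base = entropy(labels)
    remainder = 0.0
    for value in set(r[attribute] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[attribute] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

# Invented toy rows for illustration.
rows = [{'Outlook': 'Sunny',    'Wind': 'Weak'},
        {'Outlook': 'Sunny',    'Wind': 'Strong'},
        {'Outlook': 'Overcast', 'Wind': 'Weak'},
        {'Outlook': 'Rain',     'Wind': 'Strong'}]
labels = ['No', 'No', 'Yes', 'No']

# Choose the attribute with the highest gain as the root of the (sub)tree;
# recursing on each branch's subset builds the rest of the tree.
best = max(['Outlook', 'Wind'], key=lambda a: information_gain(rows, a, labels))
print(best)  # 'Outlook'
```

On this toy data, splitting on Outlook leaves every branch pure (entropy 0), so it wins over Wind, matching the intuition of choosing the attribute that best separates the classes.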

