Ensemble Methods


1 Ensemble Methods

2-3  “No free lunch theorem” (Wolpert and Macready, 1995)  Solution search also involves searching over learners

4-7  Different algorithms  Different parameters  Different input representations/features  Different data

8  Base learner: an individual model within the ensemble

9  Diversity over accuracy: base learners should err on different instances, even if each is only modestly accurate

10  Model combination

11  Voting  Bagging  Boosting  Cascading
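The first combination scheme on this slide, voting, can be sketched as plain majority voting over class predictions; the function name and the toy predictions below are illustrative, not from the slides.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class predictions from several base learners by majority vote.

    predictions: a list of per-learner prediction lists, all the same length.
    Returns one combined prediction per instance.
    """
    n = len(predictions[0])
    combined = []
    for i in range(n):
        # Count the votes of all learners for instance i.
        votes = Counter(p[i] for p in predictions)
        combined.append(votes.most_common(1)[0][0])
    return combined

# Three base learners vote on four instances.
preds = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
]
print(majority_vote(preds))  # -> [1, 0, 1, 1]
```

Weighted voting (e.g. weighting each learner by its validation accuracy) is a straightforward extension of the same idea.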


15  Data set = [1,2,3,4,5,6,7,8,9,10]  Samples:  Input to learner 1 = [10,2,5,10,3]  Input to learner 2 = [4,5,2,7,6,3]  Input to learner 3 = [8,8,4,9,1]
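The samples on this slide are bootstrap samples: examples are drawn with replacement, which is why duplicates (the repeated 10s and 8s) can occur. A minimal sketch of bagging's sampling step, with an illustrative function name and seed:

```python
import random

def bootstrap_sample(data, size, rng):
    """Draw `size` examples from `data` with replacement (a bagging sample)."""
    return [rng.choice(data) for _ in range(size)]

rng = random.Random(0)  # fixed seed so the demo is repeatable
data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
for learner in range(3):
    # Each base learner trains on its own independently drawn sample.
    print(f"Input to learner {learner + 1} =", bootstrap_sample(data, 5, rng))
```

Training each base learner on a different bootstrap sample is what gives bagging its diversity: learners see slightly different views of the data.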

16  Create complementary learners

17  Train successive learners on the mistakes of predecessors

18  Many weak learners combine into a single strong learner


23-26  AdaBoost (Adaptive Boosting)  Allows for a smaller training set  Uses simple (weak) classifiers  Binary classification

27  Modify the probability of drawing each example from the training set based on the errors of earlier classifiers
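The reweighting idea on this slide can be sketched as a minimal AdaBoost over 1-D threshold stumps. Instead of resampling by the modified probabilities, this version reweights the examples directly, which is the standard equivalent formulation; all names and the toy data are illustrative.

```python
import math

def train_stump(xs, ys, weights):
    """Pick the threshold/polarity with the lowest weighted error.

    The stump predicts `polarity` when x < threshold, else -polarity.
    Labels are +1/-1. Returns (threshold, polarity, weighted_error).
    """
    best = None
    for threshold in sorted(set(xs)):
        for polarity in (1, -1):
            err = sum(w for x, y, w in zip(xs, ys, weights)
                      if (polarity if x < threshold else -polarity) != y)
            if best is None or err < best[2]:
                best = (threshold, polarity, err)
    return best

def adaboost(xs, ys, rounds):
    """Train a boosted ensemble of stumps with AdaBoost-style reweighting."""
    n = len(xs)
    weights = [1.0 / n] * n  # start with a uniform distribution
    model = []               # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        threshold, polarity, err = train_stump(xs, ys, weights)
        err = min(max(err, 1e-10), 1 - 1e-10)      # guard the log/division
        alpha = 0.5 * math.log((1 - err) / err)    # this learner's vote weight
        model.append((alpha, threshold, polarity))
        # Raise the weight of misclassified examples, lower the rest,
        # then renormalize so the weights stay a probability distribution.
        weights = [w * math.exp(-alpha * y * (polarity if x < threshold else -polarity))
                   for x, y, w in zip(xs, ys, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return model

def predict(model, x):
    """Weighted vote of all stumps, thresholded at zero."""
    score = sum(alpha * (polarity if x < threshold else -polarity)
                for alpha, threshold, polarity in model)
    return 1 if score >= 0 else -1

xs = [1, 2, 3, 4]
ys = [1, 1, -1, -1]
model = adaboost(xs, ys, rounds=3)
print([predict(model, x) for x in xs])  # -> [1, 1, -1, -1]
```

Each round the next stump concentrates on the examples its predecessors got wrong, which is exactly the "train successive learners on the mistakes of predecessors" idea from the earlier slide.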


36  Demo

37-40  Cascading: sequence classifiers by complexity  Use classifier j+1 when classifier j does not meet a confidence threshold  Train each stage on instances the previous stage is not confident about  Most examples are classified quickly; harder ones are passed to more expensive classifiers
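The cascade described above might be sketched as follows; the two classifiers, the 0.8 confidence threshold, and the toy text rules are hypothetical stand-ins, not from the slides.

```python
def cascade_predict(classifiers, x, threshold=0.8):
    """Try classifiers in order of increasing cost; return the first
    prediction whose confidence clears the threshold, else the last one.

    Each classifier maps x to a (label, confidence in [0, 1]) pair.
    """
    label, conf = None, 0.0
    for clf in classifiers:
        label, conf = clf(x)
        if conf >= threshold:
            break  # a cheap stage is confident enough; stop here
    return label

# Hypothetical stand-ins: a cheap rule and a more "expensive" fallback.
cheap = lambda x: ("spam", 0.95) if "win" in x else ("ham", 0.6)
expensive = lambda x: ("spam", 0.9) if "offer" in x else ("ham", 0.9)

print(cascade_predict([cheap, expensive], "win big now"))    # prints spam
print(cascade_predict([cheap, expensive], "limited offer"))  # prints spam
```

The first input is settled by the cheap stage alone; the second falls through to the expensive stage because the cheap one is not confident, which mirrors how most examples exit the cascade early.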

41  Boosting and Cascading


47  Applications:  Object detection/tracking  Collaborative filtering  Neural networks  Optical character recognition (and more)  Biometrics  Data mining

48  Ensemble methods are proven effective, but why?

