
1 Adaboost and its application
An algorithm for training a group of classifiers: given a database of objects to be classified, each classifier studies the objects and derives its own classification rule, and the final decision combines all of their opinions.

2 Outline
 - Introduction
 - Adaboost algorithm
   - Two-class case and training error
   - Multi-class case
 - Application
   - Face detection
Examples are used to bring out the idea behind Adaboost-style classification methods.

3 Outline
 - Introduction
 - Adaboost algorithm
   - Two-class case and training error
   - Multi-class case
 - Application
   - Face detection

4 Introduction
The horse-track problem:
 - Use historical horse-race data: odds, dry or muddy track, jockey.
 - Derive some rules of thumb: favorable odds; on a muddy track, the lightest one.
 - Predict the winner.
A horse-race gambler wants a better-grounded guess at the winner: collect data, distill some rules of thumb, then apply those rules to the conditions on race day to decide which horse to bet on.

5 Introduction
How should the horse-race data be chosen? At random? Resample the data for classifier design.
How should the rules of thumb be combined into a single decision? With equal importance? Combine the results of multiple "weak" classifiers into a single "strong" classifier.

6 Introduction
The two most popular approaches:
 - Bagging (Breiman, 1994)
 - Boosting: Adaboost (Freund and Schapire, 1996)
Assume there are two classes to classify: class_1 and class_2.

7 Bagging
[Diagram: the training data is resampled into several sets, each used to train a classifier; an input x is classified by combining their outputs.]

8 Boosting
[Diagram: the training data is reweighted between rounds; an input x is classified by the weighted combination of the resulting classifiers.]

9 Bagging vs. Boosting
Bagging:
 - Samples: equal weight.
 - Weak classifier combination: equal weight.
Boosting:
 - Samples: unequal weight.
 - Weak classifier combination: unequal weight.
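A small Python sketch of that sampling difference (the sample count and the pretend misclassified set are assumptions for illustration, not from the slides):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100                          # toy number of training samples

    # Bagging: uniform bootstrap, every sample equally likely to be drawn.
    bagging_idx = rng.choice(n, size=n, replace=True)

    # Boosting: draw according to weights that concentrate on "hard" samples.
    weights = np.ones(n) / n         # start uniform
    weights[:10] *= 5                # pretend samples 0-9 were misclassified
    weights /= weights.sum()         # renormalize to a probability distribution
    boosting_idx = rng.choice(n, size=n, replace=True, p=weights)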

10 Outline
 - Introduction
 - Adaboost algorithm
   - Two-class case and training error
   - Multi-class case
 - Application
   - Face detection

11 A Formal View of Boosting
Given a training set $(x_1, y_1), \dots, (x_m, y_m)$, where $y_i \in \{-1, +1\}$ is the correct label of instance $x_i$.
For $t = 1, \dots, T$:
 - Construct a distribution $D_t$ on $\{1, \dots, m\}$.
 - Find a weak hypothesis ("rule of thumb") $h_t : X \to \{-1, +1\}$ with small error $\epsilon_t = \Pr_{i \sim D_t}[h_t(x_i) \neq y_i]$ on $D_t$.
Output the final hypothesis $H_{\mathrm{final}}$.

12 Adaboost Concept
Adaboost starts with a uniform distribution of "weights" over the training examples; the weights tell the learning algorithm how important each example is. At each iteration, obtain a weak classifier $h_t(x)$ from the weak learning algorithm, then increase the weights of the training examples that were misclassified, and repeat. At the end, carefully form a linear combination of the weak classifiers obtained at all iterations.

13 Adaboost
Constructing $D_t$: start with $D_1(i) = 1/m$. Given $D_t$ and $h_t$:
$D_{t+1}(i) = \frac{D_t(i) \exp(-\alpha_t y_i h_t(x_i))}{Z_t}$,
where $Z_t$ is a normalization constant and $\alpha_t = \frac{1}{2} \ln\frac{1 - \epsilon_t}{\epsilon_t} > 0$.
Final hypothesis: $H_{\mathrm{final}}(x) = \mathrm{sign}\big(\sum_t \alpha_t h_t(x)\big)$.
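A minimal Python sketch of exactly these update rules, assuming 1-D inputs and simple threshold stumps as the weak learners (the function names and the stump learner are illustrative choices, not taken from the slides):

    import numpy as np

    def train_adaboost(X, y, T=10):
        """Minimal AdaBoost with threshold stumps; X is 1-D, y in {-1,+1}."""
        m = len(y)
        D = np.ones(m) / m                       # D_1(i) = 1/m
        hypotheses = []                          # (threshold, polarity, alpha_t)
        for _ in range(T):
            # Weak learner: the stump h(x) = polarity * sign(x - threshold)
            # with the smallest weighted error under the current D_t.
            best = None
            for thr in X:
                for pol in (+1.0, -1.0):
                    pred = pol * np.sign(X - thr + 1e-9)
                    err = D[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, thr, pol, pred)
            err, thr, pol, pred = best
            if err >= 0.5:                       # no better than chance: stop
                break
            err = max(err, 1e-12)                # guard against log(0)
            alpha = 0.5 * np.log((1 - err) / err)    # alpha_t
            D *= np.exp(-alpha * y * pred)           # unnormalized D_{t+1}
            D /= D.sum()                             # divide by Z_t
            hypotheses.append((thr, pol, alpha))
        return hypotheses

    def predict(hypotheses, X):
        """H_final(x) = sign(sum_t alpha_t * h_t(x))."""
        total = sum(a * p * np.sign(X - t + 1e-9) for t, p, a in hypotheses)
        return np.sign(total)

On a small toy set such as X = np.array([0., 1., 2., 3., 4., 5.]) with y = np.array([1, 1, 1, -1, -1, -1]), predict(train_adaboost(X, y), X) recovers the labels after a single round.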

14-18 Adaboost (Example)
[Figure slides illustrating the algorithm over successive boosting rounds.]

19 Advantages of Adaboost
 - The weight update focuses on "hard" samples (those misclassified in previous iterations).
 - Simple and easy to program.
 - No parameters to tune (except T).
 - Can be combined with many classifiers to find weak hypotheses: neural networks, decision trees, nearest-neighbor classifiers, ...
The classifier's rule is not perpetually dominated by samples that are already classified correctly. For example, rugby players are both strong and tall: first classify by weight, then reduce the influence of the players already classified correctly by weight, and only then consider height.

20 Training Error
Let $\epsilon_t = 1/2 - \gamma_t$; then the training error of $H_{\mathrm{final}}$ satisfies
$\mathrm{err}(H_{\mathrm{final}}) \le \prod_t \big[2\sqrt{\epsilon_t(1 - \epsilon_t)}\big] = \prod_t \sqrt{1 - 4\gamma_t^2} \le \exp\big(-2\sum_t \gamma_t^2\big)$.
So if $\gamma_t \ge \gamma > 0$ for all $t$, then the training error is at most $e^{-2\gamma^2 T}$.
The error bound depends on the error rate of each round: one classifier with a very small error can compensate for other, poorly performing classifiers. The larger T is (the more classifiers), the closer the training error can, in theory, get to zero.
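A quick numerical check of the final bound (the edge $\gamma = 0.1$, i.e. each weak classifier 10% better than chance, is an assumed value for illustration):

    import numpy as np

    gamma = 0.1                                  # assumed edge: eps_t = 0.5 - gamma
    for T in (10, 50, 100):
        bound = np.exp(-2 * gamma**2 * T)        # err <= exp(-2 * gamma^2 * T)
        print(f"T={T:3d}: training error <= {bound:.4f}")

This prints bounds of about 0.82, 0.37, and 0.14, showing the exponential decay in T.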

21 Multi-class Problem
Adaboost.MH: reduce the problem to binary problems.
E.g., the possible labels are {a, b, c, d, e}. Each training sample is replaced by five {-1, +1}-labeled samples, one per possible label: a sample x with true label c becomes (x, a): -1, (x, b): -1, (x, c): +1, (x, d): -1, (x, e): -1.

22 Adaboost.MH
Formally: replace each training example $(x_i, y_i)$ by one binary example $\big((x_i, \ell),\, Y_i[\ell]\big)$ per possible label $\ell$, where $Y_i[\ell] = +1$ if $\ell = y_i$ and $-1$ otherwise; run binary Adaboost on the expanded set and output $H(x) = \arg\max_\ell \sum_t \alpha_t h_t(x, \ell)$.
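A minimal Python sketch of this label expansion (expand_mh is a hypothetical helper name, not from the slides):

    def expand_mh(samples, labels=("a", "b", "c", "d", "e")):
        """Replace each (x, y) by one {-1,+1}-labeled pair per possible label."""
        binary = []
        for x, y in samples:
            for lbl in labels:
                binary.append(((x, lbl), +1 if lbl == y else -1))
        return binary

    # A sample with true label 'c' becomes five binary samples:
    # ((x,'a'),-1), ((x,'b'),-1), ((x,'c'),+1), ((x,'d'),-1), ((x,'e'),-1)
    print(expand_mh([("x1", "c")]))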

23 Outline
 - Introduction
 - Adaboost algorithm
   - Two-class case and training error
   - Multi-class case
 - Application
   - Face detection

24 Face Detection
[Diagram: a training set of face and non-face examples is fed to Adaboost, which produces the detection result.]

25 Classifiers Design
Haar-like features, computed inside the 24x24 detection window:
 - Two-rectangle (A, B)
 - Three-rectangle (C)
 - Four-rectangle (D)

26 Classifiers Design
Why use Haar-like features?
At the detector's resolution of 24x24 there are about 160,000 possible features in total (quite large).

27 Classifiers Design
Use the "integral image": $ii(x, y) = \sum_{x' \le x,\, y' \le y} i(x', y')$.
Feature computation: the sum over any rectangle takes only four array references, $\mathrm{sum} = ii(D) - ii(B) - ii(C) + ii(A)$, where A, B, C, D are the rectangle's top-left, top-right, bottom-left, and bottom-right corners.
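A NumPy sketch of the integral image and the four-reference rectangle sum (illustrative code, not the original implementation):

    import numpy as np

    def integral_image(img):
        """ii(x, y) = sum of all pixels above and to the left, inclusive."""
        return img.cumsum(axis=0).cumsum(axis=1)

    def rect_sum(ii, top, left, h, w):
        """Sum over any h-by-w rectangle using four array references."""
        p = np.pad(ii, ((1, 0), (1, 0)))   # zero row/column for border lookups
        return p[top + h, left + w] - p[top, left + w] - p[top + h, left] + p[top, left]

    # A two-rectangle Haar feature: difference of two adjacent rectangle sums.
    img = np.arange(36.0).reshape(6, 6)
    ii = integral_image(img)
    feature = rect_sum(ii, 0, 0, 6, 3) - rect_sum(ii, 0, 3, 6, 3)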

28 Classifier Design
Choose the best features by adaptive reweighting.
[Diagram: the face/non-face training set and the pool of Haar-like features feed the selection loop.]
Suitable features are picked automatically, giving us a more efficient set of features.

29 Face Detection
Computation cost, e.g. for image size 320x240, sub-window size 24x24, and a frame rate of 15 frames/sec:
each feature needs (320 - 24 + 1) x (240 - 24 + 1) x 15 = 966,735 evaluations per second (ignoring scaling): a huge computation cost!
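The arithmetic behind that figure, checked in a few lines:

    # Positions of a 24x24 sub-window sliding over a 320x240 frame:
    positions = (320 - 24 + 1) * (240 - 24 + 1)   # 297 * 217 = 64,449
    per_second = positions * 15                    # at 15 frames/sec
    print(per_second)                              # 966,735 evaluations per feature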

30 Face Detection
Use cascade classifiers. Example: a single 200-feature classifier can be split into a cascade of several smaller feature classifiers.

31 Face Detection
Advantages of cascade classifiers:
 - Maintains accuracy.
 - Speeds up detection.
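A minimal sketch of why a cascade is fast (the stage structure shown is an assumption for illustration, not the exact design from the slides):

    def cascade_classify(window, stages):
        """Run a window through a cascade of (classifier, threshold) stages.

        Cheap early stages reject almost all non-face windows, so the
        expensive later stages run on very few windows.
        """
        for classifier, threshold in stages:
            if classifier(window) < threshold:
                return False       # rejected early; most windows exit here
        return True                # survived every stage: report a face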

32 Experiments

