Importance Weighted Active Learning

1 Importance Weighted Active Learning
by Alina Beygelzimer, Sanjoy Dasgupta, and John Langford (ICML 2009)
Presented by Lingbo Li, ECE, Duke University, September 25, 2009

2 Outline
Introduction
The Importance Weighting Skeleton
Setting the rejection threshold
Label Complexity
Implementing IWAL
Conclusion

3 Introduction
Active learning: at each step t, the learner receives an unlabeled point $x_t \in X$ and decides whether to query its label $y_t$.
The hypothesis space is $H = \{h : X \to Z\}$, where $Z$ is the prediction space; the loss function is $l : Z \times Y \to [0, 1]$.
Drawbacks of earlier work: active learning with PAC-convergence guarantees 1) covers only the 0-1 loss function and 2) makes internal use of generalization bounds; earlier importance-weighted approaches are 1) non-adaptive and 2) only asymptotically justified.
Motivation: use importance weighting to build a consistent binary classifier under general loss functions, which removes sampling bias and improves label complexity.

4 The Importance Weighting Skeleton
The expected loss: $L(h) = \mathbb{E}_{(x,y)\sim D}\,[\,l(h(x), y)\,]$.
The importance weighted estimate of the loss at time T:
$L_T(h) = \frac{1}{T}\sum_{t=1}^{T} \frac{Q_t}{p_t}\, l(h(x_t), y_t)$,
where $Q_t \in \{0,1\}$ indicates whether the label was queried and $p_t$ is the query probability, so $\mathbb{E}[L_T(h)] = L(h)$.
IWAL algorithms are consistent as long as $p_t$ does not equal zero, i.e. it is kept bounded away from zero ($p_t \ge p_{\min} > 0$).
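A minimal Python sketch of this skeleton, assuming a finite hypothesis list and an abstract rejection-threshold subroutine (names like label_oracle and rejection_threshold are illustrative, not from the paper's pseudocode):

```python
import random

def iwal_skeleton(stream, hypotheses, loss, rejection_threshold,
                  label_oracle, p_min=0.1):
    """IWAL skeleton (sketch): flip a coin with bias p_t for each point;
    queried points are kept with importance weight 1/p_t so the empirical
    loss L_T(h) stays an unbiased estimate of L(h)."""
    weighted_sample = []  # (x_t, y_t, 1/p_t) for rounds where Q_t = 1
    for x_t in stream:
        # Consistency requires p_t bounded away from zero.
        p_t = max(p_min, rejection_threshold(weighted_sample, hypotheses, x_t))
        if random.random() <= p_t:          # Q_t = 1: query the label
            y_t = label_oracle(x_t)
            weighted_sample.append((x_t, y_t, 1.0 / p_t))
    # Output the hypothesis minimizing the importance-weighted loss.
    return min(hypotheses,
               key=lambda h: sum(w * loss(h(x), y)
                                 for x, y, w in weighted_sample))
```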

5 Setting the rejection threshold
The loss-weighting scheme does the minimization over a shrinking set $H_t = \{h \in H_{t-1} : L_{t-1}(h) \le L^*_{t-1} + \Delta_{t-1}\}$ instead of the full $H$, where $L^*_{t-1} = \min_{h \in H_{t-1}} L_{t-1}(h)$ and $\Delta_{t-1}$ is a deviation bound of order $\sqrt{\log(t\,|H|/\delta)/t}$.
The query probability is the maximal loss disagreement on $x_t$ among surviving hypotheses: $p_t = \max_{f,g \in H_t,\, y}\; l(f(x_t), y) - l(g(x_t), y)$.
With this threshold, IWAL performs never worse than supervised learning on all T examples.
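A sketch of the two pieces of this scheme in Python, under the same finite-hypothesis assumption (delta_t stands in for the deviation bound $\Delta_{t-1}$; signatures are illustrative):

```python
def surviving_hypotheses(H_prev, weighted_sample, loss, delta_t):
    """H_t: keep hypotheses whose importance-weighted loss is within
    delta_t of the best survivor, L(h) <= L* + Delta."""
    def L(h):
        return sum(w * loss(h(x), y) for x, y, w in weighted_sample)
    best = min(L(h) for h in H_prev)
    return [h for h in H_prev if L(h) <= best + delta_t]

def loss_weighting_threshold(H_t, x_t, labels, loss):
    """p_t: the largest loss disagreement between any pair of surviving
    hypotheses on x_t, maximized over candidate labels y."""
    return max(loss(f(x_t), y) - loss(g(x_t), y)
               for f in H_t for g in H_t for y in labels)
```

Because the pair (f, g) ranges over all survivors, the threshold is 0 once the surviving hypotheses agree on $x_t$, so labels stop being queried exactly where they carry no information.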

6 Label Complexity – upper bound
Previous work on active learning covers only the 0-1 loss, with the number of queries bounded in terms of the disagreement coefficient $\theta$.
For arbitrary loss functions under similar conditions, the number of queries grows by a factor of the slope asymmetry $K_\ell$ of the loss (see the bound sketched below).
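In symbols, a sketch of the upper bound up to constants (with $\theta$ the disagreement coefficient, $K_\ell$ the slope asymmetry of the loss, $\eta$ the loss of the best hypothesis, and $\delta$ the confidence parameter; treat the exact constants as an assumption):

```latex
\mathbb{E}\left[\sum_{t=1}^{T} p_t\right]
  \;\le\; 4\,\theta\, K_\ell \left( \eta\, T
    + O\!\left(\sqrt{T \ln \frac{|H|\,T}{\delta}}\right) \right)
```

For the 0-1 loss $K_\ell = 1$, recovering the shape of the earlier bounds.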

7 Label Complexity – lower bound
The paper complements this with a lower bound: for general loss functions, the minimum number of queries any active learner needs is increased relative to the 0-1 case, so the extra factor in the upper bound is not an artifact of the analysis.

8 Implementing IWAL (1)
Linear separators; logistic loss.
MNIST data set of handwritten digits, with 3's and 5's as the two classes; 1000 exemplars for training and another 1000 for testing.
PCA is used to reduce the dimensionality.
An optimistic bound is used for $\Delta_t$ in place of the conservative theoretical one.
Result: active learning performs similarly to supervised learning while querying fewer than 1/3 of the labels.
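A minimal sketch of the final importance-weighted fit for this setting, using scikit-learn's sample_weight to carry the 1/p_t weights; X_q, y_q, w_q are assumed to come from an IWAL querying loop as above, and n_components is an illustrative choice:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

def fit_weighted_logistic(X_q, y_q, w_q, n_components=25):
    """Importance-weighted logistic regression on PCA-reduced features.
    X_q, y_q: queried examples and labels; w_q: their 1/p_t weights."""
    # PCA to reduce dimensionality, as in the experiment
    pca = PCA(n_components=n_components)
    X_r = pca.fit_transform(X_q)
    # sample_weight applies the importance weights to the logistic loss
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_r, y_q, sample_weight=np.asarray(w_q))
    return pca, clf
```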

9 Implementing IWAL (2)
Bootstrapping scheme: a committee of classifiers trained on bootstrap samples approximates the hypothesis-set disagreement.
Binary and multiclass classification losses.
MNIST dataset.
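A sketch of the bootstrap rejection threshold, assuming committee members expose a scikit-learn-style predict interface; the interpolation between p_min and 1 follows the bootstrap instantiation as I read it, so treat the exact formula as an assumption:

```python
def bootstrap_query_prob(committee, x, labels, loss, p_min=0.1):
    """Approximate the hypothesis-set disagreement on x with a committee
    of classifiers trained on bootstrap samples of the initial data."""
    disagreement = max(
        loss(f.predict([x])[0], y) - loss(g.predict([x])[0], y)
        for f in committee for g in committee for y in labels
    )
    # Interpolate: never query below p_min, always query at full disagreement
    return p_min + (1.0 - p_min) * disagreement
```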

10 Conclusion
IWAL is a consistent active learning algorithm that can be implemented with flexible loss functions.
Label complexity bounds are provided theoretically, with substantial improvement over passive learning.
Practical experiments confirm this.

