Statistical Learning Dong Liu Dept. EEIS, USTC.

1 Statistical Learning Dong Liu Dept. EEIS, USTC

2 Chapter 5. Non-Parametric Supervised Learning
Parzen window
k-nearest-neighbor (k-NN)
Sparse coding
2018/11/17 Chap 5. Non-Parametric Supervised Learning

3 Non-parametric learning
Most statistical learning methods assume a model. For example, linear regression assumes y = wᵀx + b, and linear classification assumes y = sign(wᵀx + b). Learning is then converted into a problem of solving/estimating the model parameters. In this chapter, we consider learning without explicit modeling. Non-parametric learning is sometimes equated with instance-based or memory-based learning.

4 Chap 5. Non-Parametric Supervised Learning
Parzen window
Consider the problem of (probability) density estimation: we want to estimate an unknown distribution p(x) from a given set of samples {x_1, …, x_N}.
Parametric learning: we assume a form p(x; θ) and estimate θ.
Non-parametric learning: we do not presume any form.
Parzen window, also known as kernel density estimation (KDE), estimates p̂(x) = (1/N) Σ_{i=1}^{N} K(x − x_i).
The kernel function K shall be non-negative and integrate to 1, e.g. the Gaussian kernel K(u) = (1/√(2π)) exp(−u²/2).
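The estimator above can be sketched in a few lines (a minimal illustration, not from the slides; it assumes the Gaussian kernel and one-dimensional data):

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel: non-negative and integrates to 1."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def parzen_density(x, samples, h=1.0):
    """Parzen-window estimate p_hat(x) = (1/(N*h)) * sum_i K((x - x_i)/h)."""
    samples = np.asarray(samples, dtype=float)
    return gaussian_kernel((x - samples) / h).mean() / h

# Estimate the density at 0 from standard-normal samples;
# the result should be roughly the N(0,1) density at 0 (about 0.4).
rng = np.random.default_rng(0)
xs = rng.standard_normal(5000)
print(parzen_density(0.0, xs, h=0.3))
```

Note that no parameters are fitted: prediction only averages kernel evaluations over the stored samples, which is exactly the memory-based flavor of this chapter.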

5 Comparison between histogram and Parzen window

6 Parzen window: Window size
Introduce a hyper-parameter h (the bandwidth) to control the window size. Then p̂(x) = (1/(Nh)) Σ_{i=1}^{N} K((x − x_i)/h).
Gray: true density. Red: h=0.05 (under-smoothed). Black: h=0.337 (a good fit). Green: h=2 (over-smoothed).

7 Chapter 5. Non-Parametric Supervised Learning
Parzen window
k-nearest-neighbor (k-NN)
Sparse coding

8 Chap 5. Non-Parametric Supervised Learning
Nearest-neighbor
If it walks like a duck and quacks like a duck, then it is probably a duck. Given a new sample, compare it with the samples in memory and find the "closest" one; predict accordingly.
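The compare-and-find step can be sketched as follows (a minimal illustration, not from the slides; the toy data and labels are made up):

```python
import numpy as np

def nearest_neighbor_classify(x, X_train, y_train):
    """1-NN: return the label of the closest stored sample (Euclidean distance)."""
    dists = np.linalg.norm(np.asarray(X_train) - np.asarray(x), axis=1)
    return y_train[int(np.argmin(dists))]

# Samples in memory and their labels (hypothetical toy data)
X = [[0.0, 0.0], [0.1, 0.2], [3.0, 3.0], [3.2, 2.9]]
y = ["duck", "duck", "goose", "goose"]
print(nearest_neighbor_classify([0.2, 0.1], X, y))  # "duck"
```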

9 Chap 5. Non-Parametric Supervised Learning
Illustration of 1-NN (reproduced from ESL). 1-NN is sensitive to noise/outliers in the dataset. How can we improve it?

10 Chap 5. Non-Parametric Supervised Learning
k-nearest-neighbor
Compare and find several "closest" samples, and make a decision based on them all.
k-NN classification: majority vote among the k nearest neighbors.
k-NN regression: average of the k nearest neighbors' target values.
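Both variants can be sketched in one function (a minimal illustration, not from the slides; the toy data are made up):

```python
import numpy as np
from collections import Counter

def knn_predict(x, X_train, y_train, k=3, mode="classify"):
    """k-NN: majority vote (classification) or mean (regression)
    over the k closest training samples, by Euclidean distance."""
    dists = np.linalg.norm(np.asarray(X_train) - np.asarray(x), axis=1)
    idx = np.argsort(dists)[:k]                  # indices of the k nearest
    neighbors = [y_train[i] for i in idx]
    if mode == "classify":
        return Counter(neighbors).most_common(1)[0][0]
    return float(np.mean(neighbors))

X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_predict([0.5, 0.5], X, labels, k=3))                   # "a"
values = [0.0, 1.0, 1.0, 10.0, 11.0, 10.0]
print(knn_predict([5.5, 5.5], X, values, k=3, mode="regress"))   # mean of 10, 11, 10
```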

11 Chap 5. Non-Parametric Supervised Learning
Illustration of k-NN (reproduced from ESL). k-NN is better than 1-NN at suppressing the effect of noise.

12 Chap 5. Non-Parametric Supervised Learning
Variants of k-NN
How to define the neighborhood? Fixed k (then how to set it?); adaptive k (e.g. distance thresholding).
How to define the distance? Euclidean distance; Lp distance; Mahalanobis distance; cosine similarity; Pearson correlation coefficient.
Distance weighting: equal weight, or less distance, more weight.
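The "less distance, more weight" variant can be sketched with inverse-distance weights (a minimal illustration, not from the slides; the 1/d weighting scheme and the toy data are assumptions, other decreasing weight functions work too):

```python
import numpy as np

def weighted_knn_regress(x, X_train, y_train, k=3, eps=1e-12):
    """Distance-weighted k-NN regression: each of the k nearest neighbors
    contributes with weight 1/distance, so closer samples count more."""
    dists = np.linalg.norm(np.asarray(X_train, dtype=float) - np.asarray(x, dtype=float), axis=1)
    idx = np.argsort(dists)[:k]
    w = 1.0 / (dists[idx] + eps)   # less distance -> more weight
    return float(np.dot(w, np.asarray(y_train, dtype=float)[idx]) / w.sum())

X = [[0.0], [1.0], [2.0]]
y = [0.0, 1.0, 2.0]
# The query is 9x closer to the first point, so the prediction stays near 0.
print(weighted_knn_regress([0.1], X, y, k=2))  # ~0.1
```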

13 How to set k: Bias-variance tradeoff
Consider k-NN regression. Assume the data are generated by y = f(x) + ε, with E[ε] = 0 and Var(ε) = σ². Then the expected prediction error at a test point x₀ decomposes as
EPE_k(x₀) = σ² + [f(x₀) − (1/k) Σ_{l=1}^{k} f(x_(l))]² + σ²/k,
where x_(1), …, x_(k) are the k nearest neighbors of x₀. The squared-bias term typically grows with k (farther neighbors are averaged in), while the variance term σ²/k shrinks: choosing k trades one off against the other.
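The tradeoff can be verified by simulation (a minimal sketch, not from the slides; f(x) = x², the grid design, and σ = 1 are assumptions chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: x ** 2
sigma = 1.0
X = np.linspace(-3, 3, 61)           # fixed 1-D design
x0 = 0.0
order = np.argsort(np.abs(X - x0))   # neighbors of x0, nearest first

results = {}
for k in (1, 5, 25):
    nn = order[:k]
    preds = []
    for _ in range(2000):            # redraw the noise, keep the design fixed
        y = f(X) + sigma * rng.standard_normal(X.size)
        preds.append(y[nn].mean())   # k-NN regression estimate at x0
    preds = np.array(preds)
    bias = preds.mean() - f(x0)      # grows with k (farther neighbors)
    var = preds.var()                # shrinks roughly like sigma^2 / k
    results[k] = (bias, var)
    print(k, round(bias, 3), round(var, 3))
```

Larger k averages over neighbors farther from x₀ (larger bias) but over more noise draws (smaller variance), matching the decomposition above.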

14 Example: Rating prediction for making recommendations 1/5
[Diagram] Items (products, news, movies, music, …) are delivered to users via search, recommendations, and advertisements.

15 Example: Rating prediction for making recommendations 2/5

16 Example: Rating prediction for making recommendations 3/5

17 Example: Rating prediction for making recommendations 4/5

18 Example: Rating prediction for making recommendations 5/5
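The rating-prediction idea can be sketched as user-based k-NN collaborative filtering (a minimal illustration, not taken from the slides, whose details are not in this transcript; the cosine similarity over co-rated items, the tiny rating matrix, and the similarity-weighted average are all assumptions):

```python
import numpy as np

def predict_rating(user, item, R, k=2):
    """Predict R[user, item] from the k most similar users who rated the item.
    R is a users-by-items rating matrix; 0 means 'not rated'."""
    rated = [u for u in range(R.shape[0]) if u != user and R[u, item] > 0]

    def cos(u):
        a, b = R[user], R[u]
        mask = (a > 0) & (b > 0)            # compare on co-rated items only
        if not mask.any():
            return 0.0
        return float(a[mask] @ b[mask] /
                     (np.linalg.norm(a[mask]) * np.linalg.norm(b[mask]) + 1e-12))

    sims = sorted(((cos(u), u) for u in rated), reverse=True)[:k]
    num = sum(s * R[u, item] for s, u in sims)  # similarity-weighted ratings
    den = sum(abs(s) for s, u in sims) + 1e-12
    return num / den

# Hypothetical 3-users x 3-items matrix; predict user 0's rating of item 2.
R = np.array([[5, 4, 0],
              [5, 4, 3],
              [1, 2, 5]], dtype=float)
print(predict_rating(0, 2, R, k=2))  # close to the most similar user's rating
```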

19 Chapter 5. Non-Parametric Supervised Learning
Parzen window
k-nearest-neighbor (k-NN)
Sparse coding

20 Chap 5. Non-Parametric Supervised Learning
Sparse coding
Instead of finding the "closest" samples, we want to find a small set of "correlated" samples that together represent the input.

21 Chap 5. Non-Parametric Supervised Learning
Sparse coding
Sparse coding aims to solve min_α ||α||₀ s.t. x = Dα, where the columns of D are the stored samples (atoms).
Sparse coding can be relaxed to deal with noise or corruption in the data, for example:
To deal with (Gaussian) noise: min_α ||x − Dα||₂² + λ||α||₁.
To deal with (sparse) corruption: min_{α,e} ||α||₁ + ||e||₁ s.t. x = Dα + e.
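The Gaussian-noise relaxation (the lasso form above) can be solved by iterative soft-thresholding, ISTA (a minimal sketch, not from the slides; the random dictionary, λ, and iteration count are assumptions for illustration):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(x, D, lam=0.1, n_iter=500):
    """Solve min_a 0.5*||x - D a||^2 + lam*||a||_1 by gradient steps
    followed by soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        a = soft_threshold(a + D.T @ (x - D @ a) / L, lam / L)
    return a

# Recover a sparse code: x combines 2 of 20 dictionary atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((10, 20))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
a_true = np.zeros(20)
a_true[3], a_true[7] = 1.0, -0.5
x = D @ a_true
a_hat = ista(x, D, lam=0.01)
print(np.nonzero(np.abs(a_hat) > 0.1)[0])    # indices of the large coefficients
```

With a small λ and noiseless x, the recovered code is sparse and reconstructs x closely; larger λ trades reconstruction accuracy for sparsity.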

22 Example: Robust face recognition 1/2
Wright, J., Yang, A. Y., Ganesh, A., Sastry, S. S., & Ma, Y. (2009). Robust face recognition via sparse representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(2).

23 Example: Robust face recognition 2/2
The sparse-representation approach is also able to identify valid vs. invalid input.

24 Chap 5. Non-Parametric Supervised Learning
Chapter summary
Dictionary: instance-based learning; memory-based learning.
Toolbox: k-NN; Parzen window; sparse coding.

