Additive Models, Trees, and Related Models
Prof. Liqing Zhang, Dept. of Computer Science & Engineering, Shanghai Jiaotong University


1 Additive Models, Trees, and Related Models. Prof. Liqing Zhang, Dept. of Computer Science & Engineering, Shanghai Jiaotong University

2 Introduction
9.1: Generalized Additive Models
9.2: Tree-Based Methods
9.3: PRIM: Bump Hunting
9.4: MARS: Multivariate Adaptive Regression Splines
9.5: HME: Hierarchical Mixtures of Experts

3 9.1 Generalized Additive Models
In the regression setting, a generalized additive model has the form:
E(Y | X_1, ..., X_p) = α + f_1(X_1) + f_2(X_2) + ... + f_p(X_p)
Here the f_j's are unspecified smooth, nonparametric functions. Instead of using a linear basis expansion (LBE, Chapter 5), we fit each function using a scatterplot smoother (e.g. a cubic smoothing spline).
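
As a quick illustration of fitting such a model in code, a minimal sketch assuming the third-party pygam package and synthetic data invented here; each s(j) term plays the role of one smooth function f_j:

```python
import numpy as np
from pygam import LinearGAM, s  # third-party package: pip install pygam

# Hypothetical data: 3 predictors, continuous response.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=200)

# One smooth term s(j) per predictor: y ~ alpha + f_0(X_0) + f_1(X_1) + f_2(X_2)
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, y)
gam.summary()
```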

4 GAM (cont.)
For two-class classification, the additive logistic regression model is:
log[ μ(X) / (1 − μ(X)) ] = α + f_1(X_1) + ... + f_p(X_p)
Here μ(X) = Pr(Y = 1 | X).

5 GAM (cont.)
In general, the conditional mean μ(X) of a response Y is related to an additive function of the predictors via a link function g:
g[μ(X)] = α + f_1(X_1) + ... + f_p(X_p)
Examples of classical link functions:
Identity: g(μ) = μ
Logit: g(μ) = log[μ / (1 − μ)]
Probit: g(μ) = Φ^(−1)(μ), the inverse standard-normal CDF
Log: g(μ) = log(μ)
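
The four links written out directly (a small NumPy/SciPy sketch; the function names are just illustrative):

```python
import numpy as np
from scipy.stats import norm

# Classical link functions g(mu): each maps the conditional mean mu(X)
# onto the scale of the additive predictor alpha + sum_j f_j(X_j).
def identity(mu): return mu                     # Gaussian response
def logit(mu):    return np.log(mu / (1 - mu))  # binomial probabilities
def probit(mu):   return norm.ppf(mu)           # inverse standard-normal CDF
def log_link(mu): return np.log(mu)             # Poisson counts

mu = np.array([0.1, 0.5, 0.9])
print(logit(mu), probit(mu), log_link(mu))
```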

6 Fitting Additive Models
The additive model has the form:
Y = α + Σ_j f_j(X_j) + ε, where the error ε has mean zero.
Given observations (x_i, y_i), a criterion like the penalized residual sum of squares can be specified for this problem:
PRSS(α, f_1, ..., f_p) = Σ_{i=1}^N [ y_i − α − Σ_j f_j(x_ij) ]^2 + Σ_j λ_j ∫ f_j''(t_j)^2 dt_j
where the λ_j ≥ 0 are tuning parameters.

7 FAM (cont.)
Conclusions:
The minimizer of PRSS is an additive cubic spline model (each f_j is a cubic smoothing spline in x_ij); however, without further restrictions the solution is not unique.
If the standard restriction Σ_{i=1}^N f_j(x_ij) = 0 for all j holds, it is easy to see that α̂ = ave(y_i).
If, in addition to this restriction, the matrix of input values has full column rank, then (9.7) is a strictly convex criterion and has a unique solution.
If the matrix is singular, then the linear part of f_j cannot be uniquely determined (Buja et al., 1989).

8 Learning GAM: Backfitting
The backfitting algorithm:
1. Initialize: α̂ = (1/N) Σ_i y_i, f̂_j ≡ 0 for all j.
2. Cycle over j = 1, 2, ..., p, 1, 2, ..., p, ... (repeated cycles):
   f̂_j ← S_j[ { y_i − α̂ − Σ_{k≠j} f̂_k(x_ik) }_{i=1}^N ]   (smooth the partial residuals against x_ij)
   f̂_j ← f̂_j − (1/N) Σ_i f̂_j(x_ij)   (recenter)
Until the functions f̂_j change less than a prespecified threshold.
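
A bare-bones backfitting sketch in Python. It is only schematic: SciPy's UnivariateSpline stands in for the cubic smoothing spline smoother S_j, and the data and function names are invented for illustration.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def backfit(X, y, n_cycles=20, tol=1e-4, s=None):
    """Backfitting for an additive model y = alpha + sum_j f_j(X_j) + eps.

    Each f_j is estimated with a univariate smoothing spline, a stand-in
    for the smoother S_j in the algorithm above.
    """
    n, p = X.shape
    alpha = y.mean()                    # 1. Initialize: alpha = mean(y), f_j = 0
    f = np.zeros((n, p))
    for _ in range(n_cycles):           # 2. Cycle over j = 1..p repeatedly
        max_change = 0.0
        for j in range(p):
            partial = y - alpha - f.sum(axis=1) + f[:, j]   # partial residuals
            order = np.argsort(X[:, j])
            spl = UnivariateSpline(X[order, j], partial[order], s=s)
            new_fj = spl(X[:, j])
            new_fj -= new_fj.mean()     # recenter so sum_i f_j(x_ij) = 0
            max_change = max(max_change, np.abs(new_fj - f[:, j]).max())
            f[:, j] = new_fj
        if max_change < tol:            # stop when the functions stop changing
            break
    return alpha, f

# Hypothetical usage on synthetic data
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 2))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 3 + rng.normal(scale=0.1, size=300)
alpha, f = backfit(X, y)
print(alpha)
```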

9 Backfitting: Points to Ponder Computational Advantage? Convergence? How to choose fitting functions?

10 FAM(cont.)

11 Additive Logistic Regression

12 Logistic Regression
Model the class posteriors in terms of K − 1 log-odds (logits):
log[ Pr(G = k | X = x) / Pr(G = K | X = x) ] = β_k0 + β_k^T x,  k = 1, ..., K − 1
The decision boundary between classes k and l is the set of points where δ_k(x) = δ_l(x).
Linear discriminant function for class k: δ_k(x) = β_k0 + β_k^T x
Classify to the class with the largest value of its δ_k(x).

13 Logistic Regression (cont.)
Parameter estimation:
– Objective function: the conditional log-likelihood, ℓ(β) = Σ_{i=1}^N log p_{g_i}(x_i; β)
– Parameters estimated by IRLS (iteratively reweighted least squares)
In particular, for the two-class case, using the Newton-Raphson algorithm to solve the score equations, the objective function is:
ℓ(β) = Σ_{i=1}^N { y_i β^T x_i − log(1 + exp(β^T x_i)) }
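
A compact Newton-Raphson/IRLS sketch for the two-class objective above (plain NumPy; the synthetic data and the convention that X already contains an intercept column are assumptions of this example):

```python
import numpy as np

def logistic_irls(X, y, n_iter=25, tol=1e-8):
    """Fit two-class logistic regression by Newton-Raphson (IRLS).

    X: (n, p) design matrix, including a column of ones for the intercept.
    y: (n,) vector of 0/1 labels.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        prob = 1.0 / (1.0 + np.exp(-eta))   # fitted probabilities p_i
        w = prob * (1.0 - prob)             # IRLS weights w_i
        z = eta + (y - prob) / w            # working response z_i
        # Weighted least squares step: beta <- (X^T W X)^{-1} X^T W z
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, X.T @ (w * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Hypothetical usage on synthetic data
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
true_beta = np.array([-0.5, 2.0, -1.0])
y = (rng.uniform(size=500) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
print(logistic_irls(X, y))
```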

14 Logistic Regression (cont.)

18 When it is used:
– Binary responses (two classes)
– As a data analysis and inference tool to understand the role of the input variables in explaining the outcome
Feature selection:
– Find a subset of the variables that are sufficient for explaining their joint effect on the response
– One way is to repeatedly drop the least significant coefficient and refit the model until no further terms can be dropped
– Another strategy is to refit each model with one variable removed, and perform an analysis of deviance to decide which variable to exclude
Regularization:
– Maximum penalized likelihood
– Shrinking the parameters via an L1 constraint, imposing a margin constraint in the separable case (see the sketch below)
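
As one concrete way to take the L1 route, scikit-learn's LogisticRegression accepts an L1 penalty; a minimal sketch on invented data (the parameter choices are illustrative only):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 10))   # 10 candidate predictors, most irrelevant
y = (X[:, 0] - 2 * X[:, 1] + rng.normal(size=300) > 0).astype(int)

# The L1 (lasso-style) penalty shrinks coefficients and sets many exactly to
# zero, giving an automatic form of feature selection. C is the inverse
# penalty strength.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
print(clf.coef_)                 # most entries are exactly 0
```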

19 Additive Logistic Regression

20 Additive Logistic Regression

21 Additive Logistic Regression: Backfitting
Fitting logistic regression (IRLS):
1. Initialize β̂ (e.g. β̂ = 0).
2. Iterate:
   a. Compute the fitted probabilities p_i = 1 / (1 + exp(−x_i^T β̂)).
   b. Form the working response z_i = x_i^T β̂ + (y_i − p_i) / [p_i (1 − p_i)] and weights w_i = p_i (1 − p_i).
   c. Use weighted least squares to fit a linear model to z_i with weights w_i, giving new estimates β̂.
3. Continue step 2 until convergence.
Fitting additive logistic regression (local scoring):
1. Initialize α̂ = log[ ȳ / (1 − ȳ) ], where ȳ = ave(y_i); set f̂_j ≡ 0 for all j.
2. Iterate:
   a. Compute η̂_i = α̂ + Σ_j f̂_j(x_ij) and p̂_i = 1 / (1 + exp(−η̂_i)).
   b. Form the working response z_i = η̂_i + (y_i − p̂_i) / [p̂_i (1 − p̂_i)] and weights w_i = p̂_i (1 − p̂_i).
   c. Use the weighted backfitting algorithm to fit an additive model to z_i with weights w_i, giving new estimates α̂ and f̂_j.
3. Continue step 2 until the functions converge.
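
A schematic Python version of the local scoring loop above, with a weighted polynomial fit (np.polyfit with weights) standing in for the weighted cubic-spline smoother inside backfitting; all names and data here are illustrative, not the book's implementation:

```python
import numpy as np

def weighted_smoother(x, z, w, deg=3):
    """Crude weighted smoother: a weighted polynomial fit stands in for a
    weighted cubic smoothing spline."""
    coef = np.polyfit(x, z, deg, w=np.sqrt(w))
    return np.polyval(coef, x)

def additive_logistic(X, y, n_outer=10, n_inner=10):
    n, p = X.shape
    ybar = y.mean()
    alpha = np.log(ybar / (1 - ybar))   # 1. alpha = log[ybar / (1 - ybar)], f_j = 0
    f = np.zeros((n, p))
    for _ in range(n_outer):            # 2. outer (local scoring) iterations
        eta = alpha + f.sum(axis=1)     # a. additive predictor and probabilities
        prob = 1 / (1 + np.exp(-eta))
        w = prob * (1 - prob)           # b. IRLS weights ...
        z = eta + (y - prob) / w        #    ... and working response
        # c. weighted backfitting of an additive model to z with weights w
        for _ in range(n_inner):
            for j in range(p):
                partial = z - alpha - f.sum(axis=1) + f[:, j]
                fj = weighted_smoother(X[:, j], partial, w)
                f[:, j] = fj - np.average(fj, weights=w)   # weighted recentering
        alpha = np.average(z - f.sum(axis=1), weights=w)
    return alpha, f

# Hypothetical usage on synthetic two-class data
rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, size=(400, 2))
y = (rng.uniform(size=400) < 1 / (1 + np.exp(-(np.sin(X[:, 0]) + X[:, 1])))).astype(float)
alpha, f = additive_logistic(X, y)
print(alpha)
```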

22 SPAM Detection via Additive Logistic Regression
Input variables (predictors):
– 48 quantitative variables: the percentage of words in the email that match a given word. Examples include business, address, internet, etc.
– 6 quantitative variables: the percentage of characters in the email that match a given character, such as ';', '(', etc.
– The average length of uninterrupted sequences of capital letters
– The length of the longest uninterrupted sequence of capital letters
– The sum of the lengths of uninterrupted sequences of capital letters
Output variable: SPAM (1) or email (0)
The f_j's are taken as cubic smoothing splines.
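
To make the predictors concrete, here is a small, hypothetical sketch of how features of this kind could be computed from raw email text (this is not the actual spambase preprocessing):

```python
import re

def capital_run_features(text: str):
    """Lengths of uninterrupted runs of capital letters: average, longest, total."""
    runs = [len(r) for r in re.findall(r"[A-Z]+", text)]
    if not runs:
        return 0.0, 0, 0
    return sum(runs) / len(runs), max(runs), sum(runs)

def word_percentage(text: str, word: str):
    """Percentage of words in the email that match a given word (case-insensitive)."""
    words = re.findall(r"[A-Za-z]+", text.lower())
    return 100.0 * words.count(word.lower()) / max(len(words), 1)

email = "FREE business offer!!! Visit WWW.EXAMPLE.COM for a FREE internet ADDRESS"
print(capital_run_features(email))
print(word_percentage(email, "free"))
```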

25 SPAM Detection: Results
                    Predicted Class
True Class          Email (0)   SPAM (1)
Email (0)             58.5%       2.5%
SPAM (1)               2.7%      36.2%
Sensitivity: probability of predicting spam given the true state is spam = 36.2 / (36.2 + 2.7) ≈ 93.1%
Specificity: probability of predicting email given the true state is email = 58.5 / (58.5 + 2.5) ≈ 95.9%
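
The two rates follow directly from the table; a quick check in Python:

```python
# Cells of the confusion table, as percentages of all test messages.
email_email, email_spam = 58.5, 2.5   # true email predicted email / spam
spam_email, spam_spam = 2.7, 36.2     # true spam predicted email / spam

sensitivity = spam_spam / (spam_spam + spam_email)      # P(pred spam | true spam)
specificity = email_email / (email_email + email_spam)  # P(pred email | true email)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```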

26 GAM: Summary
Useful, flexible extensions of linear models.
The backfitting algorithm is simple and modular.
The interpretability of the predictors (input variables) is not obscured.
Not suitable for very large data-mining applications (why?)

