
1 CS 461: Machine Learning, Lecture 4 (Winter 2009)
Dr. Kiri Wagstaff, wkiri@wkiri.com

2 Plan for Today
- Solution to HW 2
- Support Vector Machines
- Neural Networks: perceptrons, multilayer perceptrons

3 Review from Lecture 3
- Decision trees: regression trees, pruning, extracting rules
- Evaluation: comparing two classifiers with McNemar's test
- Support Vector Machines: classification with linear discriminants and the maximum margin; learning (optimization) via gradient descent or QP

4 Neural Networks (Chapter 11: It Is Pitch Dark)

5 Perceptron [Alpaydin 2004, The MIT Press] (figure: graphical and mathematical views of the perceptron)
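A perceptron computes a weighted sum of its inputs plus a bias and thresholds the result. A minimal sketch in Python, assuming NumPy; the input and weight values are hypothetical:

import numpy as np

def perceptron_output(x, w, w0):
    """Weighted sum of inputs plus bias, passed through a hard threshold."""
    a = np.dot(w, x) + w0          # activation a = w . x + w0
    return 1 if a > 0 else 0       # step output

# Hypothetical example: two inputs with hand-picked weights
x = np.array([1.0, 0.5])
print(perceptron_output(x, w=np.array([0.4, -0.2]), w0=0.1))   # prints 1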

6 Smooth Output: Sigmoid Function
- Why? It converts the output to a probability
- Gives a less brittle decision boundary
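The sigmoid squashes the perceptron's activation into (0, 1), so the output can be read as a probability rather than a hard 0/1 decision. A minimal sketch, assuming NumPy:

import numpy as np

def sigmoid(a):
    """sigmoid(a) = 1 / (1 + exp(-a)): maps any real activation into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a))

print(sigmoid(0.0), sigmoid(3.0), sigmoid(-3.0))   # ~0.5, ~0.95, ~0.05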

7 K Outputs [Alpaydin 2004, The MIT Press] (figure: linear outputs for regression; softmax for classification)
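With K outputs, classification uses softmax to turn the K activations into probabilities that sum to 1. A minimal sketch, assuming NumPy; the activation values are hypothetical:

import numpy as np

def softmax(a):
    """Turn a vector of K output activations into probabilities summing to 1."""
    e = np.exp(a - np.max(a))      # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical three-class output activations
print(softmax(np.array([2.0, 1.0, 0.1])))   # approx [0.66, 0.24, 0.10]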

8 Training a Neural Network
1. Randomly initialize the weights
2. Update each weight: update = learning rate * (desired - actual) * input
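Step 2 is the perceptron (delta) rule: each weight moves in proportion to the error times the input that fed it. A minimal sketch, assuming NumPy; the example values and learning rate are hypothetical:

import numpy as np

def update(w, w0, x, desired, actual, eta=0.1):
    """One step: delta_w = learning rate * (desired - actual) * input."""
    error = desired - actual
    w  = w  + eta * error * x      # per-weight update
    w0 = w0 + eta * error          # bias update (its input is fixed at 1)
    return w, w0

# Hypothetical single step on a misclassified example (desired 1, actual 0)
w, w0 = update(np.zeros(2), 0.0, x=np.array([1.0, 1.0]), desired=1, actual=0)
print(w, w0)   # [0.1 0.1] 0.1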

9 Learning Boolean AND [Alpaydin 2004, The MIT Press] (perceptron demo)
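A single perceptron can learn AND because the four input points are linearly separable. A minimal training-loop sketch combining the pieces above; the learning rate and epoch count are hypothetical choices, not taken from the slides:

import numpy as np

# Truth table for Boolean AND: inputs and desired outputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)

w, w0, eta = np.zeros(2), 0.0, 0.5
for epoch in range(20):                        # a few passes over the data suffice
    for x, desired in zip(X, t):
        actual = 1.0 if np.dot(w, x) + w0 > 0 else 0.0
        w  += eta * (desired - actual) * x     # delta-rule update from the previous slide
        w0 += eta * (desired - actual)

print([1 if np.dot(w, x) + w0 > 0 else 0 for x in X])   # expected: [0, 0, 0, 1]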

10 Multilayer Perceptrons (MLP = ANN) [Alpaydin 2004, The MIT Press]
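An MLP adds a hidden layer of sigmoid units between input and output, which is what lets it represent functions a single perceptron cannot. A minimal forward-pass sketch; the layer sizes and random weights are hypothetical:

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # input (2 units) -> hidden (3 units)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # hidden (3 units) -> output (1 unit)

def mlp_forward(x):
    z = sigmoid(W1 @ x + b1)       # hidden-layer activations
    return sigmoid(W2 @ z + b2)    # output in (0, 1)

print(mlp_forward(np.array([1.0, 0.0])))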

11 XOR: x1 XOR x2 = (x1 AND ~x2) OR (~x1 AND x2) [Alpaydin 2004, The MIT Press]
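XOR is not linearly separable, so no single perceptron can compute it, but the decomposition on the slide maps directly onto a two-layer network: one hidden unit per AND term and an output unit that ORs them. A sketch with hand-picked (not learned) weights:

def step(a):
    return 1 if a > 0 else 0

def xor_net(x1, x2):
    h1 = step( x1 - x2 - 0.5)      # hidden unit 1: x1 AND (NOT x2)
    h2 = step(-x1 + x2 - 0.5)      # hidden unit 2: (NOT x1) AND x2
    return step(h1 + h2 - 0.5)     # output unit: h1 OR h2

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])   # [0, 1, 1, 0]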

12 Examples: digit recognition, ball balancing

13 ANN vs. SVM
- An SVM with a sigmoid kernel is equivalent to a 2-layer MLP
- Parameters: ANN has the number of hidden layers and nodes; SVM has the kernel, kernel parameters, and C
- Optimization: ANN finds a local minimum (gradient descent); SVM finds the global minimum (QP)
- Interpretability: about the same
- So why SVMs? Sparse solution, geometric interpretation, less likely to overfit the data

14 Summary: Key Points for Today
- Support Vector Machines
- Neural Networks: perceptrons, the sigmoid, training by gradient descent, multilayer perceptrons
- ANN vs. SVM

15 Next Time
- Midterm Exam! 9:10 – 10:40 a.m.; open book, open notes (no computer); covers all material through today
- Neural Networks (read Ch. 11.1-11.8); questions to answer from the reading are posted on the website (calendar); three volunteers?

