
1
Computational Learning: An Intuitive Approach

2
Human Learning
- Objects in the world: learning by exploration, and who knows how?
- Language: informal training; inputs may be incorrect
- Programming: a couple of examples of loops or recursions
- Medicine: see one, do one, teach one
- People: few, complex examples; informal training; complex behavioral output

3
Computational Learning
- Representation is provided
- Simple inputs: vectors of values
- Simple outputs: e.g. yes or no, a number, a disease
- Many examples (thousands to millions)
- Quantifiable and useful, e.g. automatic generation of expert systems

4
Concerns
- Generalization accuracy: performance on unseen data; evaluation
- Noise and overfitting
- Biases of representation: you only find what you look for.

5
Three Learning Problems
- Classification: from known examples, create a decision procedure to guess a class (patient data -> guess disease)
- Regression: from known examples, create a decision procedure to guess a real number (stock data -> guess price)
- Clustering: putting data into "meaningful" groups (patient data -> new diseases)

6
Simple Data
- Attribute-value representation: one row = one example
- Sex, age, smoker, etc. are the attributes
- Values are male, 50, true, etc.
- Only data of this form is allowed.
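A minimal sketch of one example in attribute-value form; the attribute names and values are hypothetical, echoing the patient-data examples above:

```python
# Hypothetical patient record in attribute-value form.
# Attributes name the columns; the example supplies one value per attribute.
attributes = ["sex", "age", "smoker"]
example = ["male", 50, True]
label = "yes"  # the class to be predicted, e.g. "has disease"

record = dict(zip(attributes, example))
print(record)  # {'sex': 'male', 'age': 50, 'smoker': True}
```

A learning algorithm sees many such rows, all with the same attributes, plus a class label for each.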

7
The Data: squares and circles (scatter plot of labeled points, with a few unlabeled points marked "?")

8
Linear Decision Boundaries: Perceptron

9
Learning a (Hyper)plane
- Given data, construct a line: the decision boundary
- Usually defined by a normal vector n: a data point x is on the positive side if the dot product x · n > 0
- Recall x · n = x1*n1 + x2*n2 + …
- This is what a single neuron computes.
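The dot-product test above can be sketched in a few lines; the normal vector here is a made-up illustration, not a learned one:

```python
def dot(x, n):
    # Dot product: x1*n1 + x2*n2 + ...
    return sum(xi * ni for xi, ni in zip(x, n))

def positive_side(x, n):
    # A point is on the positive side of the boundary if x . n > 0.
    return dot(x, n) > 0

n = [1.0, -1.0]  # normal of the line x1 = x2 (hypothetical example)
print(positive_side([3, 1], n))  # True:  3*1 + 1*(-1) = 2 > 0
print(positive_side([2, 4], n))  # False: 2*1 + 4*(-1) = -2
```

A perceptron learns by nudging n (and a bias term) whenever a training point lands on the wrong side.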

10
SVM: an optimization approach that picks the "middle" line, i.e. the boundary with maximum margin from both classes

11
Decision Trees: rectangular boundaries

12
1-Nearest Neighbor: piecewise-linear boundaries

13
1-Nearest Neighbor Classification
- If x is an example, find its nearest neighbor NN in the data using Euclidean distance.
- Guess that the class of x is the class of NN.
- k-nearest neighbor: let the k nearest neighbors vote.
- Renamed IBk in Weka.
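The procedure above fits in a few lines; the toy data set here is invented for illustration:

```python
import math
from collections import Counter

def dist(a, b):
    # Euclidean distance between two attribute vectors
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def knn_class(x, data, k=1):
    # data: list of (point, label) pairs.
    # k=1 is plain 1-NN; larger k lets the k nearest neighbors vote.
    neighbors = sorted(data, key=lambda pl: dist(pl[0], x))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

data = [((3, 1), "+"), ((4, 2), "+"), ((2, 4), "-"), ((1, 5), "-")]
print(knn_class((3.5, 1.5), data))       # "+"
print(knn_class((3.5, 1.5), data, k=3))  # "+"
```

Note there is no training step at all: the "model" is just the stored data, which is why the decision boundary is piecewise linear (it follows the Voronoi cells of the training points).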

14
Neural Nets: smooth boundaries

15
Neural Nets
- A single perceptron can't learn some simple concepts, such as XOR.
- A multilayered network of perceptrons can learn any Boolean function.
- The learning procedure is not biological; it follows from multivariable calculus.

16
Gedanken Experiments
- Try ML algorithms on imagined data.
- Example concept: x > y, i.e. the data looks like (3, 1, +), (2, 4, -), etc.
- Which algorithms do best, and how well? Consider the boundaries.
- My guesses: SMO > Perceptron > Nearest Neighbor > Decision Tree.
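Such a data set is easy to generate; the range [0, 5] and the seed below are arbitrary choices, and 199 matches the example count used on the next slide:

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

def make_example():
    # Draw a random point and label it by the target concept x > y.
    x = round(random.uniform(0, 5), 1)
    y = round(random.uniform(0, 5), 1)
    return (x, y, "+" if x > y else "-")

data = [make_example() for _ in range(199)]
print(data[0])  # e.g. something like (4.2, 3.8, '+')
```

The true boundary is the diagonal line x = y, which is exactly a hyperplane: that is why one would expect the linear learners (SMO, perceptron) to beat a decision tree, whose axis-parallel rectangles can only approximate a diagonal with a staircase.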

17
Check Guesses with Weka
- 199 examples
- Decision tree = 92.9% (called J48 in Weka)
- Nearest neighbor = 97.5% (called IB1 in Weka)
- SVM = 99.0% (called SMO in Weka)
