
Sanguthevar Rajasekaran University of Connecticut




1 Machine Learning
Sanguthevar Rajasekaran, University of Connecticut

2 Machine Learning

3 Machine Learning

4 Machine Learning In practice we may not know the form of the function, and the observations may contain errors. We therefore guess the form of the function. Each candidate function is called a model and is characterized by a set of parameters. We choose the parameter values that minimize the difference between the model's outputs and the true function values. There are two kinds of learning: supervised and unsupervised. In supervised learning we are given a set of examples (x, y), and the goal is to infer the conditional distribution P(y | x) used for prediction. In unsupervised learning the goal is to estimate the data-generating distribution P(x) after observing many random vectors x drawn from it.
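As a minimal sketch of this idea (not from the slides): we guess a linear model form and choose its parameters by least squares, i.e., by minimizing the squared difference between model outputs and noisy observations. The data here is synthetic and the parameter names are mine.

```python
import numpy as np

# Hypothetical setup: the "true" function is y = 2x + 1, observed with noise.
# We guess a linear model y = a*x + b and fit its parameters a, b.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)  # noisy observations

# Least-squares fit: minimize sum of (model output - observed value)^2.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(a, b)  # should be close to the true parameters 2.0 and 1.0
```

The same recipe (pick a parametric model, minimize a training loss) underlies the supervised-learning setting described above.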

5 Machine Learning A machine learning algorithm is evaluated both on the training data and on previously unseen data points. Thus we have a training error and a test error. A model is said to underfit if its training error is not low enough. A model is said to overfit if the gap between its training and test errors is very large; in this case the model has memorized the particulars of the training data too closely. We can adjust the underfitting and overfitting behavior of a learning algorithm by changing the capacity of the model.

6 Neural Networks 1 Neural networks are learning paradigms inspired by the human brain. The brain consists of billions of neurons interconnected in a complex network. Even though each individual neuron is limited in power, the collection can produce impressive results.

7 Neural Networks 2 A neural network is a connected, leveled graph. Each node is a processor, and the (directed) edges correspond to communication links. A leveled graph has at least two levels: one for the inputs and one for the outputs. There may be additional levels in between, called hidden levels.

8 Neural Networks 3 Each edge in the network has an associated weight, and each node has a threshold. Consider a two-level network. Let N be any node in the second level (i.e., the output level). Let q be the number of incoming edges into N, and let x1, x2, …, xq be the corresponding input values. Let the weights on these incoming edges be w1, w2, …, wq, and denote the threshold value of N as T. Consider the case where each output is binary. If w1x1 + w2x2 + … + wqxq > T, then the node N outputs one of the two possible values; otherwise it outputs the other.
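The threshold computation described above can be sketched in a few lines (the function name and the example weights are mine, not from the slides):

```python
# Sketch of the binary threshold node N described above.
def threshold_node(inputs, weights, T):
    """Output 1 if the weighted sum of the inputs exceeds threshold T, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > T else 0

# Example: with weights (1, 1) and threshold 1.5, the node computes logical AND,
# since the weighted sum exceeds 1.5 only when both inputs are 1.
print(threshold_node((1, 1), (1, 1), 1.5))  # prints 1
print(threshold_node((1, 0), (1, 1), 1.5))  # prints 0
```

Choosing different weights and thresholds makes the same node compute different Boolean functions, which is the basis of the representational claims on the later slides.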

9 Neural Networks 4

10 Neural Networks 5 There are patterns that a perceptron (a single threshold node of the kind just described) cannot learn.
For example, there is no perceptron corresponding to the XOR function. (XOR is a binary function defined as follows: if x and y are Boolean, then x XOR y is 1 if and only if exactly one of x and y is 1; x XOR y is 0 otherwise.)

11 Neural Networks 6 For appropriate values of the node thresholds and edge weights, a neural network can represent an arbitrarily complex function or concept. A neural network can be made to learn a given concept by training it on a number of examples, where an example is simply a pair consisting of an input and the corresponding output value.
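To connect this claim with the previous slide: while no single perceptron computes XOR, a network with one hidden level can. The sketch below uses hand-chosen (not learned) weights and thresholds; the structure and constants are my own illustrative assumptions.

```python
# Sketch: a network with one hidden level representing XOR, which a single
# perceptron cannot. Weights and thresholds are chosen by hand, not learned.
def unit(inputs, weights, T):
    """Binary threshold node: 1 if the weighted input sum exceeds T, else 0."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > T else 0

def xor_net(x, y):
    h1 = unit((x, y), (1, 1), 0.5)       # hidden node: fires on x OR y
    h2 = unit((x, y), (1, 1), 1.5)       # hidden node: fires on x AND y
    return unit((h1, h2), (1, -1), 0.5)  # output: OR minus AND, i.e., XOR

for x in (0, 1):
    for y in (0, 1):
        print(x, y, xor_net(x, y))
```

The hidden level is what gives the network its extra representational power: the output node only has to separate the hidden activations, which are linearly separable even though the raw XOR inputs are not.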

12 Neural Networks 7 A general ANN looks like:

13 Neural Networks 8

14 Neural Networks 9

15 Neural Networks 10

16 Neural Networks 11

