
1 Neural Network Computing, Lecture no. 1 (All rights reserved, L. Manevitz)

2 McCulloch-Pitts Neuron
The activation of a McCulloch-Pitts neuron is binary: it either fires (1) or does not (0). McCulloch-Pitts neurons are connected by directed, weighted paths. Each neuron has a fixed threshold, and it fires exactly when its weighted input sum reaches that threshold.
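A minimal sketch of such a unit in Python (the function name mp_neuron and the example weights are illustrative, not from the slides):

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires (outputs 1) iff the weighted input sum reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# A unit with two excitatory inputs and threshold 2 fires only when both inputs are on.
print(mp_neuron([1, 1], [1, 1], 2))  # 1
print(mp_neuron([1, 0], [1, 1], 2))  # 0
```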

3 Architecture

4 Theorem
- We can model any function or phenomenon that can be represented as a logic function.
- In the first step we show that a single neuron can compute simple logic functions such as AND, OR, and NOT.
- In the second step we use these simple neurons as building blocks. (Recall that every logic function is representable in DNF.)

5 AND
Weights w1 = w2 = 1, threshold 1.5:
x1 x2 | x1*1 + x2*1 | vs. 1.5  | output
 0  0 |      0      | 0 < 1.5  |   0
 0  1 |      1      | 1 < 1.5  |   0
 1  0 |      1      | 1 < 1.5  |   0
 1  1 |      2      | 2 > 1.5  |   1

6 OR
Weights w1 = w2 = 1, threshold 0.9:
x1 x2 | x1*1 + x2*1 | vs. 0.9  | output
 0  0 |      0      | 0 < 0.9  |   0
 0  1 |      1      | 1 > 0.9  |   1
 1  0 |      1      | 1 > 0.9  |   1
 1  1 |      2      | 2 > 0.9  |   1

7 NOT
Weight -1, threshold -0.5:
x | x*(-1) | vs. -0.5   | output
0 |    0   |  0 > -0.5  |   1
1 |   -1   | -1 < -0.5  |   0
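The three gates above can be checked directly; a short Python sketch using the slides' weights and thresholds (the helper name fires is illustrative):

```python
def fires(inputs, weights, threshold):
    """McCulloch-Pitts rule: output 1 iff the weighted sum reaches the threshold."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        AND = fires([x1, x2], [1, 1], 1.5)   # slide 5
        OR  = fires([x1, x2], [1, 1], 0.9)   # slide 6
        print(x1, x2, "AND:", AND, "OR:", OR)

for x in (0, 1):
    print(x, "NOT:", fires([x], [-1], -0.5))  # slide 7
```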

8 DNF
DNF (disjunctive normal form): every logic function can be written as an OR of ANDs of (possibly negated) inputs, e.g. f(x1, x2) = (x1 AND NOT x2) OR (NOT x1 AND x2).
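As an illustration of the building-block idea (this particular construction and its weights are not from the slides), a two-layer McCulloch-Pitts network computing XOR from its DNF, one AND-like unit per conjunct and then an OR unit:

```python
def fires(inputs, weights, threshold):
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def xor(x1, x2):
    # Layer 1: one unit per DNF conjunct.
    h1 = fires([x1, x2], [1, -1], 1)   # x1 AND NOT x2
    h2 = fires([x1, x2], [-1, 1], 1)   # NOT x1 AND x2
    # Layer 2: OR of the conjuncts.
    return fires([h1, h2], [1, 1], 1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))   # 0 1 1 0
```

This is exactly the two-step plan of slide 4: single neurons compute the conjuncts, and one more neuron ORs them together.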

9 Biases and Thresholds
We can replace the threshold with a bias. A bias acts exactly like a weight on a connection from a unit whose activation is always 1: a neuron with weights w and threshold θ is equivalent to a neuron with weights (w, -θ), input (x, 1), and threshold 0.
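A small sanity check of this equivalence (the AND unit of slide 5 is reused; the check itself is illustrative):

```python
w, theta = [1.0, 1.0], 1.5            # AND unit from slide 5
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    with_threshold = sum(xi * wi for xi, wi in zip(x, w)) >= theta
    with_bias = sum(xi * wi for xi, wi in zip(x + [1], w + [-theta])) >= 0
    assert with_threshold == with_bias
print("threshold form and bias form agree on all inputs")
```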

10 Perceptron
Loop: take an example and apply it to the network. If the answer is correct, return to Loop. If incorrect, go to Fix.
Fix: adjust the network weights using the input example. Go to Loop.

11 Perceptron Algorithm
Let w be arbitrary.
Choose: choose an example x from F = F+ ∪ F-.
Test: If w·x > 0 and x ∈ F+, go to Choose.
      If w·x ≤ 0 and x ∈ F+, go to Fix plus.
      If w·x < 0 and x ∈ F-, go to Choose.
      If w·x ≥ 0 and x ∈ F-, go to Fix minus.
Fix plus: w := w + x; go to Choose.
Fix minus: w := w - x; go to Choose.
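A runnable sketch of this loop (the OR data set, the random choice among wrong examples, and names like F_plus are illustrative; the slides' unit-vector normalization is omitted):

```python
import random

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def perceptron(F_plus, F_minus, max_fixes=1000):
    """Choose/Test/Fix loop: returns w with w.x > 0 on F+ and w.x < 0 on F-."""
    dim = len((F_plus + F_minus)[0])
    w = [0.0] * dim                       # "let w be arbitrary"; zero is a common start
    for _ in range(max_fixes):
        wrong = [(x, +1) for x in F_plus if dot(w, x) <= 0] + \
                [(x, -1) for x in F_minus if dot(w, x) >= 0]
        if not wrong:
            return w                      # every Test now goes back to Choose
        x, sign = random.choice(wrong)    # Fix plus (+1) or Fix minus (-1)
        w = [wi + sign * xi for wi, xi in zip(w, x)]
    raise RuntimeError("no separating w found within max_fixes")

# OR with a bias component appended: x = (x1, x2, 1).
F_plus  = [(0, 1, 1), (1, 0, 1), (1, 1, 1)]
F_minus = [(0, 0, 1)]
print(perceptron(F_plus, F_minus))
```

On linearly separable data such as OR the loop provably terminates; the next slides bound the number of fixes.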

12 Perceptron Algorithm
Conditions for the algorithm to terminate:
Condition no. 1: the examples are linearly separable with a margin, i.e. there is a unit vector w* and a δ > 0 such that w*·x ≥ δ for every positive example x (and w*·x ≤ -δ for every negative one).
Condition no. 2: the examples are bounded; we choose F to be a set of unit vectors (|x| = 1).

13 Geometric viewpoint

14 Perceptron Algorithm
Under these conditions, the number of times we enter Fix (the number of weight changes) is finite.
Proof:
[Figure: the space of examples, divided into positive examples and negative examples.]

15 Perceptron Algorithm - Proof
We replace the threshold with a bias, and we assume F is a set of unit vectors.

16 Perceptron Algorithm - Proof
We simplify what has to be proved by eliminating the negative examples: replace each negative example x by its negation -x among the positive examples. Nothing is lost, since w·x < 0 iff w·(-x) > 0, and Fix minus on x is exactly Fix plus on -x.

17 Perceptron Algorithm - Proof
Consider the angle between w and the separating unit vector w*: cos θ = (w*·w) / |w|.
The numerator: each fix replaces w by w + x with w*·x ≥ δ, so w*·(w + x) = w*·w + w*·x ≥ w*·w + δ.
After n changes (starting from w = 0): w*·w ≥ n·δ.

18 Perceptron Algorithm - Proof
The denominator: a fix happens only when w·x ≤ 0, so |w + x|² = |w|² + 2·(w·x) + |x|² ≤ |w|² + 1 (recall |x| = 1).
After n changes: |w|² ≤ n.

19 Perceptron Algorithm - Proof
From the numerator and the denominator:
1 ≥ cos θ = (w*·w) / |w| ≥ (n·δ) / √n = √n·δ,
so n ≤ 1/δ²: n is finite.
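A quick empirical check of the bound n ≤ 1/δ² (the data set, the margin δ = 0.2, and the fix order are illustrative):

```python
import random

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return sum(x * x for x in a) ** 0.5

# After the reduction, all examples are "positive": we need w with w.x > 0.
# Build unit vectors that have margin at least 0.2 against a fixed unit vector w*.
w_star = (0.6, 0.8)
random.seed(0)
F = []
while len(F) < 50:
    v = (random.uniform(-1, 1), random.uniform(-1, 1))
    u = (v[0] / norm(v), v[1] / norm(v))
    if dot(w_star, u) >= 0.2:
        F.append(u)

w, fixes = [0.0, 0.0], 0
while any(dot(w, x) <= 0 for x in F):
    x = next(x for x in F if dot(w, x) <= 0)   # a wrongly classified example
    w = [wi + xi for wi, xi in zip(w, x)]      # Fix plus
    fixes += 1

print("fixes:", fixes, "bound 1/delta^2 =", 1 / 0.2**2)  # fixes <= 25
```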

20 Example - AND
Write the bias as a first input that is always 1; the training set is then:
(1, 0, 0) → 0
(1, 1, 0) → 0
(1, 0, 1) → 0
(1, 1, 1) → 1
Running the algorithm: an example is classified wrong, so we fix the weights, etc…

21 AND - Bipolar solution
[Worked trace: the perceptron algorithm on AND with bipolar (+1/-1) inputs and targets; a wrong classification triggers a fix, after which the run continues and succeeds.]
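A sketch that reproduces such a run (the bipolar encoding with a leading bias input, the zero starting weights, and the example order are assumptions, since the original trace table is garbled):

```python
def sign(v):
    """Bipolar activation: +1 or -1."""
    return 1 if v >= 0 else -1

# AND in bipolar encoding, with a leading bias input of 1.
data = [((1, -1, -1), -1), ((1, -1, 1), -1), ((1, 1, -1), -1), ((1, 1, 1), 1)]

w = [0.0, 0.0, 0.0]
changed = True
while changed:
    changed = False
    for x, t in data:
        y = sign(sum(wi * xi for wi, xi in zip(w, x)))
        if y != t:                                   # wrong -> fix
            w = [wi + t * xi for wi, xi in zip(w, x)]
            changed = True
            print("fix on", x, "->", w)
print("success:", w)
```

With this ordering a single fix already yields weights that classify all four examples correctly.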

22 Problem
No single neuron can compute XOR. The threshold θ should be small enough that w1 > θ and w2 > θ (inputs 01 and 10 must fire), yet inputs 00 and 11 must not fire, forcing θ ≥ 0 and w1 + w2 ≤ θ. But then w1 + w2 > 2θ ≥ θ: contradiction !!!

23 Linear Separation
Every perceptron determines a classification of its vector inputs by a hyperplane (in two dimensions, a line).
Two-dimensional examples (add algebra): OR and AND are linearly separable; XOR is not possible.
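A brute-force illustration in Python (the weight/threshold grid is illustrative): a separating line is found for AND and OR, but none for XOR:

```python
from itertools import product

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
gates = {
    "AND": [0, 0, 0, 1],
    "OR":  [0, 1, 1, 1],
    "XOR": [0, 1, 1, 0],
}

grid = [x / 2 for x in range(-4, 5)]   # -2.0 .. 2.0 in steps of 0.5
for name, targets in gates.items():
    found = None
    for w1, w2, theta in product(grid, grid, grid):
        outs = [1 if w1 * x1 + w2 * x2 >= theta else 0 for x1, x2 in inputs]
        if outs == targets:
            found = (w1, w2, theta)
            break
    print(name, "->", found)           # XOR -> None
```

For XOR the failure is not an artifact of the coarse grid: the contradiction on slide 22 shows that no weights and threshold exist at all.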

24 Linear Separation in Higher Dimensions
In higher dimensions the separation is still linear (a hyperplane), but it is hard to visualize.
Example: connectedness vs. convexity. Convexity can be handled by a perceptron with local sensors; connectedness cannot. Note: define local sensors.

