Presentation on theme: "Neural Networks (NN) Part 1 1.NN: Basic Ideas 2.Computational Principles 3.Examples of Neural Computation."— Presentation transcript:

1

2 Neural Networks (NN) Part 1
   1. NN: Basic Ideas
   2. Computational Principles
   3. Examples of Neural Computation

3 1. NN: Basic Ideas
   Neural network: An example

4 A neural network has been shown to be a universal Turing machine: it can compute anything that is computable (Siegelmann & Sontag, 1991). Further, if equipped with appropriate algorithms, a neural network can be made into an intelligent computing machine that solves problems in finite time. Such algorithms have been developed over the years:
   – Backpropagation learning rule (1985)
   – Hebbian learning rule (1949)
   – Kohonen's self-organizing feature map (1982)
   – Hopfield nets (1982)
   – Boltzmann machine (1986)
   Biological neural network (BNN) vs. artificial neural network (ANN)

5 McCulloch-Pitts Networks (1943)
   A model of artificial neurons that computes Boolean logical functions:
      Y = 1 if W1·X1 + W2·X2 ≥ θ, else Y = 0
   where Y, X1, X2 take on binary values of 0 or 1, and W1, W2, θ take on continuous values.
   Example: with W1 = 0.3, W2 = 0.5, θ = 0.6, the network computes Boolean AND:
      X1  X2  Y
      0   0   0
      0   1   0
      1   0   0
      1   1   1
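A minimal sketch of this unit in Python, using the weights and threshold from the slide (the fire-at-threshold convention is the standard M-P convention):

```python
# McCulloch-Pitts unit with the slide's values: W1 = 0.3, W2 = 0.5, theta = 0.6.
def mp_unit(x1, x2, w1=0.3, w2=0.5, theta=0.6):
    # Fire (output 1) iff the weighted input sum reaches the threshold.
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mp_unit(x1, x2))  # reproduces the Boolean AND table
```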

6 McCulloch & Pitts (1943, Fig. 1)

7 Knowledge representation in the M-P network
   Hiring Rule #1: "A job candidate who has either good grades or prior job experience, and also gets strong letters and receives a positive mark in the interview, tends to make a desirable employee and therefore should be hired."
   "Knowledge is in the connection weights (and the threshold)."
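As a hedged illustration (the weights and thresholds below are assumptions chosen to realize the rule, not values from the slide), Rule #1 can be wired as a small two-layer M-P network:

```python
def unit(inputs, weights, theta):
    # Generic M-P unit: fire iff the weighted sum reaches the threshold.
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def hire_rule_1(grades, experience, letters, interview):
    # Hidden unit: "good grades OR prior experience"
    g_or_e = unit([grades, experience], [1.0, 1.0], theta=1.0)
    # Output unit: AND of the disjunction, strong letters, and positive interview
    return unit([g_or_e, letters, interview], [1.0, 1.0, 1.0], theta=3.0)

print(hire_rule_1(1, 0, 1, 1))  # 1: hire
print(hire_rule_1(1, 0, 1, 0))  # 0: no positive interview mark
```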

8 Learning in the M-P network
   Hiring Rule #1A: "A job candidate who has either good grades or prior job experience, and also gets strong letters or receives a positive mark in the interview, tends to make a desirable employee and therefore should be hired."
   "Learning through weight modification (e.g., the Hebb rule)."
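A sketch of how weight modification alone can turn Rule #1 into Rule #1A (the values are again illustrative assumptions): raising the weight from the disjunction unit makes either letters or interview sufficient, with no change to the network structure.

```python
def unit(inputs, weights, theta):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def hire_rule_1a(grades, experience, letters, interview):
    g_or_e = unit([grades, experience], [1.0, 1.0], theta=1.0)
    # With weights [2, 1, 1] and theta = 3, the output fires iff
    # (grades OR experience) AND (letters OR interview): only the
    # weights changed relative to Rule #1.
    return unit([g_or_e, letters, interview], [2.0, 1.0, 1.0], theta=3.0)

print(hire_rule_1a(1, 0, 1, 0))  # 1: letters alone now suffice
print(hire_rule_1a(0, 0, 1, 1))  # 0: still needs grades or experience
```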

9 Acquisition of new knowledge in the M-P network
   Hiring Rule #2: "In addition to Rule #1A, a job candidate who has no prior job experience and receives a negative mark in the interview shouldn't be hired."
   "Acquisition of new knowledge through creation of new neurons and connections (i.e., neurogenesis and synaptogenesis)."
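A sketch of Rule #2 acquired by creating a new unit (illustrative weights; inputs are coded so that 0 means "no experience" or "negative interview mark"): the new "veto" unit fires exactly when both are absent and strongly inhibits the output.

```python
def unit(inputs, weights, theta):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def hire_rule_2(grades, experience, letters, interview):
    g_or_e = unit([grades, experience], [1.0, 1.0], theta=1.0)
    # New unit: fires iff experience == 0 AND interview == 0
    veto = unit([experience, interview], [-1.0, -1.0], theta=0.0)
    # Strong negative weight from the new unit overrides Rule #1A
    return unit([g_or_e, letters, interview, veto],
                [2.0, 1.0, 1.0, -10.0], theta=3.0)

print(hire_rule_2(1, 1, 1, 0))  # 1: has experience, letters suffice
print(hire_rule_2(1, 0, 1, 0))  # 0: no experience and negative interview
```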

10 2. Computational Principles
   1. Distributed representation
   2. Lateral inhibition
   3. Bi-directional interaction
   4. Error-correction learning
   5. Hebbian learning

11 1. Distributed Representation (vs. localist representation)
   An object is represented by multiple units; the same unit participates in multiple representations.

12 Why distributed representation?
   1. Efficiency: it solves the combinatorial explosion problem. With n binary units, 2^n different representations are possible. (e.g., how many English words can be built from combinations of the 26 letters of the alphabet?)
   2. Robustness (fault tolerance): loss of one or a few units does not destroy the representation (e.g., holographic images, the Fourier transform).
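A two-line illustration of the capacity argument (n = 10 is an arbitrary choice):

```python
n = 10
print("localist:", n)          # one unit per object: n patterns
print("distributed:", 2 ** n)  # binary patterns over n units: 1024 patterns
```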

13 2. Lateral Inhibition
   Selective activation through activation-dependent inhibitory competition, i.e., winner-take-all (WTA).
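A minimal sketch of winner-take-all competition via lateral inhibition (the inhibition strength and initial activations are illustrative assumptions): each unit is suppressed in proportion to the total activation of its competitors, and repeated updates silence all but the strongest unit.

```python
activations = [0.5, 0.8, 0.3]   # initial activations; unit 1 is strongest
inhibition = 0.5                # assumed lateral-inhibition strength

for _ in range(20):
    total = sum(activations)
    # Each unit is inhibited by the summed activation of the other units.
    activations = [max(0.0, a - inhibition * (total - a)) for a in activations]

print([round(a, 2) for a in activations])  # only unit 1 remains active (WTA)
```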

14 3. Bi-directional Interaction (interactive, recurrent connections)
   This combined top-down and bottom-up processing computes constraint optimization, and is generally faster than uni-directional computation (e.g., in word recognition).
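A hedged sketch of interactive processing with a toy letter layer and word layer (the weights, inputs, and feedback gain are illustrative assumptions): bottom-up evidence activates word units, and top-down feedback from the words in turn supports the letters consistent with them.

```python
import numpy as np

# Letter-to-word support (rows: 3 letters, columns: 2 words)
W = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

letters = np.array([0.9, 0.5, 0.1])  # noisy bottom-up evidence favoring word 0
for _ in range(10):
    words = np.tanh(W.T @ letters)                  # bottom-up pass
    letters = np.tanh(letters + 0.1 * (W @ words))  # top-down feedback

print(words.round(2))  # word 0 settles at a higher activation than word 1
```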

15 Bi-directional interactions in Speech Perception

16 4. Error-correction Learning (e.g., backpropagation)
   ΔW_ik = ε · A_i · (T_k − B_k)
   where A_i is the activation of input unit i, B_k the actual output of unit k, T_k its target output, and ε the learning rate.
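A minimal sketch of the rule in action: a single thresholded unit learning Boolean OR with the delta rule (the task, learning rate, and threshold are illustrative assumptions).

```python
eps = 0.1
w = [0.0, 0.0]
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # Boolean OR

for _ in range(50):
    for a, t in data:
        b = 1 if w[0] * a[0] + w[1] * a[1] >= 0.5 else 0  # actual output B
        for i in range(2):
            w[i] += eps * a[i] * (t - b)  # dW_ik = eps * A_i * (T_k - B_k)

print(w)  # converges to weights that solve OR, e.g. [0.5, 0.5]
```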

17 A three-layer feedforward network with backpropagation learning can approximate any measurable function to any desired degree of accuracy by increasing the number of hidden units (Hornik et al., 1989, 1990; Hecht-Nielsen, 1989). However, the biological plausibility of backpropagation learning has yet to be confirmed.

18 5. Hebbian Learning (unsupervised/self-organizing)
   ΔW_ik = ε · B_k · (A_i − W_ik)

19 Hebbian Rule Encodes Correlational Information
   Asymptotically, W_ik = P(A_i = 'fire' | B_k = 'fire').
   In other words, the weight W_ik stores information about the correlation (i.e., the co-firing activity) between the input and output units.
   Q1: How about encoding the other half of the correlational information, that is, P(B_k = 'fire' | A_i = 'fire')?
   Q2: Discuss the implications of an anti-Hebbian learning rule such as ΔW_ik = −ε · A_i · (B_k − W_ik).
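A simulation sketch of the asymptotic claim (the firing probabilities are illustrative assumptions): under ΔW_ik = ε · B_k · (A_i − W_ik), the weight is updated only when B fires, and then tracks a running average of A, so it converges to P(A = 'fire' | B = 'fire').

```python
import random
random.seed(0)

eps, w = 0.01, 0.0
for _ in range(20000):
    b = random.random() < 0.5                  # output unit fires half the time
    a = random.random() < (0.7 if b else 0.2)  # P(A fires | B fires) = 0.7
    if b:
        w += eps * (int(a) - w)                # dW = eps * B * (A - W), here B = 1

print(round(w, 2))  # approximately 0.7 = P(A = 'fire' | B = 'fire')
```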

20 Biological Plausibility of Hebbian Learning
   The neurophysiological plausibility of Hebbian learning is well documented (e.g., Levy & Steward, 1980), including its sub-cellular mechanisms (NMDA-receptor-mediated long-term potentiation, LTP).

21 3. Examples of Neural Computation
   Noise-tolerant memory
   Pattern completion
   Content-addressable memory
   Language learning
   Sensory-motor control
   Visual perception
   Speech perception
   Word recognition
   …
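As one concrete example, pattern completion and content-addressable memory can be sketched with a tiny Hopfield net (Hopfield nets are mentioned on slide 4; the stored patterns and network size here are illustrative assumptions): a probe with a corrupted bit settles back onto the nearest stored pattern.

```python
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
# Hebbian outer-product storage; no self-connections
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

probe = np.array([1, -1, 1, -1, 1, 1])       # pattern 0 with the last bit flipped
for _ in range(5):
    probe = np.where(W @ probe >= 0, 1, -1)  # synchronous threshold update

print(probe)  # recovers [ 1 -1  1 -1  1 -1], i.e. stored pattern 0
```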

22

23 NETtalk (a program that learns to pronounce English words)

24 Submarine Sonar

