
1 Azhari, Dr. Computer Science UGM

2 The human brain is a densely interconnected network of approximately 10^11 neurons, each connected, on average, to 10^4 others. Neuron activity is excited or inhibited through connections to other neurons. The fastest neuron switching times are known to be on the order of 10^-3 seconds.

3 Gross physical structure:
–There is one axon that branches
–There is a dendritic tree that collects input from other neurons
Axons typically contact dendritic trees at synapses
–A spike of activity in the axon causes charge to be injected into the post-synaptic neuron
Spike generation:
–There is an axon hillock that generates outgoing spikes whenever enough charge has flowed in at synapses to depolarize the cell membrane
(Figure labels: axon, cell body, dendritic tree)

4 The cell itself includes a nucleus (at the center). To the right of cell 2, the dendrites provide input signals to the cell. To the right of cell 1, the axon sends output signals to cell 2 via the axon terminals. These axon terminals merge with the dendrites of cell 2.

5 Signals can be transmitted unchanged or they can be altered by synapses. A synapse is able to increase or decrease the strength of the connection from one neuron to another and cause excitation or inhibition of the subsequent neuron. This is where information is stored. The information processing abilities of biological neural systems must follow from highly parallel processes operating on representations that are distributed over many neurons. One motivation for ANNs is to capture this kind of highly parallel computation based on distributed representations.

6 When a spike travels along an axon and arrives at a synapse, it causes vesicles of transmitter chemical to be released
–There are several kinds of transmitter
The transmitter molecules diffuse across the synaptic cleft and bind to receptor molecules in the membrane of the post-synaptic neuron, thus changing their shape
–This opens up holes that allow specific ions in or out
The effectiveness of the synapse can be changed by
–varying the number of vesicles of transmitter
–varying the number of receptor molecules
Synapses are slow, but they have advantages over RAM
–They are very small
–They adapt using locally available signals (but how?)

7 Each neuron receives inputs from other neurons
–Some neurons also connect to receptors
–Cortical neurons use spikes to communicate
–The timing of spikes is important
The effect of each input line on the neuron is controlled by a synaptic weight
–The weights can be positive or negative
The synaptic weights adapt so that the whole network learns to perform useful computations
–Recognizing objects, understanding language, making plans, controlling the body
You have about 10^11 neurons, each with about 10^3 weights
–A huge number of weights can affect the computation in a very short time. Much better bandwidth than a Pentium.

8 An ANN is composed of processing elements called neurons or perceptrons, organized in different ways to form the network's structure. Each perceptron receives inputs, processes them, and delivers a single output. The input can be raw input data or the output of other perceptrons. The output can be the final result (e.g. 1 means yes, 0 means no) or it can serve as input to other perceptrons, as in the sketch below.
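A minimal sketch in Python of such a processing element, assuming a step activation and the illustrative names Perceptron and step (the slide does not prescribe any particular API):

# Minimal perceptron sketch: receives inputs, processes them, delivers a single output.
def step(z):
    """Deliver 1 ("yes") if the weighted sum is non-negative, else 0 ("no")."""
    return 1 if z >= 0 else 0

class Perceptron:
    def __init__(self, weights, bias):
        self.weights = weights  # one weight per input connection
        self.bias = bias

    def output(self, inputs):
        # Process the inputs: weighted sum plus bias, then a threshold decision.
        z = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return step(z)

# The single output of one perceptron can serve as an input to another.
first = Perceptron([0.5, 0.5], -0.2)
second = Perceptron([1.0], -0.5)
print(second.output([first.output([1, 0])]))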

9

10 Inputs:
–Each input corresponds to a single attribute of the problem.
–For example, for the diagnosis of a disease, each symptom could represent an input to one node.
–The input could be an image (pattern) of skin texture, if we are looking for the diagnosis of normal or cancerous cells.
Outputs:
–The outputs of the network represent the solution to a problem.
–For diagnosis of a disease, the answer could be yes or no.
Weights:
–A key element of an ANN is the weight.
–A weight expresses the relative strength of the signal entering over a connection; the weights determine how data is transferred from the input points to the output point.

11 These are simple but computationally limited
–If we can make them learn, we may get insight into more complicated neurons
(Figure: plot with the neuron's output y on the vertical axis)

12 McCulloch-Pitts (1943): influenced Von Neumann!
–First compute a weighted sum z of the inputs from other neurons
–Then send out a fixed-size spike of activity if the weighted sum exceeds a threshold:
  y = 1 if z >= threshold, 0 otherwise
–Maybe each spike is like the truth value of a proposition, and each neuron combines truth values to compute the truth value of another proposition!
(Figure: step function, y against z, jumping from 0 to 1 at the threshold)
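A small Python sketch of such a unit combining truth values; the unit weights and the particular thresholds (1 for OR-like, 2 for AND-like behaviour) are illustrative choices, not values from the slide:

# McCulloch-Pitts unit: weighted sum of binary inputs compared against a threshold.
def mcculloch_pitts(inputs, weights, threshold):
    z = sum(w * x for w, x in zip(weights, inputs))
    return 1 if z >= threshold else 0

# Combining truth values: with unit weights, threshold 1 behaves like OR of two
# propositions, threshold 2 like AND.
print(mcculloch_pitts([1, 0], [1, 1], threshold=1))  # OR-like  -> 1
print(mcculloch_pitts([1, 0], [1, 1], threshold=2))  # AND-like -> 0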

13 These have a confusing name. They compute a linear weighted sum z of their inputs. The output is a non-linear function of the total input:
  y = z if z >= threshold, 0 otherwise
(Figure: output y against z, equal to 0 below the threshold and rising linearly above it)

14 These give a real-valued output that is a smooth and bounded function of their total input.
–Typically they use the logistic function y = 1 / (1 + e^(-z))
–They have nice derivatives, which makes learning easy.
If we treat the output as the probability of producing a spike, we get stochastic binary neurons.
(Figure: logistic curve, rising smoothly from 0 to 1, with value 0.5 at z = 0)
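A short Python sketch of the logistic neuron, its derivative, and the stochastic binary variant; the input value z = 0.5 is only an illustration:

import math, random

# Logistic (sigmoid) neuron: smooth, bounded output in (0, 1).
def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_derivative(z):
    # The "nice derivative" mentioned on the slide: sigma'(z) = sigma(z) * (1 - sigma(z)).
    y = logistic(z)
    return y * (1.0 - y)

def stochastic_binary_neuron(z, rng=random):
    # Treat the logistic output as the probability of emitting a spike (1) rather than not (0).
    return 1 if rng.random() < logistic(z) else 0

z = 0.5  # total input (illustrative value)
print(logistic(z), logistic_derivative(z), stochastic_binary_neuron(z))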

15 Example: the AND function
(Figure: a real-world neuron shown next to the neuron model, a two-input unit with inputs X1, X2, weights w1, w2, a summation, and output Z, together with the AND truth table)
w1 and w2 must be found, starting from the given initial values of w1 and w2.
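The slide does not name a procedure for finding w1 and w2; one common choice is the perceptron learning rule, sketched below in Python. The learning rate, the bias term, the zero initial values, and the number of passes are assumptions for illustration:

# Finding w1, w2 (and a bias) for AND with the perceptron learning rule.
AND_TABLE = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w1, w2, bias = 0.0, 0.0, 0.0   # given initial values (assumed here)
rate = 0.1

for _ in range(20):  # a few passes over the truth table
    for (x1, x2), target in AND_TABLE:
        y = 1 if w1 * x1 + w2 * x2 + bias >= 0 else 0
        error = target - y
        w1 += rate * error * x1
        w2 += rate * error * x2
        bias += rate * error

print(w1, w2, bias)
print([1 if w1 * x1 + w2 * x2 + bias >= 0 else 0 for (x1, x2), _ in AND_TABLE])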

16 Result for the stable weights; activation function (figure only)

17 ANN learning is well-suited to problems in which the training data corresponds to noisy, complex sensor data. It is also applicable to problems for which more symbolic representations are used. It is appropriate for problems with the following characteristics:
–Input is high-dimensional discrete or real-valued (e.g. raw sensor input)
–Output is discrete or real-valued
–Output is a vector of values
–Possibly noisy data
–Long training times are acceptable
–Fast evaluation of the learned function is required
–It is not important for humans to understand the weights
Examples: speech phoneme recognition, image classification, financial prediction, medical diagnosis

18

19

20

21

22

23

24 A neural network (NN) is a machine-learning approach inspired by the way in which the brain performs a particular learning task.
Knowledge about the learning task is given in the form of examples.
Inter-neuron connection strengths (weights) are used to store the acquired information (the training examples).
During the learning process the weights are modified in order to model the particular learning task correctly on the training examples.
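As a minimal sketch of "the weights are modified during learning", here is one gradient-descent update of a single logistic unit on one training example in Python. The squared-error loss, learning rate, starting weights, and the example itself are assumptions; the slide does not name a specific learning algorithm:

import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

weights = [0.1, -0.2]               # the acquired knowledge is stored here
rate = 0.5
example, target = [1.0, 0.0], 1.0   # one labelled training example

z = sum(w * x for w, x in zip(weights, example))
y = logistic(z)
error = y - target
# Gradient of 0.5 * (y - target)^2 w.r.t. each weight, using sigma'(z) = y * (1 - y).
weights = [w - rate * error * y * (1.0 - y) * x for w, x in zip(weights, example)]
print(weights)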

