
1
Phantom Limb Phenomena

2
Hand movement observation by individuals born without hands: phantom limb experience constrains visual limb perception. Funk M, Shiffrar M, Brugger P. We investigated the visual experiences of two persons born without arms, one with and the other without phantom sensations. Normally-limbed observers perceived rate-dependent paths of apparent human movement. The individual with phantom experiences showed the same perceptual pattern as the control participants; the other did not. Neural systems matching action observation, action execution and motor imagery are likely to contribute to the definition of the body schema in profound ways.

3
Summary Both genetic factors and activity-dependent factors play a role in developing the brain's architecture and circuitry. There are critical developmental periods where nurture is essential, but the adult brain also retains a great ability to regenerate. Next lecture: what computational models satisfy some of the biological constraints? Question: what is the relevance of neural development and learning to language and thought?

4
Connectionist Models: Basics Jerome Feldman CS182/CogSci110/Ling109 Spring 2007

6
Realistic Biophysical Neuron Simulations Not covered in any UCB class? The GENESIS and NEURON simulation systems.

7
Neural networks abstract from the details of real neurons:
- Conductivity delays are neglected.
- An output signal is either discrete (e.g., 0 or 1) or a real-valued number (e.g., between 0 and 1).
- Net input is calculated as the weighted sum of the input signals.
- Net input is transformed into an output signal via a simple function (e.g., a threshold function).

8
The McCulloch-Pitts Neuron
y_j: output from unit j
w_ij: weight on the connection from unit j to unit i
x_i: weighted sum of the inputs to unit i, x_i = Σ_j w_ij y_j
y_i = f(x_i − θ_i), where θ_i is the threshold of unit i
t_i: target output
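As a minimal sketch (the function name and example values are mine, not from the slides), the McCulloch-Pitts update rule can be written as:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted input sum reaches the threshold."""
    x = sum(w * y for w, y in zip(weights, inputs))  # x_i = sum_j w_ij * y_j
    return 1 if x >= threshold else 0                # y_i = f(x_i - theta_i)

# Example: two inputs with unit weights and threshold 1.5 computes AND
print(mcculloch_pitts([1, 1], [1, 1], 1.5))  # -> 1
print(mcculloch_pitts([1, 0], [1, 1], 1.5))  # -> 0
```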

9
Mapping from neuron to computational abstraction:
Nervous System      -> Computational Abstraction
Neuron              -> Node
Dendrites           -> Input links and propagation
Cell body           -> Combination function, threshold, activation function
Axon                -> Output link
Spike rate          -> Output
Synaptic strength   -> Connection strength/weight

10
Simple Threshold Linear Unit

11
Simple Neuron Model

12
A Simple Example
a = x_1 w_1 + x_2 w_2 + x_3 w_3 + ... + x_n w_n
With weights w_1 = 1, w_2 = 0.5, w_3 = 0.1 and inputs x_1 = 0, x_2 = 1, x_3 = 0:
net input a = 1·0 + 0.5·1 + 0.1·0 = 0.5
Threshold bias = 1. Since net input − threshold bias = 0.5 − 1 < 0, the output = 0.
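The arithmetic in this example can be checked directly (a minimal sketch using the slide's weights and inputs):

```python
weights = [1.0, 0.5, 0.1]
inputs  = [0, 1, 0]

# Net input: weighted sum of the inputs
a = sum(w * x for w, x in zip(weights, inputs))

bias = 1.0  # threshold bias from the slide
output = 1 if a - bias >= 0 else 0

print(a, output)  # -> 0.5 0
```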

13
Simple Neuron Model 1 1 1 1

14
1 1 1 1 1

15
1 0 1 1

16
1 0 1 1 0

17
Different Activation Functions
- Threshold activation function (step)
- Piecewise linear activation function
- Sigmoid activation function
- Gaussian activation function
- Radial basis function
Bias unit: an extra input fixed at x_0 = 1.

18
Types of Activation functions

19
The Sigmoid Function: y = a = 1/(1 + e^(−net_i)). The output approaches 0 for large negative net input and 1 for large positive net input; sensitivity to the input is greatest near net_i = 0.

22
Changing the exponent k: with y = 1/(1 + e^(−k·net_i)), k > 1 steepens the sigmoid and k < 1 flattens it.
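A small sketch of the sigmoid with the gain parameter k (the function name and test values are mine):

```python
import math

def sigmoid(net, k=1.0):
    """Logistic activation y = 1/(1 + exp(-k*net)); k > 1 steepens the curve, k < 1 flattens it."""
    return 1.0 / (1.0 + math.exp(-k * net))

print(round(sigmoid(0.0), 3))   # -> 0.5 (midpoint: maximum sensitivity to input)
print(sigmoid(2.0, k=5.0))      # steep curve: nearly saturated at 1
print(sigmoid(2.0, k=0.2))      # shallow curve: still close to the midpoint
```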

23
Nice Property of Sigmoids
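The "nice property" here is presumably the standard one: the derivative of the sigmoid can be expressed in terms of its own output, which makes gradient computations cheap. For y = σ(x) = 1/(1 + e^(−x)):

```latex
\frac{dy}{dx} = \frac{e^{-x}}{\left(1 + e^{-x}\right)^{2}} = \sigma(x)\,\bigl(1 - \sigma(x)\bigr) = y\,(1 - y)
```

So once a unit's output y is known, its derivative costs only one multiplication.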

24
Radial Basis Function
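A minimal sketch of a Gaussian radial basis unit (names and the width parameter are my choices): the response peaks when the input coincides with the unit's center and falls off with distance.

```python
import math

def rbf(x, center, width=1.0):
    """Gaussian radial basis unit: exp(-||x - c||^2 / (2*width^2))."""
    dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-dist2 / (2.0 * width ** 2))

print(rbf([0.0, 0.0], [0.0, 0.0]))  # -> 1.0 (peak response at the center)
print(rbf([3.0, 0.0], [0.0, 0.0]))  # small response far from the center
```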

25
Stochastic units Replace the binary threshold units by binary stochastic units that make biased random decisions: p(s = 1) = 1/(1 + e^(−net/T)). The "temperature" T controls the amount of noise.
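A sketch of such a unit (function name and parameter values are mine): at high temperature decisions approach a coin flip; as T → 0 the unit recovers the hard threshold.

```python
import math, random

def stochastic_unit(net, temperature=1.0, rng=random):
    """Binary stochastic unit: fires with probability p = 1/(1 + exp(-net/T))."""
    p = 1.0 / (1.0 + math.exp(-net / temperature))
    return 1 if rng.random() < p else 0

# Low temperature: behaves almost deterministically like a threshold unit
print(stochastic_unit(5.0, temperature=0.1))   # fires (p is essentially 1)
print(stochastic_unit(-5.0, temperature=0.1))  # stays off (p is essentially 0)
```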

26
Types of Neuron Parameters
- The form of the input function: e.g. linear, sigma-pi (multiplicative), cubic.
- The activation-output relation: linear, hard-limiter, or sigmoidal.
- The nature of the signals used to communicate between nodes: analog or boolean.
- The dynamics of the node: deterministic or stochastic.

27
Computing various functions McCulloch-Pitts neurons can compute logical functions such as AND, NOT, and OR.

28
Computing other functions: the OR function
Assume a binary threshold activation function. What should you set w_01, w_02 and w_0b to be so that you can get the right answers for y_0?

i_1  i_2 | y_0
 0    0  |  0
 0    1  |  1
 1    0  |  1
 1    1  |  1

[Figure: output unit y_0 with inputs i_1, i_2 and a bias input b = 1, connected via weights w_01, w_02, w_0b.]

29
Many answers would work: y = f(w_01 i_1 + w_02 i_2 + w_0b b). Recalling the threshold function, the separation happens when w_01 i_1 + w_02 i_2 + w_0b b = 0. Move things around and you get i_2 = −(w_01/w_02) i_1 − (w_0b b/w_02), a line in the (i_1, i_2) plane.
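One concrete choice of weights (my values, assuming a hard threshold at zero and bias input b = 1): w_01 = w_02 = 1 and w_0b = −0.5.

```python
def or_unit(i1, i2, w1=1.0, w2=1.0, wb=-0.5):
    """Threshold unit computing OR: fires when w1*i1 + w2*i2 + wb*1 >= 0."""
    return 1 if w1 * i1 + w2 * i2 + wb >= 0 else 0

for i1, i2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(i1, i2, or_unit(i1, i2))  # outputs 0, 1, 1, 1 respectively
```

Any weights whose decision line separates (0, 0) from the other three corners would work equally well.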

30
Decision Hyperplane The two classes are therefore separated by the 'decision' line, which is defined by setting the activation equal to the threshold. It turns out that it is possible to generalise this result to TLUs with n inputs: in 3-D the two classes are separated by a decision plane, and in n-D this becomes a decision hyperplane.

31
Linearly Separable Patterns The PERCEPTRON is an architecture which can solve this type of decision-boundary problem. An "on" response in the output node represents one class, and an "off" response represents the other.

32
The Perceptron

33
Input Pattern

34
The Perceptron Input Pattern Output Classification

35
A Pattern Classification

36
Pattern Space The space in which the inputs reside is referred to as the pattern space. Each pattern determines a point in the space by using its component values as space coordinates. In general, for n inputs, the pattern space will be n-dimensional. Clearly, for n-D, the pattern space cannot be drawn or represented in physical space. This is not a problem: we shall return to the idea of using higher-dimensional spaces later. However, the geometric insight obtained in 2-D will carry over (when expressed algebraically) into n-D.

37
The XOR Function
         X2 = 0   X2 = 1
X1 = 0     0        1
X1 = 1     1        0

38
The Input Pattern Space

39
The Decision planes

40
Multi-layer Feed-forward Network
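XOR is not linearly separable, but a two-layer feed-forward network handles it. A minimal sketch with hand-set weights (my values): one hidden unit computes OR, another computes AND, and the output fires for "OR but not AND".

```python
def step(x):
    """Hard threshold at zero."""
    return 1 if x >= 0 else 0

def xor_net(x1, x2):
    """Two-layer feed-forward network computing XOR with hand-set weights."""
    h_or  = step(x1 + x2 - 0.5)      # hidden unit 1: OR of the inputs
    h_and = step(x1 + x2 - 1.5)      # hidden unit 2: AND of the inputs
    return step(h_or - h_and - 0.5)  # output: OR and not AND

for pair in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pair, xor_net(*pair))  # outputs 0, 1, 1, 0 respectively
```

The hidden layer carves the pattern space with two decision lines, which is exactly what a single TLU cannot do.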

41
Pattern Separation and NN architecture

42
Conjunctive or Sigma-Pi nodes The previous spatial summation function supposes that each input contributes to the activation independently of the others: the contribution from input 1, say, is always a constant multiplier (w1) times x1. Suppose, however, that the contribution from input 1 also depends on input 2, and that the larger input 2 is, the larger input 1's contribution becomes. The simplest way of modelling this is to include a term in the activation like w12(x1·x2) with w12 > 0 (for an inhibitory influence of input 2 we would, of course, have w12 < 0). With all pairwise terms: a = w1·x1 + w2·x2 + w3·x3 + w12·(x1·x2) + w23·(x2·x3) + w13·(x1·x3).
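A sketch of this activation (names and example weights are mine): linear terms plus pairwise multiplicative ("conjunctive") terms.

```python
def sigma_pi(x, w, w_pair):
    """Sigma-pi activation: linear terms plus pairwise multiplicative terms.
    w_pair maps index pairs (i, j) to conjunctive weights w_ij."""
    a = sum(wi * xi for wi, xi in zip(w, x))
    for (i, j), wij in w_pair.items():
        a += wij * x[i] * x[j]
    return a

# Example: input 2 amplifies input 1's contribution via w12 > 0
a = sigma_pi([1.0, 1.0, 0.0], [0.5, 0.5, 0.5], {(0, 1): 1.0})
print(a)  # -> 2.0  (0.5 + 0.5 + 0.0 from the linear terms, plus 1.0*1*1)
```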

43
Sigma-Pi units

44
Sigma-Pi Unit

45
Biological Evidence for Sigma-Pi Units
[Axo-dendritic synapse] The stereotypical synapse consists of an electro-chemical connection between an axon and a dendrite; hence it is an axo-dendritic synapse.
[Presynaptic inhibition] However, there is a large variety of synaptic types and connection groupings. Of special importance are cases where the efficacy of the axo-dendritic synapse between axon 1 and the dendrite is modulated (inhibited) by the activity in axon 2, via the axo-axonic synapse between the two axons. This might therefore be modelled by a quadratic term like w12(x1·x2).
[Synapse cluster] Here the effects of the individual synapses will surely not be independent, and we should look to model this with a multilinear term in all the inputs.

46
Biological Evidence for Sigma-Pi Units [Figure: axo-dendritic synapse; presynaptic inhibition; synapse cluster]

47
Link to Vision: The Necker Cube

49
Constrained Best Fit in Nature
Inanimate:
- physics: lowest energy state
- chemistry: molecular minima
Animate:
- biology: fitness, MEU (neuroeconomics)
- vision: threats, friends
- language: errors, NTL

50
Computing other relations The 2-of-3 node is a useful function that activates its output if any 2 of its 3 inputs are active. Such a node is also called a triangle node and will be useful for lots of representations.
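A minimal sketch of a triangle node as a threshold unit (the function name is mine): with unit weights, a threshold of 2 fires exactly when at least two of the three inputs are active.

```python
def triangle_node(a, b, c):
    """2-of-3 ('triangle') node: fires when at least two of its three inputs are active."""
    return 1 if (a + b + c) >= 2 else 0

print(triangle_node(1, 1, 0))  # -> 1 (two inputs active)
print(triangle_node(1, 0, 0))  # -> 0 (only one input active)
```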

51
Triangle nodes and McCulloch-Pitts Neurons? [Figure: a triangle node linking units A, B, and C.]

52
Representing concepts using triangle nodes: when two of the neurons fire, the third also fires.

53
"They all rose" Triangle nodes: when two of the neurons fire, the third also fires; this gives a model of spreading activation.

54
Basic Ideas behind the model
- Parallel activation streams.
- Top-down and bottom-up activation combine to determine the best matching structure.
- Triangle nodes bind features of objects to values.
- Mutual inhibition and competition between structures.
- Mental connections are active neural connections.

55
5 levels of the Neural Theory of Language, in order of increasing abstraction: Biology, Computational Neurobiology, Structured Connectionism, Computation, Cognition and Language. Topics along the way include neural development, triangle nodes, neural nets, spatial relations, motor control, metaphor, SHRUTI, and grammar.

56
Can we formalize/model these intuitions? What is a neurally plausible computational model of spreading activation that captures these features? What does semantics mean in neurally embodied terms? What are the neural substrates of the concepts that underlie verbs, nouns, and spatial predicates?
