
Michael Arbib: CS564 - Brain Theory and Artificial Intelligence, USC, Fall 2001. Lecture 3: Networks of Neurons


Slide 1: Michael Arbib: CS564 - Brain Theory and Artificial Intelligence
Lecture 3: The Brain as a Network of Neurons
Reading assignments:*
- TMB2: Section 2.3
- HBTNN: Single Cell Models (Softky and Koch); Axonal Modeling (Koch and Bernander); Perspective on Neuron Model Complexity (Rall)
* Unless indicated otherwise, the TMB2 material is the required reading; the other readings are supplementary.

Slide 2: The "basic" biological neuron
The soma and dendrites act as the input surface; the axon carries the outputs. The tips of the branches of the axon form synapses upon other neurons or upon effectors (though synapses may occur along the branches of an axon as well as at its ends). The arrows indicate the direction of "typical" information flow from inputs to outputs.

Slide 3: From Passive to Active Propagation
For "short" cells, passive propagation suffices to signal a potential change from one end to the other. If the axon is long, this is inadequate, since changes at one end would decay away almost completely before reaching the other end. If the change in potential difference is large enough, then in a cylindrical configuration such as the axon, a pulse can actively propagate at full amplitude: the Hodgkin-Huxley equations (1952).
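To make the decay claim concrete, here is a minimal sketch in Python (ours, not from the slides; the length constant is an assumed illustrative value) of steady-state passive decay along a cable, V(x) = V0 * e^(-x/λ):

```python
import math

def passive_decay(V0, x_mm, lam_mm=1.0):
    """Steady-state electrotonic decay: a potential change V0 applied at
    x = 0 falls off exponentially with length constant lam_mm."""
    return V0 * math.exp(-x_mm / lam_mm)

# Over a 50 mm axon with a 1 mm length constant, a 100 mV change decays
# by a factor of e**-50 (~2e-22): effectively nothing arrives, hence the
# need for active, full-amplitude propagation.
print(passive_decay(100.0, 50.0))
```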

Slide 4: Excitatory and Inhibitory Synapses
Dale's law states that each neuron releases a single transmitter substance (a "first approximation"). This does not mean that the synapses made by a single neuron are either all excitatory or all inhibitory.
Modern understanding: channels which "open" and "close" provide the mechanisms for the Hodgkin-Huxley equations, and this notion of channels extends to synaptic transmission. The action of a synapse depends both on the transmitter released presynaptically and on specialized receptors in the postsynaptic membrane. Moreover, neurons may secrete transmitters which act as neuromodulators of the function of a circuit on some quite extended time scale (cf. TMB2 Sections 6.1 and 8.1).

Slide 5: A Synapse [figure]

Slide 6: Down the Levels of Detail to Neuron and Synapse
- A small piece of cerebellum
- A Purkinje cell
- Just a few of the synapses of parallel fibres on a branch of the dendritic tree

Slide 7: The Neurochemistry of Synaptic Plasticity: The Grounding of Learning and Memory
Eric Kandel's view of how molecular changes in a synapse may produce "short-term memory" and "long-term memory" in Aplysia (cf. TMB2 Section 8.1). The Nobel Prize in Physiology or Medicine for 2000 was awarded jointly to Arvid Carlsson, Paul Greengard and Eric Kandel.

Slide 8: Approximations are crucial
10^11 neurons, each with 10^4 or more synapses, each synapse a complex chemical machine. A general challenge: using computer and mathematical analysis to validate simplifications which find the right projection of the complexity of units at one level of analysis, so that they can serve as components for large-scale models at the next level.

Slide 9: Multi-Level Modeling and Data Analysis
- McCulloch-Pitts neural networks (logic and computability theory)
- Hodgkin-Huxley model of the action potential (spike propagation) (nonlinear differential equations)

Slide 10: From Mind to Neural Networks
Warren McCulloch: "What is a man that he may know a number, and a number that a man may know it?"
A philosophy-driven approach: inspired by Kant and Leibniz, seeking to map logic onto neural function.
Mathematics: mathematical logic (propositional logic; Turing's theory of computability).

Slide 11: Warren McCulloch and Walter Pitts (1943)
A McCulloch-Pitts neuron operates on a discrete time scale, t = 0, 1, 2, 3, ..., with the time tick equal to one refractory period. At each time step, an input or output is on or off: 1 or 0, respectively. Each connection, or synapse, from the output of one neuron to the input of another has an attached weight.

Slide 12: Excitatory and Inhibitory Synapses
We call a synapse excitatory if w_i > 0, and inhibitory if w_i < 0. We also associate a threshold θ with each neuron. A neuron fires (i.e., has value 1 on its output line) at time t+1 if the weighted sum of its inputs at time t reaches or passes θ:

    y(t+1) = 1 if and only if Σ_i w_i x_i(t) ≥ θ.
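As a concrete illustration, here is a minimal Python sketch of this firing rule (ours, not part of the original slides; the name mp_neuron is our own):

```python
def mp_neuron(weights, threshold, inputs):
    """McCulloch-Pitts firing rule: output 1 at time t+1 iff the weighted
    sum of the binary inputs at time t reaches or passes the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0
```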

Slide 13: From Logical Neurons to Finite Automata
- AND: weights 1, 1; threshold 1.5
- OR: weights 1, 1; threshold 0.5
- NOT: threshold 0
[Figure: a Boolean net with inputs X and Y, and a finite automaton with state Q]
(Brains, Machines, and Mathematics, 2nd Edition, 1987)
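Using the mp_neuron sketch above, the three gates on this slide follow directly from the weights and thresholds shown (the NOT gate's input weight of -1 is the standard construction; the transcript preserves only its threshold of 0):

```python
def AND(x, y): return mp_neuron([1, 1], 1.5, [x, y])
def OR(x, y):  return mp_neuron([1, 1], 0.5, [x, y])
def NOT(x):    return mp_neuron([-1], 0, [x])

# Truth-table check; with loops added to hold state, such Boolean nets
# implement any finite automaton.
assert [AND(x, y) for x in (0, 1) for y in (0, 1)] == [0, 0, 0, 1]
assert [OR(x, y)  for x in (0, 1) for y in (0, 1)] == [0, 1, 1, 1]
assert [NOT(x) for x in (0, 1)] == [1, 0]
```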

Slide 14: Philosophically and Technologically Important
- A major attack on dualism: the "brain" of a Turing machine
- A good global view of the input-output computational capacity of neural networks
- An important basis for the technology of artificial neural networks, with the addition of learning rules
- But not a neuron-by-neuron account of the brain's functions: logic is a culturally late activity of large neural populations, not a direct expression of neural function.

Slide 15: Increasing the Realism of Neuron Models
The McCulloch-Pitts neuron of 1943 is important as a basis for:
- logical analysis of the neurally computable, and
- the current design of some neural devices (especially when augmented by learning rules to adjust synaptic weights).
However, it is no longer considered a useful model for making contact with neurophysiological data concerning real neurons.

Slide 16: Alan Hodgkin and Andrew Huxley
"What are the dynamics of the action potential (spike propagation along the axon)?"
Mathematics: ordinary differential equations.
A data-driven approach: giant squid axon → massive data → curve fitting → differential equations that elegantly describe these curves. Then, later, mathematics and computer analysis explore the far-ranging implications of the Hodgkin-Huxley equations.
Hodgkin & Huxley photos from http://www.nobel.se/medicine/laureates/1963/press.html

Slide 17: From Curve Fitting to Nonlinear Analysis
The Hodgkin-Huxley equations: four coupled differential equations.

    C dV/dt = -g_L (V - V_L) + I_a - g_Na m^3 h (V - V_Na) - g_K n^4 (V - V_K)

(L: leakage; Na: sodium; K: potassium), where

    dm/dt = α_m(V) (1 - m) - β_m(V) m
    dn/dt = α_n(V) (1 - n) - β_n(V) n
    dh/dt = α_h(V) (1 - h) - β_h(V) h

with the αs and βs found by curve fitting. The forms m^3 h and n^4 add a deep intuitive understanding that anticipates the structure of the channels. Many properties of these 4-variable Hodgkin-Huxley equations can be explored analytically or qualitatively with two-dimensional differential equations:
- spontaneous spiking
- spiking which follows the input stimulus
- bursting
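The four equations can also be integrated numerically. The sketch below uses forward Euler with the usual textbook rate functions and parameter values for the squid giant axon (these specific α/β forms and constants are standard values, not taken from the slides); a constant applied current I_a of about 10 µA/cm² produces repetitive spiking:

```python
import math

C = 1.0                                   # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3         # peak conductances, mS/cm^2
V_Na, V_K, V_L = 50.0, -77.0, -54.387     # reversal potentials, mV

def alpha_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * math.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + math.exp(-(V + 35) / 10))
def alpha_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * math.exp(-(V + 65) / 80)

def simulate(I_a=10.0, dt=0.01, t_max=50.0):
    """Forward-Euler integration of the four Hodgkin-Huxley equations."""
    V, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting values
    trace = []
    for _ in range(int(t_max / dt)):
        I_ion = (g_L * (V - V_L) + g_Na * m**3 * h * (V - V_Na)
                 + g_K * n**4 * (V - V_K))
        V += dt * (I_a - I_ion) / C
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        trace.append(V)
    return trace                          # membrane potential (mV) per step
```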

Slide 18: (no transcript text captured for this slide)

Slide 19: From Hodgkin-Huxley to the Neuron: Compartmental Modeling
- Hundreds of compartments per neuron (dendrites, soma and cell bodies; synapses)
versus
- A very few compartments that capture the neural essentials (Traub and Miles on neurons in the hippocampus; Mainen and Sejnowski on pyramidal cells of cortex).
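To show what a "compartment" means concretely, here is a minimal sketch (all parameter values are illustrative assumptions, not from the slides) of two passive compartments coupled by an axial conductance g_c; current injected into compartment 1 spreads, attenuated, into compartment 2:

```python
def two_compartments(I_in=1.0, g_L=0.1, g_c=0.05, E_L=-65.0, C=1.0,
                     dt=0.1, steps=5000):
    """Forward Euler for two passively coupled compartments."""
    V1 = V2 = E_L
    for _ in range(steps):
        dV1 = (-g_L * (V1 - E_L) + g_c * (V2 - V1) + I_in) / C
        dV2 = (-g_L * (V2 - E_L) + g_c * (V1 - V2)) / C
        V1 += dt * dV1
        V2 += dt * dV2
    return V1, V2   # at steady state V1 is depolarized more than V2
```

Full compartmental models simply chain hundreds of such units, each with its own membrane properties and synapses.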

Slide 20: Leaky Integrator Neuron
The simplest "realistic" neuron model is a single-compartment, continuous-time model based on the firing rate (e.g., the number of spikes traversing the axon in the most recent 20 msec) as a continuously varying measure of the cell's activity. The state of the neuron is described by a single variable, the membrane potential. The firing rate is approximated by a sigmoid function of the membrane potential.

Slide 21: Leaky Integrator Model

    τ dm/dt = -m(t) + h

has the solution m(t) = e^(-t/τ) m(0) + (1 - e^(-t/τ)) h → h for time constant τ > 0.
We now add synaptic inputs to get the leaky integrator model:

    τ dm/dt = -m(t) + Σ_i w_i X_i(t) + h

where X_i(t) is the firing rate at the i-th input. Excitatory input (w_i > 0) will increase m(t); inhibitory input (w_i < 0) will have the opposite effect.
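A minimal sketch (ours, not from the slides) of forward-Euler integration of this model, with the sigmoid firing-rate function of the previous slide attached to the output:

```python
import math

def sigmoid(m, gain=1.0, theta=0.0):
    """Saturating firing rate as a function of membrane potential m."""
    return 1.0 / (1.0 + math.exp(-gain * (m - theta)))

def leaky_integrator(inputs, weights, tau=20.0, h=0.0, dt=1.0, m0=0.0):
    """inputs[t][i] is the firing rate X_i(t) at step t; returns the
    output firing rate at each step."""
    m, rates = m0, []
    for x in inputs:
        dm = (-m + sum(w * xi for w, xi in zip(weights, x)) + h) / tau
        m += dt * dm
        rates.append(sigmoid(m))
    return rates

# With constant input, m(t) relaxes toward the weighted input sum plus h,
# exactly as the closed-form solution above predicts.
out = leaky_integrator([[1.0, 0.5]] * 200, weights=[2.0, -1.0])
```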

Slide 22: Rall's Motion Detector Model [figure]

Slide 23: Alternative Models
Even at this simple level, there are alternative models. Some inhibitory synapses seem better described by shunting inhibition, which, applied at a given point on a dendrite, serves to divide, rather than subtract from, the potential change passively propagating from more distal synapses. And the "lumped frequency" model cannot capture the subtle relative-timing effects crucial to our motion detector example; these might be approximated by introducing appropriate delay terms τ_i:

    τ dm/dt = -m(t) + Σ_i w_i x_i(t - τ_i) + h.
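Staying with the Euler sketch above, the delay terms can be approximated by giving each input line a FIFO buffer of length τ_i/dt (again an illustrative sketch, not the slides' own code):

```python
from collections import deque

def delayed_leaky_integrator(inputs, weights, delays, tau=20.0, h=0.0, dt=1.0):
    """delays[i] is the delay tau_i of input i, in time steps; inputs[t][i]
    is x_i at step t. Buffers are pre-filled with zeros."""
    bufs = [deque([0.0] * (d + 1), maxlen=d + 1) for d in delays]
    m, trace = 0.0, []
    for x in inputs:
        for buf, xi in zip(bufs, x):
            buf.append(xi)                      # push the current input
        delayed = [buf[0] for buf in bufs]      # read x_i(t - tau_i)
        m += dt * (-m + sum(w * xd for w, xd in zip(weights, delayed)) + h) / tau
        trace.append(m)
    return trace
```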

Slide 24: No modeling approach is automatically appropriate
Rather, we seek the simplest model adequate to address the complexity of a given range of problems. The Neural Simulation Language NSLJ provides tools for modeling complex neural systems, especially (but not only) when the neurons are modeled as leaky integrator neurons.

