Neural modeling and simulation


Neural modeling and simulation with Brian
Romain Brette, Ecole Normale Supérieure
romain.brette@ens.fr
http://briansimulator.org

Spiking neuron models

Input = N spike trains; output = 1 spike train. In other words: a phenomenological description of neurons at the spike level. A neuron model is defined by continuous dynamics (what happens between spikes) and discrete events (what happens when a spike is received, the condition for spiking, and what happens when a spike is produced).

A neural network with Brian (network diagram: excitatory subgroup Pe and inhibitory subgroup Pi, connected to P by Ce and Ci)

from brian import *
eqs = '''
dv/dt = (ge+gi-(v+49*mV))/(20*ms) : volt
dge/dt = -ge/(5*ms) : volt
dgi/dt = -gi/(10*ms) : volt
'''
P = NeuronGroup(4000, model=eqs, threshold=-50*mV, reset=-60*mV)
P.v = -60*mV + 10*mV*rand(len(P))
Pe = P.subgroup(3200)
Pi = P.subgroup(800)
Ce = Connection(Pe, P, 'ge', weight=1.62*mV, sparseness=0.02)
Ci = Connection(Pi, P, 'gi', weight=-9*mV, sparseness=0.02)
M = SpikeMonitor(P)
run(1*second)
raster_plot(M)
show()

Part I –Neurons

Equivalent electrical circuit

The membrane is modeled as a capacitance in parallel with a leak conductance gL = 1/R (R = membrane resistance) in series with a battery at the leak or resting potential EL. Linear approximation of the leak current: I = gL(Vm - EL). A simple (but not rigorous) picture: a current due to the electrical force, proportional to Vm, plus a constant diffusion current (proportional to the concentration difference, Fick's law), which together give an affine leak current. EL ≈ -70 mV: the membrane is "polarized" (Vin < Vout). (Equivalent circuit: Hille; Dayan & Abbott p. 159.)

The membrane equation

A current Iinj injected from outside drives the membrane potential Vm inside. This is a linear (affine) differential equation:

tau = 10*ms
R = 50*Mohm
EL = -70*mV
Iinj = 0.5*nA
eqs = '''
dvm/dt = (EL-vm+R*Iinj)/tau : volt
'''

The membrane time constant tau is typically 3-100 ms.
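The membrane equation can be checked numerically without Brian. A minimal forward-Euler sketch with the slide's parameters (SI units), converging to the steady state EL + R·Iinj = -70 mV + 25 mV = -45 mV:

```python
import numpy as np

# Parameters from the slide, in SI units
tau = 10e-3        # membrane time constant (s)
R = 50e6           # membrane resistance (ohm)
EL = -70e-3        # leak / resting potential (V)
Iinj = 0.5e-9      # injected current (A)

dt = 1e-5
steps = int(0.1 / dt)          # simulate 100 ms = 10 time constants
v = EL                         # start at rest
trace = np.empty(steps)
for i in range(steps):
    # forward Euler step of dVm/dt = (EL - Vm + R*Iinj)/tau
    v += dt * (EL - v + R * Iinj) / tau
    trace[i] = v
```

After ten time constants the trace has essentially reached EL + R·Iinj.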

The integrate-and-fire model

Below threshold, incoming spikes produce "postsynaptic potentials" (PSPs). The integrate-and-fire rule is a phenomenological description of action potentials: when V reaches the spike threshold Vt, the neuron spikes and V is reset, V → Vr.

Current-frequency relationship

from brian import *
N = 1000
tau = 10 * ms
eqs = '''
dv/dt=(v0-v)/tau : volt
v0 : volt
'''
group = NeuronGroup(N, model=eqs, threshold=10 * mV, reset=0 * mV)
group.v = 0 * mV
group.v0 = linspace(0 * mV, 20 * mV, N)
counter = SpikeCounter(group)
duration = 5 * second
run(duration)
plot(group.v0 / mV, counter.count / duration)
show()

The f-I curve has a threshold and is monotonic; it can be measured in vitro. (Near threshold the model does not match well, because of spike initiation: the model is too simple.) Parameters: tau = 10 ms, vt = 10 mV.
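Between reset and threshold the membrane equation can be solved in closed form, which gives the f-I curve that the script above plots. Solving dv/dt = (v0 - v)/tau from the reset 0 to the threshold vt gives the interspike interval T = tau·log(v0/(v0 - vt)); a small sketch (same parameters as the script):

```python
import math

def lif_rate(v0, vt=10e-3, tau=10e-3):
    """Firing rate of the leaky integrate-and-fire neuron (reset at 0)
    for a constant drive v0, from the closed-form interspike interval."""
    if v0 <= vt:
        return 0.0  # below threshold v converges to v0: no spike
    return 1.0 / (tau * math.log(v0 / (v0 - vt)))
```

The curve is zero below threshold and increases monotonically above it.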

Refractory period

Δ = refractory period: after a spike, the neuron stays at reset for a duration Δ, so the firing rate saturates at a maximum of 1/Δ (visible on the f-I curve).

from brian import *
N = 1000
tau = 10 * ms
eqs = '''
dv/dt=(v0-v)/tau : volt
v0 : volt
'''
group = NeuronGroup(N, model=eqs, threshold=10 * mV, reset=0 * mV, refractory=5 * ms)
group.v = 0 * mV
group.v0 = linspace(0 * mV, 20 * mV, N)
counter = SpikeCounter(group)
duration = 5 * second
run(duration)
plot(group.v0 / mV, counter.count / duration)
show()
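An absolute refractory period Δ is simply added to each interspike interval of the closed-form solution, so the rate stays below 1/Δ however strong the drive; a sketch:

```python
import math

def lif_rate_refractory(v0, vt=10e-3, tau=10e-3, delta=5e-3):
    """f-I curve of the leaky integrate-and-fire neuron with an absolute
    refractory period delta: delta adds to each interspike interval,
    so the rate saturates below 1/delta."""
    if v0 <= vt:
        return 0.0
    return 1.0 / (delta + tau * math.log(v0 / (v0 - vt)))
```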

Synaptic currents

A presynaptic spike opens channels at the synapse, producing a synaptic current Is(t) into the postsynaptic neuron. The current is carried by ions. (Dendrites are covered in another chapter; Hille p. 171.)

Idealized synapse

The channel opens for a very short duration: Is(t) = Qδ(t), where δ is the Dirac function and the total charge Q is proportional to the number of ions that enter the postsynaptic neuron. Solving the membrane equation (using the properties of the Dirac distribution, or by computing the discontinuity at t = 0, either with the general solution formula or directly) gives an instantaneous jump of the membrane potential by w at t = 0. In spike-based notation, w is the "synaptic weight".
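The effect of the Dirac current can be written in closed form: the membrane jumps by w at the spike time and then relaxes back to rest with the membrane time constant. A sketch (w chosen arbitrarily, Vm measured relative to rest):

```python
import math

tau = 10e-3    # membrane time constant (s)
w = 0.5e-3     # synaptic weight: jump of Vm caused by the charge (V), illustrative

def psp(t):
    """Deviation of Vm from rest after a Dirac synaptic current at t = 0:
    an instantaneous jump by w, then exponential decay back to rest."""
    if t < 0:
        return 0.0
    return w * math.exp(-t / tau)
```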

Example: fully connected network

from brian import *
tau = 10 * ms
v0 = 11 * mV
N = 20
w = .1 * mV
group = NeuronGroup(N, model='dv/dt=(v0-v)/tau : volt', threshold=10 * mV, reset=0 * mV)
W = Connection(group, group, 'v', weight=w)
group.v = rand(N) * 10 * mV
S = SpikeMonitor(group)
run(300 * ms)
raster_plot(S)
show()

A more realistic synapse model

What is this charge that passes through the synaptic channel? The channel is permeable to certain types of ions (electrodiffusion). A presynaptic spike switches the ionic channel from closed to open, producing a conductance gs(t) in series with the synaptic reversal potential: this is the "conductance-based integrate-and-fire model".

Example of kinetic model

Stochastic transitions between the closed (C) and open (O) states:

C ⇄ O, with opening rate α[L] (proportional to the concentration [L] of bound ligand/neurotransmitter) and constant closing rate β.

Macroscopic equation (many channels): with x = proportion of open channels, the differential equation gives gs(t) = x(t)·gmax. Assuming neurotransmitter is present only for a very short duration, gs decays exponentially with τs = 1/β. (From Destexhe et al., Kinetic models of synaptic transmission, in Methods in Neuronal Modeling.)
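The macroscopic kinetic equation dx/dt = α[L](1 - x) - βx can be integrated for a brief transmitter pulse. A forward-Euler sketch (rate constants illustrative, not from the slide), showing the rise during the pulse and the exponential decay with τs = 1/β afterwards:

```python
alpha = 5000.0   # opening rate at unit concentration (1/s), illustrative
beta = 200.0     # closing rate (1/s), so tau_s = 1/beta = 5 ms
L = 1.0          # transmitter concentration during the pulse (normalized)
pulse = 1e-3     # pulse duration (s)

dt = 1e-6
x = 0.0          # proportion of open channels
xs = []
for i in range(int(20e-3 / dt)):
    conc = L if i * dt < pulse else 0.0
    # dx/dt = alpha*[L]*(1-x) - beta*x
    x += dt * (alpha * conc * (1 - x) - beta * x)
    xs.append(x)
```

After the pulse, x decays by a factor e over each τs = 5 ms.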

Example of kinetic model (continued)

Postsynaptic effect: [L](t) is a brief pulse at each incoming spike. Between spikes, dx/dt = -x/τs with τs = 1/β; an incoming spike updates x → x + (1 - x)·u, where the constant u is set by α and the pulse.

Example: random network

taum = 20 * ms
taue = 5 * ms
taui = 10 * ms
Ee = 0 * mV
Ei = -80 * mV
El = -60 * mV
eqs = '''
dv/dt = (El-v+ge*(Ee-v)+gi*(Ei-v))/taum : volt
dge/dt = -ge/taue : 1
dgi/dt = -gi/taui : 1
'''
P = NeuronGroup(4000, model=eqs, threshold=10 * mvolt, reset=-60 * mvolt, refractory=5 * msecond)
Pe = P.subgroup(3200)
Pi = P.subgroup(800)
we = 6. / 10.   # excitatory synaptic weight
wi = 67. / 10.  # inhibitory synaptic weight
Ce = Connection(Pe, P, 'ge', weight=we, sparseness=0.02)
Ci = Connection(Pi, P, 'gi', weight=wi, sparseness=0.02)
P.v = (randn(len(P)) * 5 - 5) * mvolt
P.ge = randn(len(P)) * 1.5 + 4
P.gi = randn(len(P)) * 12 + 20
run(1 * second)

(Conductances are in units of the leak conductance.)

Linearization

The synaptic current gs(t)(Es - Vm) is nonlinear in the state variables (product of gs and Vm). Linear approximation: replace Vm by a typical value, which turns the conductance into an effective current. With VT ≈ -50 mV and EL ≈ -70 mV as the operating range, the approximation is:
good for AMPA/NMDA (Es = 0 mV), far from the operating range;
bad for GABA-A (Es = -70 mV), inside the operating range (especially for "silent" shunting inhibition);
acceptable for GABA-B (Es = -100 mV).

Example: random network with linearized (current-based) synapses

from brian import *
eqs = '''
dv/dt = (ge+gi-(v+49*mV))/(20*ms) : volt
dge/dt = -ge/(5*ms) : volt
dgi/dt = -gi/(10*ms) : volt
'''
P = NeuronGroup(4000, model=eqs, threshold=-50*mV, reset=-60*mV)
P.v = -60*mV + 10*mV*rand(len(P))
Pe = P.subgroup(3200)
Pi = P.subgroup(800)
Ce = Connection(Pe, P, 'ge', weight=1.62*mV, sparseness=0.02)
Ci = Connection(Pi, P, 'gi', weight=-9*mV, sparseness=0.02)
M = SpikeMonitor(P)
run(1*second)
raster_plot(M)
show()

The postsynaptic potential

Postsynaptic potential (PSP) = response of the variable Vm(t) to a single presynaptic spike, with all synaptic variables at 0 for t < 0. For a spike at synapse i at time t = 0: Vm(t) = PSPi(t). The PSP is the impulse response of the system; this is the integral representation of the model.

Temporal and spatial integration

What is the response to a set of spikes {tij} (i = synapse, j = spike number)? If the differential system is linear, the superposition principle applies: the response is the sum of the individual PSPs, shifted to the spike times. (Proof sketch: the differential equation holds between impulses, since if X = X1 + ... + Xn then dX/dt is the sum of the individual derivatives, and the discontinuities at the impulse times tij add up as well.) This integral representation is the "spike response model" (SRM).
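The superposition principle can be verified numerically: for a linear membrane, the response to a spike train computed by direct integration equals the sum of shifted single-spike PSPs. A sketch with an exponential PSP kernel (weights and spike times arbitrary):

```python
import numpy as np

tau = 10e-3                     # membrane time constant (s)
w = 1.0                         # PSP amplitude (arbitrary units)
spike_times = np.array([5e-3, 12e-3, 13e-3, 30e-3])   # arbitrary spike train

dt = 1e-4
t = np.arange(601) * dt         # time grid, 0..60 ms

def kernel(s):
    """Single-spike PSP: exponential impulse response of the linear membrane.
    (The >= -1e-12 guards against float rounding exactly at a spike time.)"""
    return w * np.exp(-s / tau) * (s >= -1e-12)

# Superposition: sum of shifted single-spike PSPs
v_super = sum(kernel(t - ts) for ts in spike_times)

# Direct integration of the same linear system, spike by spike
decay = np.exp(-dt / tau)
spike_idx = set(int(round(ts / dt)) for ts in spike_times)
v, v_direct = 0.0, np.zeros_like(t)
for i in range(len(t)):
    if i in spike_idx:
        v += w                  # each spike kicks the membrane by w
    v_direct[i] = v
    v *= decay                  # exponential relaxation until the next sample
```

Both computations give the same trace, as the superposition principle predicts.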

From integral to differential representation

Experimental recordings give the integral representation (PSP shapes); a model requires parametric estimation, e.g. fitting a biexponential PSP. One then writes a differential system whose impulse response matches the fitted kernel (valid if τ1 > τ2, otherwise exchange the two; tool: the Laplace transform).

Voltage-gated channels: biophysics of spike initiation

At rest, the Na+ channels are closed. Depolarization (Vm ↑) opens the channels: Na+ enters, depolarizing the membrane further. The channels then inactivate (no more current), and K+ channels repolarize the membrane (Vm ↓). (In brief; see the chapter on the action potential. Advanced concept: the variable threshold.)

The sodium channel

The protein carries a heterogeneous distribution of charges, so its conformation can change with the membrane potential. There are two stable conformations, open and closed; sodium enters when the "gate" is open.

State transitions

The closed → open transition requires an energy proportional to V: the work done on a charge moved across the potential difference V (extracellular minus intracellular). The transition rate is therefore proportional to exp(aV) (transition probability in [t, t+dt] proportional to exp(aV)·dt). Likewise, the open → closed transition rate is proportional to exp(bV), with a and b of opposite signs (ab < 0). (Check Hille.)

State transitions (continued)

C ⇄ O, with opening rate α(V) and closing rate β(V). Macroscopic equation (many channels): m = proportion of open channels. (Hille p. 69: potassium current in patch clamp.)

Kinetic equation

The macroscopic equation can be rewritten with an equilibrium value m∞(V) = α(V)/(α(V)+β(V)), a sigmoidal function of V, and a time constant τ(V) = 1/(α(V)+β(V)).
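With exponential voltage dependence of the rates (the Boltzmann-type rates of the previous slide; the constants below are illustrative, not from the slide), m∞ is sigmoidal and τ peaks at the half-activation voltage; a sketch:

```python
import math

V_half = -40e-3   # illustrative half-activation voltage (V)
k = 10e-3         # illustrative slope factor (V)

def alpha(V):     # opening rate: grows exponentially with depolarization
    return math.exp((V - V_half) / k)

def beta(V):      # closing rate: decays exponentially with depolarization
    return math.exp(-(V - V_half) / k)

def m_inf(V):     # equilibrium open fraction alpha/(alpha+beta): sigmoid of V
    a, b = alpha(V), beta(V)
    return a / (a + b)

def tau_m(V):     # time constant 1/(alpha+beta): peaked at V_half
    return 1.0 / (alpha(V) + beta(V))
```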

The sodium current

The sodium current is the product of the driving force (relative to the reversal potential ENa = 50 mV) and the conductance: the maximum conductance (all channels open) times the proportion of open channels.

The Hodgkin-Huxley model

Model of the squid giant axon (Nobel Prize 1963). The sodium channel has 4 gates: 3 independent activation "gates" (m) and one inactivation gate (h).

The Hodgkin-Huxley model Dayan & Abbott p175
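The Hodgkin-Huxley equations can be integrated directly without Brian. A forward-Euler sketch in plain Python, using the standard squid-axon parameters and rate functions in the modern convention (these particular constants are the textbook ones, e.g. Dayan & Abbott, not taken from the slide):

```python
import math

# Standard Hodgkin-Huxley squid-axon parameters (modern convention: mV, ms)
Cm, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3      # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4             # mV

def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + math.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)

def simulate(I=10.0, T=50.0, dt=0.01):
    """Forward-Euler HH integration with a constant current I (uA/cm^2)."""
    V = -65.0
    m = a_m(V) / (a_m(V) + b_m(V))           # start the gates at steady state
    h = a_h(V) / (a_h(V) + b_h(V))
    n = a_n(V) / (a_n(V) + b_n(V))
    trace = []
    for _ in range(int(T / dt)):
        INa = gNa * m**3 * h * (V - ENa)     # 3 activation gates + inactivation
        IK = gK * n**4 * (V - EK)
        IL = gL * (V - EL)
        V += dt * (I - INa - IK - IL) / Cm
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        trace.append(V)
    return trace
```

A suprathreshold current step produces action potentials that overshoot 0 mV; with no current the membrane stays near rest.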

Other voltage-dependent channels

Other channels also open depending on the potential: the molecule carries a distribution of charges, and the potential difference exerts a force on it. Each channel type is described by a maximum conductance and a proportion of open channels with its own time constant and equilibrium value (see the action potential chapter). Main families: Na+ (sodium); K+ (potassium), with many different types; Ca2+ (calcium); and many other types of channels. Example with a K+ current and overshoot (red = without K+, blue = with K+; Va = -60 mV, ka = 3 mV, EK = -90 mV): such a current narrows the PSPs, which is important for instance in the auditory system to increase temporal precision (function: adaptation).

Example: random network of Hodgkin-Huxley neurons

eqs = '''
dv/dt = (gl*(El-v)+ge*(Ee-v)+gi*(Ei-v)-g_na*(m*m*m)*h*(v-ENa)-g_kd*(n*n*n*n)*(v-EK))/Cm : volt
dm/dt = alpham*(1-m)-betam*m : 1
dn/dt = alphan*(1-n)-betan*n : 1
dh/dt = alphah*(1-h)-betah*h : 1
dge/dt = -ge/taue : siemens
dgi/dt = -gi/taui : siemens
alpham = 0.32*(mV**-1)*(13*mV-v+VT)/(exp((13*mV-v+VT)/(4*mV))-1.)/ms : Hz
betam = 0.28*(mV**-1)*(v-VT-40*mV)/(exp((v-VT-40*mV)/(5*mV))-1)/ms : Hz
alphah = 0.128*exp((17*mV-v+VT)/(18*mV))/ms : Hz
betah = 4./(1+exp((40*mV-v+VT)/(5*mV)))/ms : Hz
alphan = 0.032*(mV**-1)*(15*mV-v+VT)/(exp((15*mV-v+VT)/(5*mV))-1.)/ms : Hz
betan = .5*exp((10*mV-v+VT)/(40*mV))/ms : Hz
'''
P = NeuronGroup(4000, model=eqs, threshold=EmpiricalThreshold(threshold=-20 * mV, refractory=3 * ms), implicit=True)
trace = StateMonitor(P, 'v', record=[1, 10, 100])

Adaptation

A linearized K+ current: the membrane potential v is opposed by an adaptation current w, which increases with each spike.

from brian import *
PG = PoissonGroup(1, 500 * Hz)
eqs = '''
dv/dt = (-w-v)/(10*ms) : volt
dw/dt = -w/(30*ms) : volt  # the adaptation current
'''
# The adaptation variable increases with each spike
IF = NeuronGroup(1, model=eqs, threshold=20 * mV,
                 reset='''v = 0*mV
                          w += 3*mV''')
C = Connection(PG, IF, 'v', weight=3 * mV)
MS = SpikeMonitor(PG, True)
Mv = StateMonitor(IF, 'v', record=True)
Mw = StateMonitor(IF, 'w', record=True)
run(100 * ms)
plot(Mv.times / ms, Mv[0] / mV)
plot(Mw.times / ms, Mw[0] / mV)
show()
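The same mechanism can be reproduced without Brian. A forward-Euler sketch of an integrate-and-fire neuron with a spike-triggered adaptation current under constant drive (all parameter values illustrative): successive interspike intervals lengthen as the adaptation variable builds up.

```python
tau_v, tau_w = 10e-3, 200e-3   # membrane and adaptation time constants (s)
v0 = 20e-3                     # constant suprathreshold drive (V)
vt = 10e-3                     # spike threshold (V)
b = 2e-3                       # adaptation increment per spike (V)

dt = 1e-5
v = w = 0.0
t = 0.0
spikes = []
while t < 0.5:
    v += dt * (v0 - w - v) / tau_v   # adaptation current w opposes the drive
    w += dt * (-w) / tau_w           # w decays between spikes...
    if v >= vt:
        spikes.append(t)
        v = 0.0                      # reset
        w += b                       # ...and jumps at each spike
    t += dt

isis = [t2 - t1 for t1, t2 in zip(spikes, spikes[1:])]
```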

Threshold adaptation (cortical neuron in vivo, V1)

from brian import *
eqs = '''
dv/dt = -v/(10*ms) : volt
dvt/dt = (10*mV-vt)/(15*ms) : volt
'''
reset = '''
v = 0*mV
vt += 3*mV
'''
IF = NeuronGroup(1, model=eqs, reset=reset, threshold='v>vt')
IF.rest()
PG = PoissonGroup(1, 500 * Hz)
C = Connection(PG, IF, 'v', weight=3 * mV)
Mv = StateMonitor(IF, 'v', record=True)
Mvt = StateMonitor(IF, 'vt', record=True)
run(100 * ms)
plot(Mv.times / ms, Mv[0] / mV)
plot(Mvt.times / ms, Mvt[0] / mV)
show()

By the way: the IF model is not a bad model of cortical neurons. With current injected in a slice, an IF model with adaptive threshold reproduces the spike train of a fast-spiking cortical cell (from the INCF competition). (Fitting such models is an embarrassingly parallel problem.)

The precision of spike timing

In cortical neurons in vitro, the same constant current is injected 25 times: the timing of the first spike is reproducible, but the timing of the 10th spike is not. (Mainen & Sejnowski, 1995)

With IF neurons: gaussian white noise (the xi term)

from brian import *
N = 25
tau = 20 * ms
sigma = .015
eqs_neurons = '''
dx/dt=(1.1-x)/tau+sigma*(2./tau)**.5*xi : 1
'''
neurons = NeuronGroup(N, model=eqs_neurons, threshold=1, reset=0, refractory=5 * ms)
spikes = SpikeMonitor(neurons)
run(500 * ms)
raster_plot(spikes)
show()

With fluctuating current

The same temporally variable current is injected 25 times: spike timing is reproducible even after 1 s. The perfect integrator does not account for this property. (Mainen & Sejnowski, 1995; cortical neuron in vitro, somatic injection)

IF neurons and fluctuating current

tau_input = 5 * ms
input = NeuronGroup(1, model='dx/dt=-x/tau_input+(2./tau_input)**.5*xi : 1')
tau = 10 * ms
sigma = .015
eqs_neurons = '''
dx/dt=(0.9+.5*I-x)/tau+sigma*(2./tau)**.5*xi : 1
I : 1
'''
neurons = NeuronGroup(25, model=eqs_neurons, threshold=1, reset=0, refractory=5 * ms)
neurons.I = linked_var(input, 'x')
spikes = SpikeMonitor(neurons)
run(500 * ms)
raster_plot(spikes)
show()

Part II – Networks A few examples

Localization of prey by scorpions

Inhibition of the opposite neuron means more spikes on the side of the source (the circuit combines excitatory and inhibitory neurons; polar representation of firing rates). This is a conversion from a temporal code to a rate code.

Code: the model includes noise, delays and custom connectivity, with an interesting plot (pylab). Sturzl, W., R. Kempter, and J. L. van Hemmen (2000). Theory of arachnid prey localization. Physical Review Letters 84(24), 5668.

Sound localization by coincidence detection: the Jeffress model

A neuron with internal delay δ fires when its inputs are synchronous, i.e. when δ + dleft = dright: each neuron is tuned to a specific interaural delay. (Jeffress model plus echo suppression: Buerck 2007.)
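The coincidence-detection principle can be illustrated without spiking neurons. A plain NumPy sketch (all numbers illustrative) where each "detector" is a correlator with its own internal delay line: the detector whose delay exactly compensates the interaural time difference (ITD) responds most.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
sig = rng.standard_normal(n)     # white-noise "sound"
itd = 30                         # true interaural delay, in samples (illustrative)

def delayed(x, d):
    """Delay a signal by d samples (zero-padded at the start)."""
    return np.concatenate([np.zeros(d), x])[:len(x)]

left = sig                       # ear nearer the source
right = delayed(sig, itd)        # the far ear receives the sound later

# One coincidence detector per candidate internal delay: the detector whose
# delay line exactly compensates the ITD receives the most synchronous input.
scores = [float(np.dot(delayed(left, d), right)) for d in range(60)]
best = int(np.argmax(scores))    # internal delay of the winning detector
```

The winning internal delay recovers the ITD.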

Code: sound → receptors at the two ears → delay lines → coincidence detectors

defaultclock.dt = .02 * ms
sound = TimedArray(10 * randn(50000))  # white noise
max_delay = 20 * cm / (300 * metre / second)
angular_speed = 2 * pi * radian / second  # 1 turn/second
tau_ear = 1 * ms
sigma_ear = .1
eqs_ears = '''
dx/dt=(sound(t-delay)-x)/tau_ear+sigma_ear*(2./tau_ear)**.5*xi : 1
delay=distance*sin(theta) : second
distance : second  # distance to the centre of the head in time units
dtheta/dt=angular_speed : radian
'''
ears = NeuronGroup(2, model=eqs_ears, threshold=1, reset=0, refractory=2.5 * ms)
ears.distance = [-.5 * max_delay, .5 * max_delay]
traces = StateMonitor(ears, 'x', record=True)
N = 300
tau = 1 * ms
sigma = .1
eqs_neurons = '''
dv/dt=-v/tau+sigma*(2./tau)**.5*xi : 1
'''
neurons = NeuronGroup(N, model=eqs_neurons, threshold=1, reset=0)
synapses = Connection(ears, neurons, 'v', structure='dense', delay=True, max_delay=1.1 * max_delay)
synapses.connect_full(ears, neurons, weight=.5)
synapses.delay[0, :] = linspace(0 * ms, 1.1 * max_delay, N)
synapses.delay[1, :] = linspace(0 * ms, 1.1 * max_delay, N)[::-1]
spikes = SpikeMonitor(neurons)

Simulation of Jeffress model delays (sound turning around the head)

"Synfire chains": propagation of synchronous activity

Layers of excitatory neurons; each neuron is an integrate-and-fire neuron with noise (Diesmann et al., 1999). When the neurons in layer 1 are simultaneously activated, the activity propagates; if fewer neurons are activated, there is no propagation. (The term "synfire chain" was introduced by Abeles.)

Synfire chains

From layer to layer, a volley of spikes is characterized by a = number of spikes and σ = standard deviation of the spike times. Trajectories in the (σ, a) plane show two attractors: synchronous propagation and dissipation.

Code

Vr = -70 * mV
Vt = -55 * mV
taum = 10 * ms
taupsp = 0.325 * ms
weight = 4.86 * mV
eqs = '''
dV/dt=(-(V-Vr)+x)*(1./taum) : volt
dx/dt=(-x+y)*(1./taupsp) : volt
dy/dt=-y*(1./taupsp)+25.27*mV/ms+(39.24*mV/ms**0.5)*xi : volt
'''
# Neuron groups
P = NeuronGroup(N=1000, model=eqs, threshold=Vt, reset=Vr, refractory=1 * ms)
Pinput = PulsePacket(t=50 * ms, n=85, sigma=1 * ms)
# The network structure
Pgp = [P.subgroup(100) for i in range(10)]
C = Connection(P, P, 'y')
for i in range(9):
    C.connect_full(Pgp[i], Pgp[i + 1], weight)
Cinput = Connection(Pinput, Pgp[0], 'y')
Cinput.connect_full(weight=weight)
# Record the spikes
Mgp = [SpikeMonitor(p) for p in Pgp]
Minput = SpikeMonitor(Pinput)
monitors = [Minput] + Mgp
# Setup the network, and run it
P.V = Vr + rand(len(P)) * (Vt - Vr)
run(100 * ms)

Spontaneous activity in a ring

tau = 10 * ms
N = 100
v0 = 5 * mV
sigma = 4 * mV
group = NeuronGroup(N, model='dv/dt=(v0-v)/tau + sigma*xi/tau**.5 : volt',
                    threshold=10 * mV, reset=0 * mV)
C = Connection(group, group, 'v', weight=lambda i, j: .4 * mV * cos(2 * pi * (i - j) * 1. / N))
S = SpikeMonitor(group)
R = PopulationRateMonitor(group)
group.v = rand(N) * 10 * mV
run(5000 * ms)
subplot(211)
raster_plot(S)
subplot(223)
imshow(C.W.todense(), interpolation='nearest')
title('Synaptic connections')
subplot(224)
plot(R.times / ms, R.smooth_rate(2 * ms, filter='flat'))
title('Firing rate')
show()

Part III - Plasticity

Hebb's rule

"When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." (D. Hebb, 1949)

In short: when neuron A and neuron B are active, the weight wAB increases. Physiologically this is "synaptic plasticity": the PSP size is increased (or the transmission probability is increased).

Synaptic plasticity at spike level (STDP = Spike-Timing-Dependent Plasticity)

If the presynaptic spike precedes the postsynaptic spike (pre → post): potentiation. If it follows it (post → pre): depression. This causal rule favors synchronous inputs. (Dan & Poo, 2006)

Phenomenological model

Each synapse keeps two traces, A_pre and A_post, decaying with time constants τ_pre and τ_post. Presynaptic spike: A_pre is incremented and the weight changes by A_post (depression, for negative A_post increments). Postsynaptic spike: A_post is incremented and the weight changes by A_pre (potentiation).

Synaptic plasticity with Brian

synapses = Connection(input, neurons, 'ge')
eqs_stdp = '''
dA_pre/dt = -A_pre/tau_pre : 1
dA_post/dt = -A_post/tau_post : 1
'''
stdp = STDP(synapses, eqs=eqs_stdp, pre='A_pre+=dA_pre; w+=A_post',
            post='A_post+=dA_post; w+=A_pre', wmax=gmax)

wmax = maximum weight. Equivalently, with the predefined rule (dA_pre and dA_post relative to wmax):

synapses = Connection(input, neurons, 'ge')
stdp = ExponentialSTDP(synapses, tau_pre, tau_post, dA_pre, dA_post, wmax=gmax)

Complete code

The equations-oriented approach also applies to plasticity (STDP and short-term plasticity). There are lots of different STDP rules in the literature: you simply give the equations (predefined rules are also available), which solves the standardization issue. Song, S., K. D. Miller, and L. F. Abbott (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neurosci 3, 919-926.

A few properties of STDP

The rule is stable if depression dominates potentiation (demonstration with independent Poisson spike-train inputs). The stationary weight distribution is bimodal (demonstration with correlated inputs): a competitive mechanism. N.B.: the distribution is not bimodal if the weight modification is multiplicative.

Properties (2): correlated inputs are favored. (Figure: stationary synaptic weights for correlated vs. uncorrelated inputs.)

Properties (3): After convergence, firing is irregular (balanced regime)

Short-term synaptic plasticity

Synaptic efficacy depends on recent activity: depression (typically at excitatory → excitatory synapses) and facilitation (typically excitatory → inhibitory). (Dayan & Abbott p. 188; Markram & Tsodyks 1996; Markram et al. 1998)

Phenomenological model: the Tsodyks model

x = synaptic "resources", u = proportion of resources consumed by a spike; synaptic efficacy = u·x. At each presynaptic spike, x decreases by u·x (resource consumption: depression) and u increases (sensitization: facilitation). With Brian:

synapses = Connection(input, neurons, 'ge')
stp = STP(synapses, taud=50*ms, tauf=1*ms, U=0.6)
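The model can be sketched as a discrete map between spikes of a regular train (plain Python; parameter values illustrative, and the update order — facilitate, release, recover — is one common convention for the Tsodyks-Markram model):

```python
import math

def synaptic_efficacies(rate, n_spikes, taud, tauf, U):
    """Efficacy u*x of each spike in a regular train: u facilitates at each
    spike and the consumed resources u*x are subtracted from x; between
    spikes, x recovers toward 1 (taud) and u decays toward U (tauf)."""
    isi = 1.0 / rate
    x, u = 1.0, 0.0
    effs = []
    for _ in range(n_spikes):
        u += U * (1 - u)                          # facilitation at spike arrival
        effs.append(u * x)                        # this spike's efficacy
        x -= u * x                                # resource consumption (depression)
        x = 1 - (1 - x) * math.exp(-isi / taud)   # recovery of resources
        u = U + (u - U) * math.exp(-isi / tauf)   # decay of facilitation
    return effs

# Depression-dominated (slow recovery, fast u decay) vs
# facilitation-dominated (fast recovery, slow u decay) parameters
dep = synaptic_efficacies(20.0, 10, taud=0.5, tauf=0.01, U=0.5)
fac = synaptic_efficacies(20.0, 10, taud=0.05, tauf=0.5, U=0.1)
```

With the first parameter set successive PSPs shrink; with the second they grow.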

Example: facilitation (regular spike trains)

tau_e = 3 * ms
taum = 10 * ms
A_SE = 250 * pA
Rm = 100 * Mohm
N = 10
eqs = '''
dx/dt=rate : 1
rate : Hz
'''
input = NeuronGroup(N, model=eqs, threshold=1., reset=0)
input.rate = linspace(5 * Hz, 30 * Hz, N)
eqs_neuron = '''
dv/dt=(Rm*i-v)/taum : volt
di/dt=-i/tau_e : amp
'''
neuron = NeuronGroup(N, model=eqs_neuron)
C = Connection(input, neuron, 'i')
C.connect_one_to_one(weight=A_SE)
stp = STP(C, taud=1 * ms, tauf=100 * ms, U=.1)
trace = StateMonitor(neuron, 'v', record=[0, N - 1])
run(1000 * ms)
subplot(211)
plot(trace.times / ms, trace[0] / mV)
subplot(212)
plot(trace.times / ms, trace[N - 1] / mV)
show()

Python in 15 minutes

See also: http://docs.python.org/tutorial/ (live demo with IPython or IDLE)

The Python console

On Windows, open IDLE; on Linux, type python. Python is an interpreted, object-oriented language with dynamic typing and a garbage collector; whitespace matters (indentation signals structure), and there are many libraries.

Writing a script If you use IDLE, click on File>New window. Otherwise use any text editor. Press F5 to execute

A simple program

# Factorial function
def factorial(x):
    if x == 0:
        return 1
    else:
        return x * factorial(x-1)

print factorial(5)

Note: comments with #, function definition with an untyped argument, structure by indentation (a block = aligned instructions), the condition, the function call, and display with print.

Numerical objects

Base types: int, long, float, complex. Other numerical types (vectors, matrices) are defined in the NumPy library (covered in a few minutes).

x = 3+2
x += 1
y = 100L
z = x*(1+2j)
u = 2.3/7
x, y, z = 1, 2, 3
a = b = 123

(http://hetland.org/writing/instant-python.html)

Control structures

x = 12
if x < 5 or (x > 10 and x < 20):
    print "Good value."
if x < 5 or 10 < x < 20:
    print "Good value as well."

for i in [0,1,2,3,4]:
    print "Number", i
for i in range(5):    # iterates over the same list
    print "Number", i

while x >= 0:
    print "x is not always negative."
    x = x-1

Note the colon and the indentation; strings can use double or single quotes; see lists and range.

Lists

mylist = [1,7,8,3]
name = ["Jean","Michel"]
x = [1,2,3,[7,8],"fin"]    # heterogeneous list

The first element has index 0:

print name[0]
name[1] = "Robert"
print mylist[1:3]          # "slice": index 3 not included
print mylist[:3], mylist[:]
print mylist[-1]           # last element
print x[3][1]              # x[3] is a list

Beware of references: list2 = mylist makes list2 another name for the same list, so list2[1] = 0 also changes mylist. Methods (a list is an object):

name.append("Georges")
print mylist + name        # concatenate
print mylist * 2

Other methods: extend, insert, reverse, sort, remove...

List comprehensions

carres = [i**2 for i in range(10)]            # squares of the integers from 0 to 9
pairs = [i for i in range(10) if i % 2 == 0]  # even integers between 0 and 9

Strings

a = "Bonjour"
b = 'hello'
c = """ Une phrase
qui n'en finit pas """    # multiline string

print a + b
print a[3:7]              # a string is ≈ a list of characters
print b.capitalize()

Many methods: find, replace, split...

Dictionaries

dico = {'one': 1, 'two': 2, 'three': 'several'}   # key: value
print dico['three']
dico['four'] = 'many'
del dico['one']

for key in dico:      # iterates over all the keys
    print key, '->', dico[key]

Dictionaries also have many methods.

Functions

def power(x, exponent=2):        # default value
    return x**exponent

print power(3, 2)
print power(exponent=3, x=2)     # call with named arguments

carre = lambda x: x**2           # inline definition

Modules

import math            # loads the file 'math.py' or 'math.pyc' (compiled)
print math.exp(1)

from math import exp   # import only the exp object
print exp(1)

from math import *     # import everything

You can work with several files (= modules), each one defining any number of objects (variables, functions, classes).

Scipy & Pylab

SciPy & NumPy

Scientific libraries: syntax ≈ Matlab, many mathematical functions.

from scipy import *
x = array([1,2,3])                # vector
M = array([[1,2,3],[4,5,6]])      # matrix
M = ones((3,3))
z = 2*x
y = dot(M,x)                      # matrix product

from scipy.optimize import *
print fsolve(lambda x: (x-1)*(x-3), 2)

Vectors and matrices

The base type in SciPy is the array (= vector or matrix), initialized from lists:

from scipy import *
x = array([1,2,3])               # vector (1,2,3)
M = array([[1,2,3],[4,5,6]])     # 2x3 matrix with rows (1,2,3) and (4,5,6)
M = ones((3,2))                  # 3x2 matrix of ones
z = 2*x + 1
y = dot(M,x)                     # matrix product

Operations

x+y, x-y, x*y, x/y    # element-wise
x**2                  # element-wise square
exp(x), sqrt(x)       # element-wise
dot(x,y)              # dot product
dot(M,x)              # matrix product
M.T                   # transpose
M.max(), M.sum()
size(x)               # total number of elements
M.shape

Indexing

Vector indexing works like list indexing (first element = 0):

x[i]          # (i+1)th element
M[i,j]
x[i:j]        # slice from x[i] to x[j-1]
M[i,:]        # (i+1)th row
M[:,j]        # (j+1)th column
x[[1,3]]      # elements x[1] and x[3], e.g. x[[1,3]] = [0,1]
M[1,:] += x   # add vector x to the 2nd row of M

M[i,:] is a "view" on matrix M: a reference, not a copy.

y = M[0,:]
y[2] = 5      # M[0,2] is now 5
x = z
x[1] = 3      # z[1] is 3 as well
x = z.copy()  # an independent copy
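The view semantics described above can be checked directly (NumPy, with illustrative values):

```python
import numpy as np

M = np.array([[1., 2., 3.],
              [4., 5., 6.]])
y = M[0, :]        # a view: y shares memory with M
y[2] = 5           # ...so writing through y modifies M

x = np.array([1., 2., 3.])
z = x              # plain assignment: z is another name for the same array
z[1] = 3

c = x.copy()       # an independent copy
c[0] = 99          # does not touch x
```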

Construction

x = array([1,2,3])               # from lists
M = array([[1,2,3],[4,5,6]])
x = ones(5)                      # vector of 1s
M = zeros((3,2))                 # zero matrix
M = eye(3)                       # identity matrix
M = diag([1,3,7])                # diagonal matrix
x = rand(5)                      # random vector, uniform in (0,1)
x = randn(5)                     # random gaussian vector
x = arange(10)                   # 0,1,2,...,9
x = linspace(0,1,100)            # 100 numbers between 0 and 1

SciPy example: optimization. A simple example: a least-squares polynomial fit.

Vectorization

How to write efficient programs? Replace loops by vector operations (each Python operation in a loop must be interpreted):

for i in range(1000000):        # slow
    X[i] = 1
X = ones(1000000)               # fast

for i in range(1000000):
    X[i] = X[i]*2
X = X*2

for i in range(999999):
    Y[i] = X[i+1]-X[i]
Y = X[1:]-X[:-1]

for i in range(1000000):
    if X[i] > 0.5:
        Y[i] = 1
Y[X > .5] = 1
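The loop and vectorized forms compute exactly the same arrays; a self-contained check (NumPy, array sizes reduced for speed):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random(100000)

# Loop versions (each iteration is interpreted)
Y_loop = np.empty(len(X) - 1)
for i in range(len(X) - 1):
    Y_loop[i] = X[i + 1] - X[i]

Z_loop = np.zeros(len(X))
for i in range(len(X)):
    if X[i] > 0.5:
        Z_loop[i] = 1

# Vectorized versions: one array operation each
Y_vec = X[1:] - X[:-1]
Z_vec = np.zeros(len(X))
Z_vec[X > 0.5] = 1
```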

Pylab

Pylab

Plotting library: syntax ≈ Matlab, many plotting functions (plot, polar, ...).

from pylab import *
plot([1,2,3],[4,5,6])   # x values, y values
show()                  # last instruction of the script

More examples: http://matplotlib.sourceforge.net/gallery.html

Plotting with PyLab

hist(randn(1000))
contour(gaussian_filter(randn(100, 100), 5))
imshow(gaussian_filter(randn(100, 100), 5))
specgram(sin(10000*2*pi*linspace(0,1,44100)), Fs=44100)

Brian At last!

Online help

Books

Theoretical Neuroscience, by Dayan & Abbott, MIT Press
Spiking Neuron Models, by Gerstner & Kistler, Cambridge University Press
Dynamical Systems in Neuroscience, by Izhikevich, MIT Press
Biophysics of Computation, by Koch, Oxford University Press
Introduction to Theoretical Neurobiology, by Tuckwell, Cambridge University Press

romain.brette@ens.fr