Lecture 11: Networks II: conductance-based synapses, visual cortical hypercolumn model References: Hertz, Lerchner, Ahmadi, q-bio.NC/0402023 [Erice lectures]


Lecture 11: Networks II: conductance-based synapses, visual cortical hypercolumn model

References:
Hertz, Lerchner, Ahmadi, q-bio.NC/0402023 [Erice lectures]
Lerchner, Ahmadi, Hertz, q-bio.NC/ (Neurocomputing, 2004) [conductance-based synapses]
Lerchner, Sterner, Hertz, Ahmadi, q-bio.NC/ [orientation hypercolumn model]

Conductance-based synapses

In the previous model, the synaptic input was a current, I_i(t) = Σ_j J_ij Σ_s δ(t − t_j^s). But a synapse is a channel with a (neurotransmitter-gated) conductance, so the synaptic input is I_i(t) = Σ_j g_ij(t) [V_j^s − V_i(t)], where g_ij(t) = ĝ_ij Σ_s K(t − t_j^s) is the synaptically-filtered presynaptic spike train, with kernel K(t) = (1/τ_s) e^(−t/τ_s) for t ≥ 0 (and 0 otherwise).
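The synaptic filtering above can be sketched numerically: each presynaptic spike contributes an exponential kernel, and the sum gives the time-varying conductance. The function name and all parameter values below are illustrative, not taken from the lectures.

```python
import numpy as np

def filtered_spike_train(spike_times, t, tau_s=3.0):
    """Convolve a spike train with the normalized exponential
    kernel K(t) = exp(-t/tau_s)/tau_s (t >= 0)."""
    g = np.zeros_like(t)
    for ts in spike_times:
        mask = t >= ts
        g[mask] += np.exp(-(t[mask] - ts) / tau_s) / tau_s
    return g

t = np.linspace(0.0, 50.0, 5001)      # time grid (ms)
spikes = [5.0, 12.0, 13.0, 30.0]      # presynaptic spike times (ms)
g = filtered_spike_train(spikes, t, tau_s=3.0)
# K integrates to 1, so g integrates to roughly the spike count:
print(g.sum() * (t[1] - t[0]))        # ~ 4
```

Because the kernel is normalized, the mean of g over a long window is proportional to the presynaptic firing rate, which is what the mean-field equations below use.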

Model

Mean field theory

Effective single-neuron problem with synaptic input current I_a(t) = Σ_b g_ab(t) [V_b^s − V_a(t)], with mean conductances ḡ_ab ∝ N_b ĝ_ab τ_s r_b and fluctuations ⟨δg_ab(t) δg_ab(t′)⟩ ∝ C_b(t − t′), where C_b(t − t′) = correlation function of the synaptically-filtered presynaptic spike trains.

Balance condition

Total mean current = 0: Σ_b ḡ_ab (V_b^s − V̄_a) + g_L (V_L − V̄_a) = 0, and the mean membrane potential V̄_a sits just below threshold θ. Define effective couplings J_ab ∝ ĝ_ab (V_b^s − θ). Solve for r_b as in the current-based case ⇒ the rates are linear in the external input.
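The balance condition can be checked with a toy linear solve: given effective couplings J_ab (excitatory column positive, inhibitory column negative) and external drive, the balanced rates follow from J r + I_ext = 0 and scale linearly with the input. All numbers below are invented for illustration.

```python
import numpy as np

# Effective couplings J_ab ~ g_ab (V_b^s - theta); illustrative values.
J = np.array([[1.0, -2.0],    # onto E: from E, from I
              [1.5, -2.5]])   # onto I: from E, from I
I_ext = np.array([1.2, 1.0])  # external (LGN-like) drive to E and I

# Balance condition J @ r + I_ext = 0  =>  r = -J^{-1} I_ext
r = np.linalg.solve(J, -I_ext)
print(r)  # balanced rates, positive for this choice of parameters

# Linearity of the balanced state: doubling the input doubles the rates.
r2 = np.linalg.solve(J, -2.0 * I_ext)
assert np.allclose(r2, 2.0 * r)
```

The self-consistent rates exist only when the solution is positive, which constrains the sign structure of J (net inhibition must dominate appropriately), just as in the current-based case.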

High-conductance state

V_a "chases" V_a^s(t) at rate g_tot(t): dV_a/dt = g_tot(t) [V_a^s(t) − V_a(t)], with total conductance g_tot(t) = g_L + Σ_b g_ab(t) and V_a^s(t) the conductance-weighted average of the reversal potentials. Effective membrane time constant ~ 1 ms.
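A minimal sketch of this "chasing" dynamics, with invented values: Euler integration of dV/dt = g_tot (V_s − V) at g_tot = 1/ms shows V tracking a step in the effective reversal potential on the ~1 ms time scale.

```python
import numpy as np

dt = 0.01                     # time step (ms)
t = np.arange(0.0, 20.0, dt)
g_tot = 1.0                   # total conductance (1/ms) -> tau_eff ~ 1 ms
V_s = np.where(t < 10.0, -60.0, -50.0)  # step in effective reversal (mV)

V = np.empty_like(t)
V[0] = -60.0
for i in range(1, len(t)):
    # V relaxes toward V_s at rate g_tot
    V[i] = V[i-1] + dt * g_tot * (V_s[i-1] - V[i-1])

# 3 ms after the step, V has covered ~95% of the 10 mV gap:
print(V[t.searchsorted(13.0)])
```

With the large conductances of the balanced state, g_tot is big, so V follows the fluctuating V_s(t) almost instantaneously; this is the separation of time scales behind the high-conductance regime.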

Membrane potential and spiking dynamics for large g_tot

Fluctuations

Measure the membrane potential from its mean value: V_a(t) = V̄_a + δV_a(t). Conductances: mean + fluctuations, g_ab(t) = ḡ_ab + δg_ab(t). Use the balance equation in the membrane equation ⇒ dδV_a/dt ≈ −ḡ_tot δV_a + Σ_b δg_ab(t) (V_b^s − V̄_a), or dδV_a/dt = −δV_a/τ_eff + δI_a(t), with τ_eff = 1/ḡ_tot and δI_a(t) = Σ_b δg_ab(t) (V_b^s − V̄_a).
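If δI is approximated as white noise, the fluctuation equation above is an Ornstein-Uhlenbeck process, whose stationary variance is σ²τ_eff/2. A short Euler-Maruyama simulation with illustrative σ and τ_eff (not values from the lectures) reproduces this:

```python
import numpy as np

rng = np.random.default_rng(1)

# d(dV)/dt = -dV/tau_eff + dI(t), dI white noise of strength sigma.
tau_eff, sigma, dt, n = 1.0, 0.5, 0.01, 200_000
dV = np.zeros(n)
for i in range(1, n):
    dV[i] = dV[i-1] + dt * (-dV[i-1] / tau_eff) \
            + sigma * np.sqrt(dt) * rng.standard_normal()

# Stationary OU variance: sigma^2 * tau_eff / 2 = 0.125 here.
print(dV[n//2:].var())  # ~ 0.125
```

In the full mean-field treatment δI is colored (its correlations come from C_b), but the OU picture already shows why the membrane-potential fluctuations shrink as ḡ_tot grows.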

Effective current-based model

High connectivity: the conductance fluctuations are small relative to their means, so g_tot(t) ≈ ḡ_tot is approximately constant. The dynamics are then like those of a current-based model with effective couplings J_ab ∝ ĝ_ab (V_b^s − V̄) (but the effective membrane time constant depends on the presynaptic rates).

Firing irregularity depends on the reset level and on τ_s

Modeling primary visual cortex

Background: 1. Neurons in primary visual cortex (area V1) respond strongly to oriented stimuli (bars, gratings). Note: the tuning width is contrast-invariant.

Spatial organization of area V1

2. In V1, nearby neurons have similar orientation tuning.

Orientation column

~10^4 neurons that respond most strongly to a particular orientation. Tuning of the input from the LGN (Hubel-Wiesel):

Hubel-Wiesel feedforward connectivity cannot by itself explain contrast-invariant tuning

Simplest model: a cortical neuron sums its H-W inputs, and its firing rate is a threshold-linear function of the sum.
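This failure is easy to demonstrate numerically under the simplest threshold-linear reading of the feedforward model (tuned input c(1 + ε cos 2θ) with a fixed threshold; ε and the threshold are invented for illustration): the tuning half-width grows with contrast c, so the feedforward model alone is not contrast-invariant (the "iceberg effect").

```python
import numpy as np

theta = np.linspace(-np.pi / 2, np.pi / 2, 1801)
eps, threshold = 0.5, 1.0   # illustrative input anisotropy and threshold

def halfwidth(c):
    """Tuning half-width of a threshold-linear neuron at contrast c."""
    inp = c * (1.0 + eps * np.cos(2 * theta))   # tuned feedforward drive
    rate = np.maximum(inp - threshold, 0.0)     # threshold-linear output
    return theta[rate > 0].max() if (rate > 0).any() else 0.0

# Width broadens as contrast rises: more of the "iceberg" clears threshold.
print(halfwidth(1.0), halfwidth(4.0))
```

At low contrast only the peak of the tuned input clears threshold; at high contrast nearly the whole curve does, so the response width depends on c, contrary to experiment. This is the motivation for the recurrent hypercolumn model below.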

Modeling a “hypercolumn” in V1

Coupled collection of networks, each representing an “orientation column”.

Modeling a “hypercolumn” (2)

θ_0 is the stimulus orientation (the simplest model is periodic in θ with period π). The connection probability falls off with increasing |θ − θ′|, reflecting the probably greater distance between the columns.
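One way to sketch such orientation-dependent connectivity; the cosine parametrization and all constants below are assumptions for illustration, not the lectures' exact choice. Note the period-π symmetry required of anything defined on orientations.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_connect(dtheta, p0=0.2, beta=0.8):
    """Connection probability vs. orientation difference,
    periodic with period pi, maximal at dtheta = 0."""
    return p0 * (1.0 + beta * np.cos(2 * dtheta)) / (1.0 + beta)

n_cols = 12
thetas = np.arange(n_cols) * np.pi / n_cols     # preferred orientations
d = thetas[:, None] - thetas[None, :]           # pairwise differences
conn = rng.random((n_cols, n_cols)) < p_connect(d)

# Columns with similar preferred orientation connect most often.
print(conn.sum(axis=1).mean())   # mean number of inputs per column
```

Averaging over realizations of such a random graph is what turns the connectivity into the smooth kernel J_ab(θ − θ′) used in the mean-field equations.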

Mean field theory

Effective intracortical input current: mean plus fluctuations, with the fluctuation correlations set by the correlation functions C_b(θ′; t − t′) of the presynaptic columns. Solve self-consistently for the order parameters r_b(θ) and C_b(θ; t − t′).

Balance condition

Total mean current vanishes at all θ. Ignore the leak and make a continuum approximation in θ: Σ_b ∫ (dθ′/π) J_ab(θ − θ′) r_b(θ′) + I_a(θ − θ_0) = 0 ⇒ integral equations for r_a(θ). Can take θ_0 = 0.

Broad tuning

Make the ansatz r_b(θ) = r_b^0 + r_b^2 cos 2θ and use the corresponding Fourier components of the couplings and the input ⇒ the balance condition separates into linear equations for the Fourier components r_b^0 and r_b^2. Solve for the Fourier components. Valid for r_a^2 ≤ r_a^0 (otherwise r_a(θ) < 0 at large θ).

⇒ narrow tuning

Use the broad-tuning solution only for |θ| < θ_c, i.e., r_a(θ) = 0 for |θ| > θ_c. The cutoff θ_c is the same for both populations, a consequence of the same tuning parameter for both populations in the input and the same tuning parameter for all interactions in the couplings. Balance condition ⇒ equations restricted to the region |θ| < θ_c where the rates are positive.

Narrow tuning (2)

Now do the integrals, which run only over the region |θ′| < θ_c where the rates are nonzero ⇒ balance equations in terms of the functions f_0(θ_c) and f_2(θ_c). (Figure: f_0 and f_2 as functions of θ_c.)

Narrow tuning (3)

Divide one balance equation by the other: the ratio f_2(θ_c)/f_0(θ_c) equals the ratio of the input Fourier components, which determines θ_c. θ_c is independent of I_a^0: contrast-invariant tuning width (as in experiments). Then one can solve for the rate components.
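The contrast-invariance argument can be checked numerically under a standard ring-model form for the narrowly tuned rate, r(θ) ∝ cos 2θ − cos 2θ_c on |θ| < θ_c (an assumption here, not necessarily the lectures' exact expressions): the condition f_2/f_0 = const fixes θ_c from a ratio of input components alone, with no dependence on their overall scale.

```python
import numpy as np

def f_ratio(theta_c, n_pts=2001):
    """f_2/f_0 with f_n = integral of cos(2n*theta) * r(theta)
    over |theta| < theta_c, r(theta) = cos(2theta) - cos(2theta_c)."""
    th = np.linspace(-theta_c, theta_c, n_pts)
    r = np.cos(2 * th) - np.cos(2 * theta_c)   # vanishes at the edges
    dx = th[1] - th[0]
    f0 = r.sum() * dx
    f2 = (np.cos(2 * th) * r).sum() * dx
    return f2 / f0

def solve_theta_c(target, lo=1e-3, hi=np.pi / 2 - 1e-3):
    """Bisection for f_2/f_0 = target; the ratio decreases
    monotonically from ~1 (theta_c -> 0) to 0.5 (theta_c = pi/2)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if f_ratio(mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(solve_theta_c(0.6))   # tuning half-width in radians
```

Because the target is a ratio of input components, multiplying the input by any overall contrast factor leaves θ_c unchanged, which is exactly the contrast-invariant width.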

Noise tuning

Input noise correlations ⇒ the same integrals as in the rate computation ⇒ using the balance condition ⇒ the noise has the same tuning as the input!

Some numerical results (1)

Numerical results (2): Fano factor tuning

Numerical results (3): noise tuning vs firing tuning