Multiple attractors and transient synchrony in a model for an insect's antennal lobe Joint work with B. Smith, W. Just and S. Ahn.

Olfaction

Schematic of the bee olfactory system
- Antennal lobe
- Input from receptors
- Output
- Local interneurons (LNs)
- Projection neurons (PNs)
- Glomeruli (glom): sites of synaptic contacts

Neural Coding in OB/AL
- Each olfactory sensory cell expresses one of ~200 receptors (~50,000 sensory cells)
- Sensory cells that express the same receptor project to the same glomerulus
- Each odorant is represented by a unique combination of activated modules
- Highly predictive relationship between molecules, neural responses, and perception

Data: spatial and temporal
(Imaging and single-cell/population recordings; example odorants: orange oil, pentanol.)
- Different odors activate different areas of the antennal lobe
- Odorants with similar molecular structures activate overlapping areas (et al., Nature 1997)
- Population activity exhibits approx. 30 Hz oscillations
- Individual cells exhibit transient synchronization (dynamic clustering)

PNs respond differently to the same odor (Laurent, J. Neurosci. ’96)

Transient Synchronization of Spikes (Laurent, TINS ‘96)

What is the role of transient synchrony? Is the entire sequence of dynamic clusters important? “Decorrelation” of inputs (Laurent)

Neural activity patterns that represent odorants in the AL are statistically most separable at some point during the transient phase, well before they reach a final stable attractor. The transient phase may be more important than the attractor. (Mazor, Laurent, Neuron 2005)

Goal: Construct an excitatory-inhibitory network that exhibits:
- Transient synchrony
- A large number of attractors/transients
- Decorrelation of inputs

The Model
ASSUME: PNs can excite one another
- directly
- via interneurons
- via rebound

Reduction to discrete dynamics
Transient: linear sequence of activation
Period: stable, cyclic sequence of activation
(1,6) (4,5) (2,3,7) (1,5,6) (2,4,7) (3,6) (1,4,5)
Assume: a cell does not fire in consecutive episodes
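The episode-to-episode update can be sketched in Python. This is a hypothetical rendering of the slides' rule, assuming the convention that a cell's state counts episodes since it last fired: state 0 means "fires this episode", state p means "recovered and ready", and a ready cell fires when enough of its inputs are active.

```python
# Hypothetical rendering of the discrete rule (assumed state convention:
# state[i] counts episodes since cell i last fired; 0 = fires now,
# p = recovered and ready to fire).
def step(state, in_neighbors, p=1, theta=1):
    """One episode of the discrete dynamics.

    state[i] is in {0, ..., p}; in_neighbors[i] lists the cells that
    synapse onto cell i; theta is the firing threshold.
    """
    new = []
    for i, s in enumerate(state):
        if s < p:                       # refractory: recover one step
            new.append(s + 1)
        else:                           # ready: fire iff enough active inputs
            active = sum(1 for j in in_neighbors[i] if state[j] == 0)
            new.append(0 if active >= theta else p)
    return new

# A 3-cell directed ring 0 -> 1 -> 2 -> 0: one firing travels around the
# ring, so no cell fires in consecutive episodes.
ring = [[2], [0], [1]]                  # in_neighbors
assert step([0, 1, 1], ring) == [1, 0, 1]
```

With p = 1 and theta = 1 this reproduces the assumption above: a cell that fires (state 0) must spend the next episode recovering, so it cannot fire in consecutive episodes.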

Discrete Dynamics
(1,6) (4,5) (2,3,7) (1,5,6) (2,4,7) (3,6) (1,4,5)
This solution exhibits transient synchrony: 1 fires with 5 and 6; 1 fires with 4 and 6.
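For any deterministic update on a finite state space, the transient and the attractor of an orbit can be separated by iterating until a state repeats. A small helper in that spirit (the function name is hypothetical):

```python
def transient_and_cycle(step, init):
    """Iterate a deterministic map `step` from `init` over a finite state
    space; return (transient, cycle), where the orbit is the transient
    followed by the periodically repeating cycle (the attractor)."""
    seen = {}
    orbit = []
    s = init
    while s not in seen:
        seen[s] = len(orbit)
        orbit.append(s)
        s = step(s)
    k = seen[s]                  # first revisited state starts the cycle
    return orbit[:k], orbit[k:]

# Toy map: counting down to 0, which is a fixed point.
t, c = transient_and_cycle(lambda x: max(x - 1, 0), 3)
```

Applied to the discrete model's update, the returned transient is the initial linear sequence of activation and the cycle is the stable cyclic sequence.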

Network Architecture
(1,6) (4,5) (2,3,7) (1,5,6) (2,4,7) (3,6) (1,4,5)
Different transient, same attractor: (1,3,7) (4,5,6)
Different transient, different attractor: (1,2,5) (4,6,7) (2,3,5) (1,6,7) (3,4,5) (1,2,7) (3,4,5,6)

Network architecture
What is the complete graph of the dynamics? How many attractors and transients are there?

Analysis
How do the
- number of attractors
- length of attractors
- length of transients
depend on network parameters, including
- network architecture
- refractory period
- threshold for firing?

Numerics
(Plot: number of attractors vs. number of connections per cell.)
- There is a “phase transition” at sparse coupling.
- There is a huge number of stable attractors if the probability of coupling is sufficiently large.
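The experiment behind this plot can be sketched by exhaustive enumeration on small networks (the slides' simulations are far larger; `count_attractors` and its conventions are assumptions, with refractory period 1 and threshold 1):

```python
import itertools
import random

def count_attractors(n, k, seed=0):
    """Exhaustively count attractors of the discrete model (refractory
    period 1, threshold 1) on a random network in which each cell receives
    input from k other cells.  Feasible only for small n (2^n states)."""
    rng = random.Random(seed)
    nbrs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]

    def step(state):
        # state 0 = fires, state 1 = ready; a ready cell fires iff it has
        # at least one active input.
        return tuple(
            s + 1 if s < 1 else
            (0 if any(state[j] == 0 for j in nbrs[i]) else 1)
            for i, s in enumerate(state)
        )

    attractors = set()
    for init in itertools.product((0, 1), repeat=n):
        seen = set()
        s = init
        while s not in seen:          # iterate until the orbit revisits a state
            seen.add(s)
            s = step(s)
        cycle = [s]                   # s lies on the attractor; collect its cycle
        t = step(s)
        while t != s:
            cycle.append(t)
            t = step(t)
        # canonical representative: lexicographically least rotation
        attractors.add(min(tuple(cycle[r:] + cycle[:r])
                           for r in range(len(cycle))))
    return len(attractors)
```

Sweeping k for fixed n is one way to probe how the attractor count grows with the number of connections per cell; the all-quiescent state (every cell ready, none firing) is always a fixed point, so the count is at least 1.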

(Plots: length of transients and length of attractors, for the fraction of cells with refractory period 2 equal to 0.5 and to 0.)

Rigorous analysis
1) When can we reduce the differential equations model to the discrete model?
2) What can we prove about the discrete model?

Reducing the neuronal model to discrete dynamics
Given integers n (size of network) and p (refractory period), can we choose intrinsic and synaptic parameters so that for any network architecture, every orbit of the discrete model can be realized by a stable solution of the neuronal model?
Answer: No for purely inhibitory networks; Yes for excitatory-inhibitory networks.

100 cells, each cell connected to 9 cells. (Raster plots: discrete model vs. ODE model; cell number vs. time.)

Rigorous analysis of Discrete Dynamics
We have so far assumed that:
- Refractory period = p: if a cell fires, then it must wait p episodes before it can fire again.
- Threshold = 1: if a cell is ready to fire, then it will fire if it receives input from at least one other active cell.
We now assume that:
- refractory period of cell i = p_i
- threshold of cell i = θ_i
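The generalized rule with per-cell parameters can be sketched the same way (a sketch; the function name and state convention, with 0 = fires and p[i] = ready, are assumptions):

```python
# Generalized update: cell i has its own refractory period p[i] and its own
# firing threshold theta[i].
def step_hetero(state, in_neighbors, p, theta):
    """state[i] is in {0, ..., p[i]}; 0 = fires, p[i] = ready."""
    new = []
    for i, s in enumerate(state):
        if s < p[i]:                    # still refractory: recover one step
            new.append(s + 1)
        else:                           # ready: needs theta[i] active inputs
            active = sum(1 for j in in_neighbors[i] if state[j] == 0)
            new.append(0 if active >= theta[i] else p[i])
    return new
```

For example, two mutually coupled cells with p = [2, 1] and theta = [1, 1], started from [0, 1], reach the quiescent fixed point [2, 1]: the mismatch in refractory periods prevents either cell from finding an active input when it is ready.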

Question: How prevalent are minimal cycles? Does a randomly chosen state belong to a minimal cycle?

Need some notation. Example: indegree of vertex 5 = 3; outdegree of vertex 5 = 2.
Let ρ(n) = probability of connection. The following result states that there is a “phase transition” when ρ(n) ~ ln(n)/n.
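A directed random graph at this critical scaling can be sampled directly, with the in- and outdegrees read off for comparison against the theorems below (a sketch; the function name is hypothetical):

```python
import math
import random

def random_digraph(n, rho, seed=0):
    """Sample a directed graph on n vertices in which each ordered pair
    (i, j), i != j, is an arc independently with probability rho."""
    rng = random.Random(seed)
    arcs = {(i, j) for i in range(n) for j in range(n)
            if i != j and rng.random() < rho}
    indeg = [sum((j, i) in arcs for j in range(n)) for i in range(n)]
    outdeg = [sum((i, j) in arcs for j in range(n)) for i in range(n)]
    return arcs, indeg, outdeg

# At the critical scaling rho(n) ~ ln(n)/n the expected in/outdegree is ~ ln(n).
arcs, indeg, outdeg = random_digraph(200, math.log(200) / 200)
```

Every arc contributes one unit of indegree and one of outdegree, so the two degree sequences always sum to the number of arcs.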

Theorem 1: Let k(n) be any function such that k(n) − ln(n)/ln(2) → ∞ as n → ∞. Let D_n be any graph such that the indegree of every vertex is greater than k(n). Then the probability that a randomly chosen state lies in a minimal attractor tends to 1 as n → ∞.
Theorem 2: Let k(n) be any function such that ln(n)/ln(2) − k(n) → ∞ as n → ∞. Let D_n be any graph such that both the indegree and the outdegree of every vertex are less than k(n). Then the probability that a randomly chosen state lies in a minimal attractor tends to 0 as n → ∞.
A phase transition occurs when ρ(n) ~ ln(n)/n. The following result suggests another phase transition at ρ(n) ~ C/n.

Definition: Let s = [s_1, …, s_n] be a state. Then MC(s) ⊆ V_D is the set of neurons i such that s_i(t) is minimally cycling; that is, s_i(0), s_i(1), …, s_i(t) cycles through {0, …, p_i}. (Examples from the figure: MC = {5,7}; MC = {4,7}.)
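From a computed orbit, MC(s) can be read off directly. Assuming states count episodes since the last firing, a minimally cycling neuron's state advances by one each episode, wrapping from p_i back to 0 (a sketch; the function name is hypothetical):

```python
def minimally_cycling(orbit, p):
    """Return MC for an orbit, where orbit[t][i] is the state of neuron i
    at episode t and p[i] is its refractory period.  Neuron i is minimally
    cycling if its state sequence advances by one each episode, wrapping
    from p[i] back to 0 (it fires again as soon as it is ready)."""
    mc = set()
    for i in range(len(p)):
        seq = [orbit[t][i] for t in range(len(orbit))]
        if all(seq[t + 1] == (seq[t] + 1) % (p[i] + 1)
               for t in range(len(seq) - 1)):
            mc.add(i)
    return mc
```

For instance, on the orbit [0,1], [1,0], [0,1] with p = [1, 1] both neurons are minimally cycling, whereas a neuron that pauses in the ready state for even one episode is excluded.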

Theorem: Assume that each p_i < p and θ_i < θ. Fix ε ∈ (0,1). Then there exists C(p, θ, ε) such that if ρ(n) > C/n, then with probability tending to one as n → ∞, a randomly chosen state s will have MC(s) of size at least εn. That is: most states have a large set of minimally cycling nodes.