Bump attractors and the homogeneity assumption Kevin Rio NEUR 1680 28 April 2011.

Bump attractors
Can explain sustained (but bounded) activity in populations of neurons.
They provide a useful description of working memory, combining self-sustained activity with sensitivity to external inputs.

Firing-Rate Model
τ_r dr_i/dt = −r_i + [ Σ_j J_ij r_j + I_i(t) − Θ ]_+
–τ_r : time constant
–r_i(t) : firing rate of neuron i
–I_i(t) : input current to neuron i
–J_ij : synaptic strength between neurons i and j
–Θ : current bias

Firing-Rate Model
[x]_+ = x if x > 0, 0 otherwise (threshold-linear rectification)
J_ij = −J_0 + J_2 cos(2π(i−j)/N)
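A minimal numerical sketch of this ring model is below. All parameter values, and the uniform background drive I0 (added so the bump amplitude is well defined), are illustrative assumptions, not taken from the talk.

```python
import numpy as np

# Ring of rate neurons with cosine connectivity; a transient localized cue
# leaves behind a self-sustained bump of activity.
N = 100                       # neurons arranged on a ring
J0, J2 = 1.0, 3.0             # uniform inhibition, cosine-tuned excitation
tau, dt = 10.0, 1.0           # membrane time constant and Euler step (ms)
I0 = 1.0                      # uniform background drive (assumed)

idx = np.arange(N)
# J_ij = -J0 + J2 cos(2*pi*(i-j)/N), scaled by 1/N so the input stays O(1)
J = (-J0 + J2 * np.cos(2 * np.pi * (idx[:, None] - idx[None, :]) / N)) / N

r = np.zeros(N)
for t in range(3000):
    # localized cue at neuron 50 for the first 300 steps, then removed
    cue = 2.0 * np.exp(-((idx - 50) ** 2) / 50.0) if t < 300 else 0.0
    drive = J @ r + I0 + cue
    r += (dt / tau) * (-r + np.maximum(drive, 0.0))   # [x]_+ rectification

# long after the cue is removed, a localized bump remains, centered on the
# cue location; neurons far from the bump are silent
```

Because the connectivity depends only on the distance i−j, the bump can sit anywhere on the ring: this is the continuum of attractors that the homogeneity assumption buys.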

Homogeneity
Homogeneity is required to create a continuum of bump attractors.
–Even a small amount of heterogeneity destroys the continuum, leaving only a few discrete attractors.
This is biologically implausible: the number and strength of synaptic connections are variable.

Solutions
Fine-tune the properties of each neuron.
Or: let the network tune itself through an activity-dependent mechanism.
–"Activity-dependent scaling of synaptic weights, which up- or downregulates excitatory inputs so that the long-term average firing rate is similar for each neuron" (Renart, Song & Wang 2003).

Synaptic Scaling (Renart, Song & Wang 2003)
–τ_g : time constant (large, so scaling is slow)
–g(θ) : factor that multiplies the excitatory synaptic conductances to neuron θ
–r(θ) : instantaneous firing rate of neuron θ
–r_tg(θ) : target firing rate of neuron θ
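A toy sketch of the scaling idea: each neuron slowly adjusts a multiplicative gain g on its excitatory input until its rate matches a target. The threshold-linear rate model and all parameter values here are illustrative assumptions, a simplified stand-in for the paper's actual equations.

```python
import numpy as np

# Heterogeneous excitatory weights; homeostatic gains g equalize the rates.
N = 50
rng = np.random.default_rng(0)
w = 0.5 + 1.5 * rng.random(N)        # heterogeneous excitatory weights
r_tg = 5.0                           # common target firing rate
tau_g, dt = 500.0, 1.0               # scaling is much slower than the rates

g = np.ones(N)
for _ in range(20000):
    r = np.maximum(10.0 * g * w - 2.0, 0.0)   # instantaneous firing rates
    g += (dt / tau_g) * (r_tg - r)            # slow homeostatic update
    g = np.maximum(g, 0.0)                    # gains stay non-negative

r = np.maximum(10.0 * g * w - 2.0, 0.0)
# despite the heterogeneous weights, every rate converges to the target;
# g ends up larger for weakly driven neurons and smaller for strong ones
```

The separation of timescales (τ_g much larger than the rate dynamics) is what lets the scaling act as a slow background process without disturbing the fast network dynamics.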

Project Outline
1. Simulate a network of firing-rate neurons.
2. Observe bump attractors.
3. Show how loss of symmetry destroys the continuum of bump attractors.
4. Restore symmetry by activity-dependent scaling of synaptic weights.

Sequential Modeling in the Auditory System
By Rohan Ramesh and Srihari Sritharan

Recurrent Networks
A three-layer network with feedback: input layer, intermediate layer, and output layer.
The intermediate layer in this instance consists of SAM cells.

Spike Accumulation Model (SAM)
The variable of importance is the accumulated potential of the SAM cells in the intermediate layer.
–The accumulated potential is a constantly updating characterization of the incoming stream of sensory input.

Membrane Time Constant
Sequential memory depends on the membrane time constant (τ) of the SAM cells.
–The response to stimuli 1 → 2 differs from 2 → 1 because the neuronal response depends on recent history.
–The activity of the entire population of SAM cells is important.
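Why a finite time constant makes the accumulated potential order-sensitive can be seen in a toy calculation. The leaky-integration update and the tone amplitudes below are illustrative assumptions, not the talk's actual SAM equations.

```python
# With a finite time constant, the most recent input dominates a leaky
# integrator's state, so tone 1 -> tone 2 leaves a different trace than
# tone 2 -> tone 1 even though the total input is the same.
def accumulate(inputs, tau=20.0, dt=1.0):
    v = 0.0
    for x in inputs:
        v += (dt / tau) * (-v + x)   # leaky integration of the input stream
    return v

tone1 = [1.0] * 50   # tone 1: 50 steps at unit amplitude
tone2 = [2.0] * 50   # tone 2: 50 steps at double amplitude

v12 = accumulate(tone1 + tone2)   # tone 1 then tone 2
v21 = accumulate(tone2 + tone1)   # tone 2 then tone 1
# v12 and v21 differ: the order of the tones is encoded in the potential
```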

Output
Sequential learning is decoded from the input from the SAM cells AND an efferent copy of the output-layer activity fed back to the intermediate layer.
–How do we decode this? How does the efferent feedback loop aid decoding?
–Activation of an output cell:

Learning
How will the network learn a series of tones?
The input to the SAM cells is determined by comparing the output of the output cells to the output of the input cells.

Encode Specificity and Timing
Specificity – tone 1 vs. tone 2
–Auditory tuning within the SAM cells could determine the frequency of the input.
Timing – tone 1 → tone 2 OR tone 2 → tone 1
–The membrane time constant and the efferent copy of the output help determine the order of the tones.

Auditory Input

Auditory Tuning Curves

Supervised learning with lateral interaction and backpropagation within a neural network
4/27/11 – Sunmee Park, Jing Wang, Rizwan Huq
Computational Neuroscience 2011

Motivation
Natural vision can differentiate various handwritten digits. Can we mimic the visual system? (Of course, in a simpler way.)
Supervised learning: given input/output pairs
–Traditional backpropagation neural network (NN)
–Biophysically appropriate inter-layer communication

Network diagram

Methods
Input dataset:
–Handwritten digits 0–9, collected from the USPS/NIST databases
–8-bit grayscale images (1100 examples of each class)
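As a baseline for the backpropagation component, a minimal one-hidden-layer network might look like the following. The toy data, architecture, and hyperparameters are illustrative stand-ins, not the actual USPS/NIST pipeline.

```python
import numpy as np

# One-hidden-layer network trained by backpropagation on toy binary
# patterns (16 "pixels" per example, label = whether the mean pixel > 0.5).
rng = np.random.default_rng(1)
X = rng.random((40, 16))                                   # 40 toy "images"
y = (X.mean(axis=1) > 0.5).astype(float).reshape(-1, 1)    # toy labels

W1 = 0.5 * rng.standard_normal((16, 8))
W2 = 0.5 * rng.standard_normal((8, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(2000):
    h = sigmoid(X @ W1)                          # forward pass
    p = sigmoid(h @ W2)
    losses.append(float(np.mean((p - y) ** 2)))  # mean squared error
    dp = 2.0 * (p - y) * p * (1 - p) / len(X)    # backward pass (chain rule)
    dW2 = h.T @ dp
    dh = (dp @ W2.T) * h * (1 - h)
    dW1 = X.T @ dh
    W2 -= 1.0 * dW2                              # gradient-descent step
    W1 -= 1.0 * dW1

# the training loss falls well below its initial value
```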

Preprocessing
Applying a Gabor filter
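The Gabor step can be sketched directly: a Gabor kernel is a sinusoid windowed by a Gaussian, and it responds strongly only to structure at its preferred orientation. The kernel size, wavelength, and σ below are illustrative choices.

```python
import numpy as np

def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along the grating
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return envelope * np.cos(2.0 * np.pi * xr / wavelength)

# vertical grating patch whose period matches the kernel wavelength
xs = np.arange(9) - 4
patch = np.tile(np.cos(2.0 * np.pi * xs / 4.0), (9, 1))

resp_vert = float(np.sum(gabor_kernel(theta=0.0) * patch))
resp_horz = float(np.sum(gabor_kernel(theta=np.pi / 2) * patch))
# the orientation-matched kernel gives a far larger response than the
# orthogonal one, which is what makes a Gabor bank useful for preprocessing
```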

Deliverables
A simulated neural system capable of decoding handwritten image data and identifying the input digit.
System performance curves, for which we will reserve a portion of the examples.
Analysis of the impact of inter-layer communications.

Kuhn et al. 2004: Neuronal Integration of Synaptic Input in the Fluctuation-Driven Regime
Project Team: Tommy Tea and Christina Hahn

Paper Recap
Visual synaptic bombardment leads to changes in conductance, which in turn increase fluctuations in membrane potential; these fluctuations modify the firing rate.
The firing rate decreases because the membrane potential fluctuations are shunted, and increases because shorter membrane time constants allow faster membrane potential fluctuations.

Methods
–Transient current-based model:
–Transient conductance-based model:
–Firing-rate model:
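The time-constant half of the recap's argument is back-of-the-envelope arithmetic. The capacitance and conductance values below are illustrative assumptions, not the paper's.

```python
import math

# Synaptic bombardment adds conductance, which shortens the effective
# membrane time constant (tau = C / g_total) while shunting voltage
# fluctuations: for stationary white-noise current input of intensity D,
# sigma_V^2 = D / (2 * C * g), so the s.d. shrinks like 1/sqrt(g).
C = 250.0          # membrane capacitance (pF)
g_rest = 25.0      # resting conductance (nS) -> tau = 10 ms
g_driven = 100.0   # total conductance under synaptic bombardment (nS)

tau_rest = C / g_rest        # 10.0 ms
tau_driven = C / g_driven    # 2.5 ms: a faster, more responsive membrane
shunt = math.sqrt(g_driven / g_rest)   # fluctuation s.d. shrinks 2x
```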

Figures of model with current input increasing monotonically

Figures of model with non-monotonic conductance input:

Physiological Significance
In the conductance-based model, the fluctuations are expected to reach a maximum standard deviation of ~3 mV, which is within the range of values observed in vivo.