Presentation transcript:

Jochen Triesch, UC San Diego, 1
Local Stability Analysis
Step One: find stationary point(s).
Step Two: linearize around all stationary points (using a Taylor expansion). The eigenvalues of the linearized problem determine the nature of each stationary point:
- Real parts: positive means growth of fluctuations (instability); negative means decay of fluctuations (stability).
- Imaginary parts: if present, solutions are oscillatory (spiraling); they spiral inward or outward if the real parts are non-zero.
Overall: a stationary point is (asymptotically) stable if all real parts are negative.
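The two steps can be carried out numerically. Below is a minimal MATLAB sketch (not from the slides): it linearizes a system dx/dt = f(x) around a candidate stationary point with a finite-difference Jacobian and classifies the point by the eigenvalues; the example system and the step size dx are illustrative assumptions.

f = @(x) [x(2); -x(1) - 0.1*x(2)];       % hypothetical example: a damped oscillator, stationary point at the origin
xstar = [0; 0];                          % Step One: stationary point, f(xstar) = 0
n = numel(xstar); J = zeros(n); dx = 1e-6;
for k = 1:n                              % Step Two: finite-difference linearization (Jacobian)
    e = zeros(n,1); e(k) = dx;
    J(:,k) = (f(xstar + e) - f(xstar)) / dx;
end
lambda = eig(J);                         % eigenvalues of the linearized problem
if all(real(lambda) < 0)
    disp('all real parts negative -> asymptotically stable');
else
    disp('some real part >= 0 -> not asymptotically stable');
end
if any(imag(lambda) ~= 0)
    disp('non-zero imaginary parts -> oscillatory (spiraling) solutions');
end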

Jochen Triesch, UC San Diego, 2
Examples of nonlinear activation functions (transfer functions):
a. "half-wave rectification"
b. "sigmoidal function"
c. rectified hyperbolic tangent
Note: we will typically consider the activation function as a fixed property of our model neurons, but real neurons can change their intrinsic properties.
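A minimal MATLAB sketch of the three example nonlinearities; the slide's plots are not preserved, so the unit-slope, unit-scale forms below are assumptions chosen only to illustrate their shapes.

halfwave = @(x) max(x, 0);               % a. half-wave rectification [x]_+
sigmoid  = @(x) 1 ./ (1 + exp(-x));      % b. sigmoidal (logistic) function
recttanh = @(x) max(tanh(x), 0);         % c. rectified hyperbolic tangent
x = linspace(-3, 3, 200);
plot(x, halfwave(x), x, sigmoid(x), x, recttanh(x));
legend('half-wave rectification', 'sigmoid', 'rectified tanh');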

Jochen Triesch, UC San Diego, 3
The Naka-Rushton function
A good fit for the steady-state firing rate of neurons in several visual areas (LGN, V1, middle temporal) in response to a visual stimulus of contrast P is given by the Naka-Rushton function
r(P) = r_max P^N / (P^N + P_1/2^N)   for P >= 0.
P_1/2, the "semi-saturation", is the stimulus contrast (intensity) that produces half of the maximum firing rate r_max. N determines the slope of the non-linearity at P_1/2. (Albrecht and Hamilton, 1982)
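A one-line MATLAB version of the Naka-Rushton non-linearity; the values of r_max, P_1/2 and N below are illustrative assumptions, not values from the slides.

rmax = 100; Phalf = 20; N = 2;                              % assumed illustrative parameters
naka = @(P) rmax * max(P,0).^N ./ (max(P,0).^N + Phalf^N);  % Naka-Rushton function
naka(Phalf)                                                 % returns rmax/2: the semi-saturation contrast gives half the maximum rate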

Jochen Triesch, UC San Diego, 4
Interaction of Excitatory and Inhibitory Neuronal Populations
(Diagram: an excitatory population v_E and an inhibitory population v_I, coupled through the weights M_EE, M_EI, M_IE.)
Dale's law: every neuron is either excitatory or inhibitory, never both.
Motivations:
- understand the emergence of oscillations in excitatory-inhibitory networks
- learn about local stability analysis
Consider 2 populations of excitatory and inhibitory neurons with firing rates v_E and v_I.

Jochen Triesch, UC San Diego, 5
Mathematical formulation ([ ]_+ denotes half-wave rectification):
tau_E dv_E/dt = -v_E + [ M_EE v_E + M_EI v_I - gamma_E ]_+
tau_I dv_I/dt = -v_I + [ M_IE v_E + M_II v_I - gamma_I ]_+
Parameters: M_EE = 1.25, M_EI = -1, gamma_E = -10 Hz, tau_E = 10 ms; M_II = 0, M_IE = 1, gamma_I = 10 Hz, tau_I varying.
Stationary point: set dv_E/dt = dv_I/dt = 0 and solve the resulting equations for (v_E, v_I).
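A minimal MATLAB sketch integrating the firing-rate equations above with forward Euler; the time step, simulation length and initial rates are illustrative assumptions, and tau_I is set to 50 ms to show the oscillatory regime discussed on the following slides.

MEE = 1.25; MEI = -1; gE = -10; tauE = 0.010;   % excitatory parameters (gE = gamma_E)
MII = 0;    MIE =  1; gI =  10; tauI = 0.050;   % inhibitory parameters, tau_I = 50 ms
dt = 1e-4; T = 0:dt:1;                          % 1 s of simulated time
vE = zeros(size(T)); vI = zeros(size(T));
vE(1) = 30; vI(1) = 20;                         % arbitrary initial firing rates
for t = 1:numel(T)-1
    vE(t+1) = vE(t) + dt/tauE * (-vE(t) + max(MEE*vE(t) + MEI*vI(t) - gE, 0));
    vI(t+1) = vI(t) + dt/tauI * (-vI(t) + max(MIE*vE(t) + MII*vI(t) - gI, 0));
end
plot(T, vE, T, vI); xlabel('time (s)'); ylabel('firing rate (Hz)'); legend('v_E', 'v_I');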

Jochen Triesch, UC San Diego, 6
Phase Portrait
A: The stationary point is the intersection of the nullclines (zero-isoclines). Arrows indicate the direction of flow in different areas of the phase space (state space).
B: Real and imaginary parts of the eigenvalues as a function of tau_I.

Jochen Triesch, UC San Diego, 7
Linearization around the stationary point (where both rectification brackets are positive) gives the matrix
A = [ (M_EE - 1)/tau_E    M_EI/tau_E
      M_IE/tau_I          (M_II - 1)/tau_I ]
with eigenvalues
lambda_1,2 = 1/2 { (M_EE - 1)/tau_E + (M_II - 1)/tau_I ± sqrt( [ (M_EE - 1)/tau_E - (M_II - 1)/tau_I ]^2 + 4 M_EI M_IE / (tau_E tau_I) ) }.
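A short MATLAB check of this analysis: it evaluates the eigenvalues of A over a range of tau_I and reports where the largest real part changes sign (the critical value of about 40 ms quoted on the next slides). The scan range and step are arbitrary choices.

MEE = 1.25; MEI = -1; tauE = 0.010; MII = 0; MIE = 1;   % parameters from slide 5
tauIs = (1:100) * 1e-3;                                 % scan tau_I from 1 ms to 100 ms
maxRe = zeros(size(tauIs));
for k = 1:numel(tauIs)
    tauI = tauIs(k);
    A = [ (MEE-1)/tauE,  MEI/tauE;                      % linearization around the stationary point
          MIE/tauI,      (MII-1)/tauI ];
    maxRe(k) = max(real(eig(A)));
end
kc = find(maxRe > 0, 1);
fprintf('real part becomes positive near tau_I = %g ms\n', 1e3*tauIs(kc));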

Jochen Triesch, UC San Diego, 8
For tau_I below the critical value of 40 ms, the eigenvalues have negative real parts: we see damped oscillations. The trajectory spirals in to the stable fixed point.

Jochen Triesch, UC San Diego, 9
When tau_I grows beyond the critical value of 40 ms, a Hopf bifurcation occurs (here tau_I = 50 ms): stable fixed point → unstable fixed point + limit cycle. The amplitude of the oscillation grows until the non-linearity "clips" it.

Jochen Triesch, UC San Diego, 10
Neural Oscillations
- interaction of excitatory and inhibitory neuron populations can lead to oscillations
- very important in, e.g., locomotion: rhythmic walking and swimming motions are driven by Central Pattern Generators (CPGs)
- also very important in the olfactory system (selective amplification)
- also oscillations in the visual system: their functional role is hotly debated. Proposed as a solution to the binding problem. Idea: neural populations that represent features of the same object synchronize their firing.

Jochen Triesch, UC San Diego, 11
Binding Problem
The visual system has separate "what" and "where" ("how") pathways: how do you know what is where?
(Figure: a circle and a triangle in the visual field, one moving up and one moving down; in the neural representation, spike trains of the corresponding populations signal binding through synchronization: yes vs. no.)

Jochen Triesch, UC San Diego, 12
Competition and Decisions
Motivation: the ability to decide between alternatives is fundamental.
Idea: inhibitory interaction between neuronal populations representing different alternatives is a plausible candidate mechanism.
The simplest such system: a winner-take-all (WTA) network of two units that receive inputs K_1 and K_2 and inhibit each other.

Jochen Triesch, UC San Diego, 13
Stationary States and Stability
The stationary states for K_1 = K_2 = 120:
- e_1 = 50, e_2 = 0
- e_2 = 50, e_1 = 0
- e_1 = e_2 = 20
Linear stability analysis (tau = 20 ms):
1) for e_1 = 50, e_2 = 0: both eigenvalues of the linearization are negative → "stable node"
2) for e_1 = e_2 = 20: the eigenvalues have opposite signs → "unstable saddle"

Jochen Triesch, UC San Diego, 14
Matlab Simulation
Behavior for strong identical input, K_1 = K_2 = K = 120: one unit wins the competition and completely suppresses the other.
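A hedged MATLAB sketch of such a simulation. The network equations themselves are not shown in the transcript; the form below, with a Naka-Rushton nonlinearity (r_max = 100, semi-saturation 120, N = 2), mutual inhibitory weight 3 and tau = 20 ms, is an assumption chosen because it reproduces the stationary states e = (50, 0) and e = (20, 20) listed on the previous slide.

S   = @(P) 100 * max(P,0).^2 ./ (120^2 + max(P,0).^2);  % assumed Naka-Rushton nonlinearity
tau = 0.020; w = 3; K = [120; 120];                     % strong identical inputs K_1 = K_2 = 120
dt = 1e-3; T = 0:dt:1;
e = zeros(2, numel(T)); e(:,1) = [20.5; 19.5];          % start near (20, 20); a tiny asymmetry breaks the tie
for t = 1:numel(T)-1
    inp = K - w * flipud(e(:,t));                       % each unit inhibits the other one
    e(:,t+1) = e(:,t) + dt/tau * (-e(:,t) + S(inp));
end
plot(T, e); xlabel('time (s)'); ylabel('firing rate (Hz)'); legend('e_1', 'e_2');
% the unit with the small initial advantage approaches 50 Hz and drives the other to 0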

Jochen Triesch, UC San Diego, 15
Continuous Neural Fields
So far: individual units with specific connectivity patterns.
Idea: abstract from individual neurons to continuous fields of neurons, where synaptic weight patterns become homogeneous interaction kernels.
Variant 1: continuous labeling of the input or output domain.
Variant 2: continuous labeling of two-dimensional cortical space.

Jochen Triesch, UC San Diego, 16
Recurrent Simple Cell Model
Question: how is orientation selectivity achieved? (feedforward vs. recurrent accounts)

Jochen Triesch, UC San Diego, 17
Classic Hubel and Wiesel Model
- a simple cell sums input from geniculate On and Off cells in a particular constellation
- a complex cell sums inputs from simple cells with the same orientation but different phase preferences

Jochen Triesch, UC San Diego, 18
Recurrent Model
Stimulus with orientation angle θ = 0. A: amplitude, c: contrast, ε: small nonlinear amplification.

Jochen Triesch, UC San Diego, 19
Superior Colliculus and Saccades
Representation of the saccade motor command in the superior colliculus: vector averaging.
(Figure: eye-movement scan paths, after Yarbus.)

Jochen Triesch, UC San Diego, 20
A Simple Model of Saccade Target Selection
Question: how do you select the target of your next saccade?
Idea: competitive "blob" dynamics in a 2-dimensional "neural field":
- a layer of non-linear units with local excitation
- a linear unit for global inhibition

Jochen Triesch, UC San Diego, 21
Stability Analysis of Saccade Model
Step 1: look for homogeneous stationary solutions.
Step 2: find the range of β for which the homogeneous stationary solution becomes unstable.
Step 3: simulate the system (Matlab), observe the behavior.
Step 4: estimate the size of the resulting blob as a function of β.
Reminder: the convolution of an interaction kernel g with an activity pattern a is (g * a)(x) = ∫ g(x − x') a(x') dx'.
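A small MATLAB illustration of the discrete 2-D convolution used by the model (the code fragment on a later slide calls conv2 for this); the Gaussian kernel width sigma is an illustrative assumption.

sigma = 2;                                    % assumed kernel width
[xx, yy] = meshgrid(-10:10);
g = exp(-(xx.^2 + yy.^2) / (2*sigma^2));      % local excitatory interaction kernel (peak 1)
a = zeros(50); a(25,25) = 1;                  % activity pattern: a single active unit
I = conv2(a, g, 'same');                      % excitation spread to the unit's neighborhood
imagesc(I); axis image; colorbar;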

Jochen Triesch, UC San Diego, 22
Example Run
Initialization: 10 random spots of small activity, I = 0, η small Gaussian i.i.d. noise.
(Figure: snapshots of the layer activity over time.)
Result: a blob of activity forms at a location determined by the initial state and the noise.

Jochen Triesch, UC San Diego, 23
Results of Analysis
Step 1: look for homogeneous stationary solutions. h_0 = 0 works; β > 1/A prevents a fully active layer (A = area of the layer).
Step 2: find the range of β for which the homogeneous stationary solution becomes unstable. For a small local fluctuation away from h_0 = 0 to grow, we need β < 1/(2πσ²).
Step 3: simulate the system (Matlab), observe the behavior: a single blob of activity forms and suppresses all other activity in the layer.
Step 4: estimate the size of the resulting blob as a function of β and σ.

Jochen Triesch, UC San Diego, 24
Matlab Code Fragments

% parameters (sigma, alpha, beta and noise are not given on the slide;
% the values below are illustrative placeholders)
n = 50;                                     % the layer has n x n units
sigma = 2;                                  % width of the local excitatory kernel
[xx, yy] = meshgrid(-10:10);
g = exp(-(xx.^2 + yy.^2) / (2*sigma^2));    % Gaussian interaction kernel
alpha = 0.1;                                % update rate
beta = 0.02;                                % global inhibition; satisfies 1/A < beta < 1/(2*pi*sigma^2)
noise = 0.01;                               % std of the additive Gaussian noise

% initialize layer: 10 random spots of small activity
h = zeros(n, n);
for i = 1:10
    x = unidrnd(n);
    y = unidrnd(n);
    h(x,y) = h(x,y) + 0.05;
end

% main loop
while (1)
    active = double(h > 0);                                   % units above threshold
    I = conv2(active, g, 'same') - beta*sum(sum(active));     % local excitation minus global inhibition
    h = (1-alpha)*h + alpha*I + normrnd(0, noise, n, n);      % leaky update plus noise
    % display plots, etc.
    pause
end

Jochen Triesch, UC San Diego, 25
Discussion of Saccade Model
Positive:
- roughly consistent with anatomy/physiology
- explains how several close-by targets can win over a strong but isolated target
- suggests why the time to decision is longer in situations with several equally strong targets
- similar models are used in modeling human performance in visual search tasks
Limitations:
- only a qualitative account
- in order to make precise quantitative predictions, it is typically necessary to take more physiological details into account, which are mostly unknown: exact connectivity patterns, non-linearities, the fact that more than one area is involved, and what all the inputs are

Jochen Triesch, UC San Diego, 26
Connection to Maximum Likelihood Estimation
So far: a purely bottom-up view: networks with this connectivity structure just happen to exhibit this behavior, and this may be analogous to what the brain does.
New idea: use such dynamics to do Maximum Likelihood estimation. Want: an estimate of the stimulus parameter Θ from the noisy input to a 1-d "blob" network (r: firing rate vector, Θ: stimulus parameter).
Population vector decoding: the estimate points in the direction of Σ_a r_a c_a, where c_a is the preferred stimulus vector for unit a.
Claim: blob dynamics + vector decoding works better than doing direct vector decoding on the noisy inputs.
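A minimal MATLAB sketch of population vector decoding on its own (without the blob dynamics); the tuning curves, noise level and number of units are illustrative assumptions, not taken from the slides.

nUnits = 64;
prefs  = linspace(-pi, pi, nUnits+1); prefs(end) = [];   % preferred angles of the units
theta  = 0.3;                                            % true stimulus parameter
r = exp(cos(prefs - theta) - 1) + 0.2*randn(1, nUnits);  % noisy tuned responses
% each unit votes with its preferred stimulus vector c_a = (cos, sin), weighted by r_a
est = atan2(sum(r .* sin(prefs)), sum(r .* cos(prefs)));
fprintf('true theta = %.3f, population vector estimate = %.3f\n', theta, est);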

Jochen Triesch, UC San Diego, 27
Binocular Rivalry, Bistable Percepts
Idea: extend the WTA network by a slow adaptation mechanism. Adaptation acts to increase the semi-saturation of the Naka-Rushton non-linearity.
(Figures: an ambiguous figure; binocular rivalry between the left (L) and right (R) eye's images.)
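A hedged MATLAB sketch of this idea: the assumed WTA model from the earlier slides extended with a slow adaptation variable a_i that is driven by each unit's activity and added to the semi-saturation of its Naka-Rushton nonlinearity. The adaptation gain and time constant, and all other parameter values, are illustrative assumptions, not taken from the slides.

S    = @(P, a) 100 * max(P,0).^2 ./ ((120 + a).^2 + max(P,0).^2);  % semi-saturation raised by adaptation a
tau  = 0.020; tauA = 0.900; gA = 3; w = 3; K = [120; 120];         % assumed parameters
dt = 1e-3; T = 0:dt:10;
e = zeros(2, numel(T)); a = zeros(2, numel(T));
e(:,1) = [30; 10];                                                 % start with one unit dominant
for t = 1:numel(T)-1
    inp = K - w * flipud(e(:,t));                                  % mutual inhibition as in the WTA model
    e(:,t+1) = e(:,t) + dt/tau  * (-e(:,t) + S(inp, a(:,t)));
    a(:,t+1) = a(:,t) + dt/tauA * (-a(:,t) + gA * e(:,t));         % slow, activity-driven adaptation
end
plot(T, e); xlabel('time (s)'); ylabel('firing rate (Hz)'); legend('e_1 (L)', 'e_2 (R)');
% with suitable gA and tauA the dominant unit adapts, the suppressed unit escapes,
% and dominance alternates on a timescale set by tauA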

Jochen Triesch, UC San Diego, 28 Matlab Simulation β=1.5

Jochen Triesch, UC San Diego, 29
Discussion of Rivalry Model
Positive:
- roughly consistent with anatomy/physiology
- offers a parsimonious mechanism for different perceptual switching phenomena; in a sense it "unifies" different phenomena by explaining them with the same mechanism
Limitations:
- provides only a qualitative account
- real switching behaviors are not so nice, regular and simple: cycles of different durations, temporal asymmetries
- rivalry: competition likely takes place in a hierarchical network rather than in just one stage
- the spatial dimension was ignored