NOISE and DELAYS in NEUROPHYSICS
Andre Longtin, Center for Neural Dynamics and Computation, Department of Physics and Department of Cellular and Molecular Medicine, University of Ottawa, Canada

OUTLINE
- Modeling single-neuron noise
  - leaky integrate-and-fire
  - quadratic integrate-and-fire
  - "transfer function" approach
- Modeling response to signals
- Information theory
- Delayed dynamics

MOTIVATION for STUDYING NOISE

"Noise" in the neuroscience literature
- As an input with many frequency components over a particular band, of similar amplitudes and scattered phases
- As the current resulting from the integration of many independent excitatory and inhibitory synaptic events at the soma
- As the maintained discharge of some neurons
- As « cross-talk » responses from indirectly stimulated neurons
- As « internal » noise, resulting from the probabilistic gating of voltage-dependent ion channels
- As « synaptic » noise, resulting from the stochastic nature of vesicle release at the synaptic cleft
Segundo et al., Origins and Self Organization, 1994

Leaky Integrate-and-Fire with Positive and Negative Feedback (f = firing rate function)

Firing Rate Functions: noise-free, or stochastic

Noise-Induced Gain Control; Stochastic Resonance

For Poisson input (Campbell's theorem):
- mean conductance ~ mean input rate
- standard deviation σ ~ sqrt(mean rate)
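Campbell's theorem is easy to check numerically. The sketch below drives an exponential synaptic kernel with a Poisson train; the kernel amplitude, time constant, and the two rates are illustrative, not from the talk. Quadrupling the input rate should quadruple the mean conductance but only double its standard deviation:

```python
import numpy as np

def shot_noise_stats(rate, a=1.0, tau_s=0.005, dt=1e-4, t_max=50.0, seed=1):
    """Exponential shot noise driven by a Poisson train: dg = -g/tau_s dt + a dN.
    Campbell's theorem: mean g = rate*a*tau_s, var g = rate*a^2*tau_s/2."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    counts = rng.poisson(rate * dt, size=n)   # Poisson event counts per bin
    g = np.empty(n)
    x = 0.0
    for i in range(n):
        x += -x * dt / tau_s + a * counts[i]
        g[i] = x
    burn = int(0.1 / dt)                      # discard the initial transient
    return g[burn:].mean(), g[burn:].std()

m1, s1 = shot_noise_stats(rate=500.0)
m2, s2 = shot_noise_stats(rate=2000.0)       # 4x the input rate
# Expect m2/m1 ~ 4 (mean ~ rate) and s2/s1 ~ 2 (std ~ sqrt(rate)).
```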

NOISE smoothes out f-I curves
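The smoothing of the f-I curve can be demonstrated with the same LIF model: below threshold the deterministic neuron is silent, while noise produces a nonzero rate, rounding off the hard corner of the noise-free curve. Parameter values here are illustrative:

```python
import numpy as np

def lif_rate(mu, sigma, tau=0.02, v_th=1.0, dt=1e-4, t_max=20.0, seed=0):
    """Mean firing rate of an LIF neuron, tau dV = (mu - V) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(n)
    v, n_spikes = 0.0, 0
    for i in range(n):
        v += (mu - v) * dt / tau + noise[i]
        if v >= v_th:
            n_spikes += 1
            v = 0.0
    return n_spikes / t_max

r_det = lif_rate(mu=0.9, sigma=0.0)    # subthreshold: silent without noise
r_noise = lif_rate(mu=0.9, sigma=0.3)  # noise produces a finite rate
```

Sweeping mu with and without noise traces out the sharp versus smoothed f-I curves.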

WHICH QUADRATIC INTEGRATE-AND-FIRE MODEL?
- Technically more difficult
- Which variable to use?
- On the real line?
- On a circle?
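One way to see the "real line versus circle" choice: the QIF voltage dV/dt = V² + I blows up in finite time on the real line, but the change of variables V = tan(θ/2) maps it onto a circle (the theta neuron), where the dynamics are globally well behaved. A minimal sketch, with illustrative currents:

```python
import math

def theta_neuron_rate(I, dt=1e-4, t_max=50.0):
    """Theta-neuron (circle) form of the quadratic integrate-and-fire model:
        dtheta/dt = (1 - cos theta) + (1 + cos theta) * I
    obtained from dV/dt = V^2 + I via V = tan(theta/2).
    A spike is a crossing of theta = pi (V -> +infinity)."""
    theta, n_spikes = -math.pi, 0
    for _ in range(int(t_max / dt)):
        theta += dt * ((1 - math.cos(theta)) + (1 + math.cos(theta)) * I)
        if theta >= math.pi:
            theta -= 2 * math.pi        # wrap around the circle (reset)
            n_spikes += 1
    return n_spikes / t_max

# Analytic QIF rate (reset at -inf, spike at +inf) is sqrt(I)/pi.
r1 = theta_neuron_rate(I=1.0)
r4 = theta_neuron_rate(I=4.0)
```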

Information-theoretic approaches
- Linear encoding versus nonlinear processing
- Rate code: long time constant, integrator
- Time code: small time constant, coincidence detector (reliability)
- Interspike-interval code (ISI reconstruction)
- Linear correlation coefficient
- Coherence
- Coding fraction
- Mutual information
- Challenge: biophysics of coding
- Forget the biophysics? Use better (mesoscopic?) variables?

Neuroscience 101 (continued): random variables
- Spike train: x(t) = Σ_i δ(t − t_i)
- Number of spikes in time interval T: N(T)
- Interspike intervals (ISI): I_k = t_{k+1} − t_k
- Raster plot: spike times across repeated trials

Information-Theoretic Calculations
Gaussian noise stimulus S → neuron (???) → spike train X
- Coherence function: C(f) = |S_SX(f)|² / [S_SS(f) S_XX(f)]
- Mutual information rate (lower bound): I_LB = −∫ df log₂[1 − C(f)]
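A sketch of the coherence-based lower bound on a toy linear channel (signal plus additive Gaussian noise standing in for the neuron), where the answer is known analytically. The segment-averaged estimator and the SNR = 1 setting are illustrative assumptions:

```python
import numpy as np

def coherence(s, x, nseg=64):
    """Magnitude-squared coherence C(f) = |<S X*>|^2 / (<|S|^2> <|X|^2>),
    estimated by averaging periodograms over non-overlapping segments."""
    L = len(s) // nseg
    S = np.fft.rfft(s[:nseg * L].reshape(nseg, L), axis=1)
    X = np.fft.rfft(x[:nseg * L].reshape(nseg, L), axis=1)
    Psx = (S * X.conj()).mean(axis=0)
    Pss = (np.abs(S) ** 2).mean(axis=0)
    Pxx = (np.abs(X) ** 2).mean(axis=0)
    return np.abs(Psx) ** 2 / (Pss * Pxx)

rng = np.random.default_rng(2)
n = 2 ** 16
s = rng.standard_normal(n)            # white Gaussian "stimulus"
x = s + rng.standard_normal(n)        # linear response + output noise, SNR = 1
C = coherence(s, x)                   # theory: C(f) = 0.5 at every frequency

# Lower bound on the mutual information, in bits per sample:
# I_LB = -(1/2) * mean over f of log2(1 - C(f));
# for SNR = 1 this is (1/2) log2(2) = 0.5 bit per sample.
info_lb = -0.5 * np.mean(np.log2(1 - C))
```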

Stimulus Protocol: study the effect of σ (stimulus contrast) and f_c (stimulus bandwidth) on coding.

Information Theory:

Linear Response Calculation
Fourier transform of the spike train: x̃(f) = x̃₀(f) + χ(f) s̃(f), where χ(f) is the susceptibility and x̃₀(f) the unperturbed spike train.
Spike train spectrum = background spectrum + (transfer function × signal spectrum):
S_XX(f) = S₀(f) + |χ(f)|² S_SS(f)

CV = interval standard deviation / interval mean

Wiener-Khinchine theorem: the power spectrum S(f) and the autocorrelation C(τ) are a Fourier-transform pair.
- Integral of S over all frequencies = C(0) = signal variance
- Integral of C over all time lags = S(0) = signal intensity

Signal Detection Theory: ROC curve:
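A minimal ROC construction for the textbook equal-variance Gaussian case (noise N(0,1), signal-plus-noise N(d',1)); the grid of thresholds and d' = 1 are illustrative:

```python
import numpy as np
from math import erf, sqrt

def roc_points(d_prime, thresholds):
    """ROC for detecting a signal that shifts a unit-variance Gaussian by d':
    each threshold t gives the pair (P_FA = P(N(0,1) > t), P_D = P(N(d',1) > t))."""
    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF
    t = np.asarray(thresholds)
    p_fa = 1.0 - np.vectorize(phi)(t)
    p_d = 1.0 - np.vectorize(phi)(t - d_prime)
    return p_fa, p_d

t = np.linspace(-6.0, 8.0, 4001)
p_fa, p_d = roc_points(d_prime=1.0, thresholds=t)
# Area under the ROC curve (trapezoid rule); analytic value is
# Phi(d'/sqrt(2)) ~ 0.760 for d' = 1.
auc = float((0.5 * (p_d[1:] + p_d[:-1]) * -np.diff(p_fa)).sum())
```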

Information Theory
Actual signal versus reconstructed signal. The stimulus (an electric field) can be well characterized, which allows detailed signal-processing analysis.
Gabbiani et al., Nature (1996) 384
Bastian et al., J. Neurosci. (2002) 22
Krahe et al., J. Neurosci. (2002) 22

Linear Stimulus Reconstruction
Estimate the filter which, when convolved with the spike train, yields an estimated stimulus "closest" to the real stimulus.
- Spike train (zero mean): x(t)
- Estimated stimulus: s_est(t) = (h ∗ x)(t)
- Mean-square error: ε² = ⟨[s(t) − s_est(t)]²⟩
- Minimizing ε² yields the optimal Wiener filter
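A sketch of the reconstruction on a toy linear channel, where a white "spike train" is just signal plus noise. The non-causal Wiener filter H(f) = ⟨S X*⟩ / ⟨|X|²⟩ is estimated from data by segment averaging; all settings are illustrative assumptions:

```python
import numpy as np

def wiener_reconstruct(s, x, nseg=64):
    """Estimate the optimal non-causal Wiener filter from data,
    H(f) = <S(f) X*(f)> / <|X(f)|^2>, and apply it to x to obtain
    the reconstruction that minimizes the mean-square error."""
    L = len(s) // nseg
    S = np.fft.rfft(s[:nseg * L].reshape(nseg, L), axis=1)
    X = np.fft.rfft(x[:nseg * L].reshape(nseg, L), axis=1)
    H = (S * X.conj()).mean(axis=0) / (np.abs(X) ** 2).mean(axis=0)
    return np.fft.irfft(H * X, n=L, axis=1).ravel()

rng = np.random.default_rng(4)
n = 2 ** 16
s = rng.standard_normal(n)          # stimulus
x = s + rng.standard_normal(n)      # toy linear "spike train": signal + noise
s_est = wiener_reconstruct(s, x)
mse = np.mean((s[:len(s_est)] - s_est) ** 2)       # theory: 0.5 for SNR = 1
coding_fraction = 1.0 - np.sqrt(mse / np.var(s))   # theory: 1 - sqrt(1/2) ~ 0.29
```

The coding fraction (1 minus the rms error over the rms stimulus) is the quantity plotted against noise intensity later in the talk.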

NOISE smoothes out f-I curves

"Stochastic resonance above threshold": coding fraction versus noise intensity.

Modeling Electroreceptors: The Nelson Model (1996)
Input → high-pass filter → stochastic spike generator.
The spike generator assigns 0 or 1 spike per EOD cycle: multimodal ISI histograms.

Modeling Electroreceptors: The Extended LIFDT Model
Input → high-pass filter → LIFDT → spike train.
Parameters: without noise, the receptor fires periodically (suprathreshold dynamics, so no stochastic resonance).

Signal Detection: count spikes during an interval T (T = 255 ms)

Fano Factor: F(T) = Var[N(T)] / ⟨N(T)⟩
Asymptotic limit (Cox and Lewis, 1966): for a renewal process, F(T) → CV² as T → ∞.
Regularisation: F below the Poisson value of 1 indicates a spike train more regular than Poisson.
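A numerical illustration of the asymptotic limit: a gamma-order-2 renewal process has CV² = 1/2, so its large-T Fano factor should sit near 0.5, while a Poisson control stays at 1. The rates and window sizes are illustrative:

```python
import numpy as np

def fano_factor(spike_times, T):
    """Fano factor F(T) = Var[N(T)] / <N(T)> of spike counts in windows of length T."""
    edges = np.arange(0.0, spike_times[-1], T)
    counts = np.histogram(spike_times, bins=edges)[0]
    return counts.var() / counts.mean()

rng = np.random.default_rng(5)
# Gamma-order-2 renewal process: each ISI is the sum of two exponentials,
# so CV^2 = 1/2, and F(T) -> CV^2 = 0.5 for large T (Cox and Lewis, 1966).
isi = rng.exponential(0.5, size=(200_000, 2)).sum(axis=1)   # mean ISI = 1
spikes = np.cumsum(isi)
F_gamma = fano_factor(spikes, T=100.0)

# Poisson control: F(T) = 1 at all T.
spikes_poisson = np.cumsum(rng.exponential(1.0, size=200_000))
F_poisson = fano_factor(spikes_poisson, T=100.0)
```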

Sensory Neurons: sensory input → ELL pyramidal cell → higher brain

Feedback: Open- vs Closed-Loop Architecture (higher brain in the loop), with loop delay τ_d

Delayed Feedback Neural Networks: afferent input ⇄ higher brain areas. The ELL is the first stage of sensory processing.

Jelte Bos' data; Andre's data. Longtin et al., Phys. Rev. A 41, 6992 (1990).

If one defines the stochastic differential equation
dx = f(x) dt + g(x) dW(t),
one gets the corresponding Fokker-Planck equation for the probability density P(x, t):
∂P/∂t = −∂/∂x [f(x) P] + (1/2) ∂²/∂x² [g²(x) P]
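A direct check of the SDE/Fokker-Planck correspondence on the simplest case, the Ornstein-Uhlenbeck process, whose stationary Fokker-Planck solution is a zero-mean Gaussian of variance D. Parameter values are illustrative:

```python
import numpy as np

# Euler-Maruyama simulation of the Ornstein-Uhlenbeck SDE
#     dx = -x dt + sqrt(2 D) dW,
# whose Fokker-Planck equation
#     dP/dt = d/dx (x P) + D d^2P/dx^2
# has the stationary solution P(x) = Gaussian(mean 0, variance D).
rng = np.random.default_rng(6)
D, dt, n = 0.5, 1e-3, 1_000_000
kicks = np.sqrt(2.0 * D * dt) * rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] * (1.0 - dt) + kicks[i]

sample = x[10_000:]                            # discard the initial transient
mean_x, var_x = sample.mean(), sample.var()    # theory: 0 and D = 0.5
```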

One can apply Itô or Stratonovich calculus, as for SDEs. However, applicability is limited when the eigenvalues are complex or the system is strongly nonlinear.

TWO-STATE DESCRIPTION: S = ±1, with two transition probabilities, W(+ → −) and W(− → +).
For example, using the Kramers approach, the escape rate over a barrier of height ΔU at noise intensity D scales as r ∝ exp(−ΔU/D).

DETERMINISTIC DELAYED BISTABILITY
The stochastic approach does not yet capture the whole picture!

Conclusions
- NOISE: many sources, many approaches; exercise caution (Itô vs Stratonovich).
- INFORMATION THEORY: usually makes assumptions, and even when it doesn't, ask whether the next cell cares.
- DELAYS: SDDEs have no Fokker-Planck equivalent.
Tomorrow: a linear-response-like theory.

OUTLOOK
- Second-order field theory for stochastic neural dynamics with delays
- Figuring out how intrinsic neuron dynamics (bursting, coincidence detection, etc.) interact with correlated input
- Figuring out the interaction of noise and bursting
- Forget about steady state!
- Whatever you do, think of the neural decoder…