
1
NOISE and DELAYS in NEUROPHYSICS Andre Longtin Center for Neural Dynamics and Computation Department of Physics Department of Cellular and Molecular Medicine UNIVERSITY OF OTTAWA, Canada

2
OUTLINE
Modeling single-neuron noise: leaky integrate-and-fire; quadratic integrate-and-fire; "transfer function" approach
Modeling response to signals
Information theory
Delayed dynamics

3
MOTIVATION for STUDYING NOISE

4
"Noise" in the neuroscience literature:
- As an input with many frequency components over a particular band, of similar amplitudes and scattered phases
- As the resulting current from the integration of many independent excitatory and inhibitory synaptic events at the soma
- As the maintained discharge of some neurons
- As "cross-talk" responses from indirectly stimulated neurons
- As "internal" noise, resulting from the probabilistic gating of voltage-dependent ion channels
- As "synaptic" noise, resulting from the stochastic nature of vesicle release at the synaptic cleft
Segundo et al., Origins and Self Organization, 1994

5
Leaky Integrate-and-fire with + and - Feedback f = firing rate function
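A minimal noisy leaky integrate-and-fire simulation illustrates the model class on this slide (the feedback terms are omitted here, and the threshold, reset, time constant, and noise parameters are illustrative assumptions, not values from the talk):

```python
import numpy as np

def lif_spike_times(mu, sigma, tau=0.01, v_th=1.0, v_reset=0.0,
                    dt=1e-4, t_max=5.0, seed=0):
    """Euler-Maruyama simulation of a noisy LIF neuron:
    tau dV = (mu - V) dt + sigma sqrt(tau) dW, with spike-and-reset at v_th."""
    rng = np.random.default_rng(seed)
    v, t, spikes = v_reset, 0.0, []
    for _ in range(int(t_max / dt)):
        v += (mu - v) * dt / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        t += dt
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return np.array(spikes)

# suprathreshold drive (mu > v_th) fires periodically even without noise;
# the deterministic ISI is tau * ln(mu / (mu - v_th)) ~ 11 ms here
rate = lif_spike_times(mu=1.5, sigma=0.0).size / 5.0
```

The firing-rate function f of the slide is then just the map from input parameters (mu, sigma) to the output rate.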

6
Firing Rate Functions: noise-free, or stochastic.

8
Noise-Induced Stochastic Resonance and Gain Control

9
For Poisson input (Campbell's theorem):
mean conductance ∝ mean input rate
standard deviation σ ∝ √(mean rate)
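Campbell's theorem can be checked numerically by filtering a Poisson train with an exponential synaptic kernel; the kernel amplitude, time constant, and rates below are illustrative assumptions. For this kernel the theorem gives mean = rate·a·τ and variance = rate·a²·τ/2, so the mean scales with the rate and the standard deviation with its square root:

```python
import numpy as np
from scipy.signal import lfilter

def shot_noise_stats(rate, a=1.0, tau=0.005, dt=1e-4, t_max=200.0, seed=1):
    """Filter a Poisson spike train with an exponential kernel (amplitude a,
    time constant tau) and return the empirical mean and variance."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(rate * dt, size=int(t_max / dt))  # events per time bin
    decay = np.exp(-dt / tau)
    # recursive exponential filter: g_i = decay * g_{i-1} + a * counts_i
    g = lfilter([a], [1.0, -decay], counts.astype(float))
    return g.mean(), g.var()

# quadrupling the rate quadruples the mean and the variance (std grows by 2)
m100, v100 = shot_noise_stats(100.0)
m400, v400 = shot_noise_stats(400.0)
```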

10
NOISE smoothes out f-I curves
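The smoothing is easy to see in simulation: without noise the LIF f-I curve is exactly zero below threshold and jumps above it, while noise produces a graded, nonzero rate on both sides. A sketch with assumed parameters (threshold 1, reset 0, membrane time constant 10 ms):

```python
import numpy as np

def lif_rate(mu, sigma, tau=0.01, dt=1e-4, t_max=20.0, seed=2):
    """Firing rate of a noisy LIF neuron (threshold 1, reset 0), Euler-Maruyama."""
    rng = np.random.default_rng(seed)
    v, n_spikes = 0.0, 0
    for _ in range(int(t_max / dt)):
        v += (mu - v) * dt / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= 1.0:
            n_spikes += 1
            v = 0.0
    return n_spikes / t_max

# subthreshold drive (mu = 0.9 < 1): silent without noise, active with noise
r_det = lif_rate(0.9, 0.0)
r_noisy = lif_rate(0.9, 0.3)
```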

11
WHICH QUADRATIC INTEGRATE-AND-FIRE MODEL?
Technically more difficult. Which variable to use? On the real line? On a circle?

12
Information-theoretic approaches
Linear encoding versus nonlinear processing
Rate code: long time constant, integrator
Time code: small time constant, coincidence detector (reliability)
Interspike interval code (ISI reconstruction)
Linear correlation coefficient; coherence; coding fraction; mutual information
Challenge: the biophysics of coding. Forget the biophysics? Use better (mesoscopic?) variables?

13
Neuroscience 101 (continued)
Number of spikes in time interval T: N(T)
Spike train: x(t) = Σ_i δ(t − t_i)
Raster plot: spike times across repeated trials
Random variables: interspike intervals (ISIs), I_k = t_{k+1} − t_k
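These definitions translate directly into code; the spike times below are made-up illustrative values:

```python
import numpy as np

# hypothetical recorded spike times, in seconds
spike_times = np.array([0.012, 0.045, 0.081, 0.130, 0.160, 0.214])

T = 0.25
n_T = int(np.sum(spike_times < T))   # spike count N(T) in the window [0, T)
isis = np.diff(spike_times)          # interspike intervals I_k = t_{k+1} - t_k

# a binned raster row: spike counts in 50 ms bins across the window
raster, _ = np.histogram(spike_times, bins=np.arange(0.0, T + 0.05, 0.05))
```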

14
Information-Theoretic Calculations
Gaussian noise stimulus S → neuron ("???") → spike train X
Coherence function: C(f) = |S_SX(f)|² / [S_SS(f) S_XX(f)]
Mutual information rate (lower bound): I ≥ −∫ log₂[1 − C(f)] df
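A numerical sketch of this calculation, replacing the neuron with a toy linear channel (the gain 0.8 and output-noise level 0.6 are assumptions, not values from the talk) so the coherence and the Gaussian-channel lower bound can be estimated:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
fs, n = 1000.0, 2**17
s = rng.standard_normal(n)                  # Gaussian noise stimulus S
x = 0.8 * s + 0.6 * rng.standard_normal(n)  # toy "neuron": gain plus output noise

# coherence C(f) = |S_sx|^2 / (S_ss S_xx), estimated by Welch's method
f, C = coherence(s, x, fs=fs, nperseg=1024)

# lower bound on the mutual information rate (bits/s):
# MI >= -integral of log2(1 - C(f)) df
df = f[1] - f[0]
mi_rate = -np.sum(np.log2(1.0 - C)) * df
```

For this flat channel the true coherence is 0.64 at every frequency, so the bound evaluates to roughly −500·log₂(0.36) ≈ 740 bits/s.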

15
Stimulus Protocol: study the effect of stimulus contrast and f_c (stimulus bandwidth) on coding.

16
Information Theory:

17
Linear response calculation for the Fourier transform of the spike train:
x̃(f) = x̃₀(f) + χ(f) s̃(f), where χ is the susceptibility and x̃₀ the unperturbed spike train.
Spike-train spectrum = background spectrum + |transfer function|² × signal spectrum

18
CV = interval standard deviation / interval mean
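The coefficient of variation separates Poisson-like from pacemaker-like firing; a quick sketch with synthetic ISIs (the interval distributions and their parameters are illustrative assumptions):

```python
import numpy as np

def cv(isis):
    """Coefficient of variation: ISI standard deviation / ISI mean."""
    return np.std(isis) / np.mean(isis)

rng = np.random.default_rng(4)
poisson_isis = rng.exponential(0.02, size=100_000)    # Poisson process: CV = 1
regular_isis = rng.normal(0.02, 0.002, size=100_000)  # jittered pacemaker: CV = 0.1
```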

19
Wiener-Khinchin theorem: the power spectrum S(f) and the autocorrelation C(τ) are a Fourier-transform pair.
Integral of S over all frequencies = C(0) = signal variance.
Integral of C over all time lags = S(0) = signal intensity.
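The first identity (integral of the spectrum equals the variance) is just Parseval's theorem and can be verified numerically with a periodogram; the signal here is plain white noise:

```python
import numpy as np

rng = np.random.default_rng(5)
n, dt = 2**16, 1e-3
x = rng.standard_normal(n)                 # zero-mean test signal

# two-sided power spectral density via the periodogram
X = np.fft.fft(x)
S = (np.abs(X) ** 2) * dt / n              # units of x^2 per Hz
freqs = np.fft.fftfreq(n, dt)
df = 1.0 / (n * dt)

# integral of S over all frequencies recovers C(0), the signal variance
var_from_spectrum = np.sum(S) * df
```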

20
Signal Detection Theory: ROC curve:
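An ROC curve for spike-count detection can be sketched by sweeping a count threshold over two response distributions; the Poisson means for the "stimulus absent" and "stimulus present" conditions are assumed values:

```python
import numpy as np

rng = np.random.default_rng(6)
counts_noise = rng.poisson(5.0, size=2000)    # spike counts, stimulus absent
counts_signal = rng.poisson(8.0, size=2000)   # spike counts, stimulus present

# sweep a decision threshold theta: report "signal present" when count >= theta
thetas = np.arange(0, 25)
p_fa = [(counts_noise >= th).mean() for th in thetas]    # false-alarm rate
p_hit = [(counts_signal >= th).mean() for th in thetas]  # hit rate (ROC curve)

# area under the ROC curve = P(signal count > noise count), ties counted as 1/2
gt = (counts_signal[:, None] > counts_noise[None, :]).mean()
eq = (counts_signal[:, None] == counts_noise[None, :]).mean()
auc = gt + 0.5 * eq
```

An AUC of 0.5 means the counts carry no information about the stimulus; 1.0 means perfect detection.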

21
Information Theory
Actual signal versus reconstructed signal. The stimulus (an electric field) can be well characterized, which allows for detailed signal-processing analysis.
Gabbiani et al., Nature (1996) 384:564-567. Bastian et al., J. Neurosci. (2002) 22:4577-4590. Krahe et al., J. Neurosci. (2002) 22:2374-2382.

22
Linear Stimulus Reconstruction
Estimate the filter which, when convolved with the spike train (zero mean), yields an estimated stimulus "closest" to the real stimulus in the mean-square-error sense: the optimal Wiener filter.
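The classic solution of this minimization is H(f) = S_sx(f)/S_xx(f). For white (flat-spectrum) signals this reduces to a single scalar gain, which gives a compact sketch; the linear "spike train" surrogate and its gain below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
s = rng.standard_normal(n)            # stimulus (zero mean, unit variance)
x = 1.5 * s + rng.standard_normal(n)  # zero-mean surrogate for the spike train

# optimal Wiener gain h = <s x> / <x x>; in the frequency domain this is
# H(f) = S_sx(f) / S_xx(f) evaluated at every f (flat here)
h = np.mean(s * x) / np.mean(x * x)
s_est = h * x

mse = np.mean((s - s_est) ** 2)
coding_fraction = 1.0 - mse / np.var(s)   # fraction of stimulus variance recovered
```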

23
NOISE smoothes out f-I curves

24
“stochastic resonance above threshold” Coding fraction versus noise intensity:

25
Modeling Electroreceptors: The Nelson Model (1996)
High-pass filter input → stochastic spike generator.
The spike generator assigns 0 or 1 spike per EOD cycle: multimodal histograms.

26
Modeling Electroreceptors: The Extended LIFDT Model
High-pass filter input → LIFDT → spike train.
Parameters: without noise, the receptor fires periodically (suprathreshold dynamics; no stochastic resonance).

27
Signal Detection: count spikes during an interval T (here T = 255 ms).

28
Fano Factor: Asymptotic Limit (Cox and Lewis, 1966). Regularisation:
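The Fano factor F(T) = Var[N_T]/E[N_T] can be estimated from disjoint counting windows. A sketch comparing a Poisson train (F = 1) with a more regular gamma renewal train, whose asymptotic limit is CV² = 1/4 (the rates and the gamma order are assumed values):

```python
import numpy as np

def fano_factor(spike_times, T, t_max):
    """Fano factor F(T) = Var[N_T] / E[N_T] over disjoint windows of length T."""
    edges = np.arange(0.0, t_max, T)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.var() / counts.mean()

rng = np.random.default_rng(8)
# Poisson train at ~50 spikes/s over 2000 s: F(T) = 1 for every T
poisson_train = np.sort(rng.uniform(0.0, 2000.0, size=100_000))
# order-4 gamma renewal train, same mean rate: F(T) -> CV^2 = 1/4 at long T
regular_train = np.cumsum(rng.gamma(4.0, 0.005, size=110_000))

F_poisson = fano_factor(poisson_train, 1.0, 1990.0)
F_regular = fano_factor(regular_train, 1.0, 1990.0)
```

The drop of F below 1 at long counting times is the "regularisation" named on the slide.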

29
Sensory Neurons
Sensory input → ELL pyramidal cell → higher brain.

30
Feedback: open- versus closed-loop architecture through the higher brain; loop time d.

31
Delayed Feedback Neural Networks
Afferent input → the ELL (first stage of sensory processing) → higher brain areas.

34
Jelte Bos’ data Andre’s data Longtin et al., Phys. Rev. A 41, 6992 (1990)

36
If one defines the quantities corresponding to the stochastic differential equation, one gets a Fokker-Planck equation:

37
One can apply Itô or Stratonovich calculus, as for SDEs. However, applicability is limited if there are complex eigenvalues or the system is strongly nonlinear.

39
TWO-STATE DESCRIPTION: S = ±1
Two transition probabilities; for example, using Kramers' approach:
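The two-state reduction can be sketched as a telegraph process with fixed escape rates; the numerical rates below are assumed stand-ins for whatever Kramers' formula would give for a particular potential:

```python
import numpy as np

rng = np.random.default_rng(9)
W_plus, W_minus = 2.0, 8.0   # escape rates out of S=+1 and S=-1 (assumed values)
dt, n = 1e-3, 1_000_000

s = np.empty(n, dtype=int)
state = 1
for i in range(n):
    s[i] = state
    rate = W_plus if state == 1 else W_minus
    if rng.random() < rate * dt:   # switch with probability rate*dt per step
        state = -state

# the stationary occupancy of S=+1 is W_minus / (W_plus + W_minus)
p_plus = np.mean(s == 1)
```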

40
DETERMINISTIC DELAYED BISTABILITY
The stochastic approach does not yet get the whole picture!

41
Conclusions
NOISE: many sources, many approaches; exercise caution (Itô vs. Stratonovich).
INFORMATION THEORY: usually makes assumptions, and even when it doesn't, ask whether the next cell cares.
DELAYS: SDDEs have no Fokker-Planck equivalent.
Tomorrow: a linear-response-like theory.

42
OUTLOOK
Second-order field theory for stochastic neural dynamics with delays
Figuring out how intrinsic neuron dynamics (bursting, coincidence detection, etc.) interact with correlated input
Figuring out the interaction of noise and bursting
Forget about steady state!
Whatever you do, think of the neural decoder…
