Biological Modeling of Neural Networks, Week 11 – Variability and Noise: Autocorrelation. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
11.1 Variation of membrane potential - white noise approximation
11.2 Autocorrelation of Poisson
11.3 Noisy integrate-and-fire - superthreshold and subthreshold
11.4 Escape noise - stochastic intensity
11.5 Renewal models
Reading for week 11: NEURONAL DYNAMICS, Ch. …, Ch. …, Ch. 9.1, Cambridge Univ. Press.

11.1 Review from week 10. Spontaneous activity in vivo: awake mouse, cortex, freely whisking (Crochet et al., 2011). Variability - of membrane potential? - of spike timing?

11.1 Review from week 10. Fluctuations - of membrane potential - of spike times. Fluctuations = noise? Model of fluctuations? Relevance for coding? Source of fluctuations? In vivo data → looks 'noisy'. In vitro data → fluctuations.

Review from week 10: sources of variability.
- Intrinsic noise (ion channels, Na+, K+): finite number of channels, finite temperature.
- Network noise (background activity): spike arrival from other neurons, beyond control of the experimentalist.

Review from week 10:
- Intrinsic noise (ion channels, Na+, K+) → small contribution!
- Network noise → big contribution!
In vivo data → looks 'noisy'. In vitro data → small fluctuations → nearly deterministic.

11.1 Review from week 10: Calculating the mean. Mean of the input: assume Poisson spike arrival with the rate of an inhomogeneous Poisson process (used for the exercises and on the next slides).
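As a rough numerical illustration of this slide (a sketch only: the exponential EPSC kernel and all parameter values are illustrative assumptions, not values from the course), the mean of a synaptic current driven by Poisson spike arrival equals weight × rate × integral of the EPSC kernel:

```python
import numpy as np

# Sketch, not course material: mean synaptic input current from homogeneous Poisson
# spike arrival, compared with the analytic value <I> = w * nu * integral(alpha).
# The exponential EPSC kernel alpha(s) = exp(-s/tau_s)/tau_s (unit area) and all
# parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
dt, T = 1e-4, 20.0                     # time step [s], simulated duration [s]
nu, w, tau_s = 500.0, 0.1, 5e-3        # total arrival rate [Hz], synaptic weight, time constant [s]

n_steps = int(T / dt)
spikes = (rng.random(n_steps) < nu * dt).astype(float)            # Poisson arrival per bin
alpha = np.exp(-np.arange(0.0, 10 * tau_s, dt) / tau_s) / tau_s   # EPSC kernel samples

I = w * np.convolve(spikes, alpha)[:n_steps]    # each spike adds one EPSC of shape w*alpha

print("empirical mean <I>:", I[n_steps // 2:].mean())   # skip the initial transient
print("analytic  mean <I>:", w * nu)                    # w * nu * integral(alpha) = w * nu
```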

11.1 Fluctuation of potential. For a passive membrane, predict the - mean - variance of membrane potential fluctuations. Passive membrane = leaky integrate-and-fire without threshold.
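In the usual notation (membrane time constant τ_m, resistance R, resting potential u_rest; generic symbols, not necessarily those used on the slide), a passive membrane, i.e. a leaky integrate-and-fire model without threshold, is described by

\tau_m \frac{du}{dt} = -\bigl(u(t) - u_{\mathrm{rest}}\bigr) + R\, I(t),

and the task is to predict the mean and variance of u(t) for a fluctuating input current I(t).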

11.1 Fluctuation of current/potential. Fluctuating input current I(t): synaptic current pulses (EPSCs) of shape α. Passive membrane. Blackboard, math detour: white noise.

11.1 Calculating autocorrelations. Fluctuating input current I(t): mean and autocorrelation. Blackboard, math detour.

White noise: exercise now. Expected voltage at time t and variance of voltage at time t; the input starts here. Assumption: far away from threshold. Report the variance as a function of time! Next lecture: 10:15.
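A minimal simulation sketch of this exercise (the noise strength a, the initial condition u(0) = 0, and all parameter values are illustrative assumptions): for the passive membrane driven by white noise with ⟨I(t)I(t')⟩ = a δ(t - t'), the empirical variance can be compared with the closed form Var[u(t)] = (R² a / 2τ)(1 - e^{-2t/τ}).

```python
import numpy as np

# Sketch of the exercise, with illustrative parameter assumptions: passive membrane
# tau * du/dt = -u + R*I(t), input switched on at t = 0, u(0) = 0, and white-noise
# input with <I(t)I(t')> = a * delta(t - t').  Euler-Maruyama over many trials; the
# empirical variance is compared with Var[u(t)] = (R^2 a / (2 tau)) * (1 - exp(-2t/tau)).
rng = np.random.default_rng(1)
tau, R, a = 10e-3, 1.0, 1e-4                 # membrane time constant [s], resistance, noise strength
dt, T, n_trials = 1e-4, 50e-3, 2000

t = np.arange(0.0, T, dt)
u = np.zeros((n_trials, t.size))
for k in range(1, t.size):
    I = np.sqrt(a / dt) * rng.standard_normal(n_trials)      # discretized white noise
    u[:, k] = u[:, k - 1] + (dt / tau) * (-u[:, k - 1] + R * I)

var_sim = u.var(axis=0)
var_theory = (R**2 * a / (2 * tau)) * (1.0 - np.exp(-2 * t / tau))
print("variance at t = T  (simulation):", var_sim[-1])
print("variance at t = T  (analytic)  :", var_theory[-1])
```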

11.1 Calculating autocorrelations for stochastic spike arrival. White noise (to mimic stochastic spike arrival); math argument later. Image: Gerstner et al. (2014), Neuronal Dynamics.

11.1 Calculating autocorrelations. Rate of the inhomogeneous Poisson process: mean and autocorrelation. Blackboard, math detour.

11.1 Mean and autocorrelation of a filtered spike signal. Assumption: stochastic spiking at a Poisson rate (Poisson spike arrival). Mean of the output; autocorrelation of the output from the autocorrelation of the input and the filter.
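A numerical sketch of this relation (illustrative parameters, exponential filter kernel assumed): for a homogeneous Poisson input of rate ν filtered by a kernel f, the autocovariance of the output should equal ν ∫ f(s) f(s + Δ) ds, which is what one obtains by inserting the Poisson input autocorrelation and integrating over the filter twice.

```python
import numpy as np

# Sketch with illustrative assumptions (exponential filter, homogeneous Poisson input):
# the autocovariance of the filtered spike train is compared with the prediction
#   Cov(Delta) = nu * integral f(s) f(s + Delta) ds,
# obtained by inserting the Poisson autocorrelation and integrating over the filter twice.
rng = np.random.default_rng(2)
dt, T, nu, tau_s = 1e-4, 50.0, 100.0, 5e-3
n_steps = int(T / dt)

spikes = (rng.random(n_steps) < nu * dt).astype(float) / dt     # spike train as delta pulses
f = np.exp(-np.arange(0.0, 10 * tau_s, dt) / tau_s) / tau_s     # filter kernel, unit area
x = (np.convolve(spikes, f)[:n_steps] * dt)[int(0.5 / dt):]     # filtered signal, transient removed

lags = np.arange(0, int(20e-3 / dt))                            # lags up to 20 ms
xc = x - x.mean()
cov_sim = np.array([np.mean(xc[: xc.size - L] * xc[L:]) for L in lags])
cov_theory = np.array([nu * np.sum(f[: f.size - L] * f[L:]) * dt for L in lags])

print("covariance at lag 0    (sim, theory):", cov_sim[0], cov_theory[0])
print("covariance at lag 5 ms (sim, theory):", cov_sim[int(5e-3 / dt)], cov_theory[int(5e-3 / dt)])
```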

Biological Modeling of Neural Networks, Week 11 – Variability and Noise: Autocorrelation. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
11.1 Variation of membrane potential - white noise approximation
11.2 Autocorrelation of Poisson
11.3 Noisy integrate-and-fire - superthreshold and subthreshold
11.4 Escape noise - stochastic intensity
11.5 Renewal models

11.2 Autocorrelation of Poisson (preparation). Stochastic spike arrival: justify the autocorrelation of the spike input. Poisson process in discrete time: in each small time step Δt, the probability of firing is ν Δt; firing is independent between one time step and the next. Blackboard.

Exercise 3 now: Poisson process in continuous time. Stochastic spike arrival: excitation at a total rate ν₀. In each small time step, the probability of firing is ν₀ Δt; firing is independent between one time step and the next. Show the autocorrelation of the spike train; show that in a long interval of duration T the expected number of spikes is ν₀ T. Next lecture: 10:40.
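A quick numerical check of these statements (rate, bin size, and window length are illustrative assumptions):

```python
import numpy as np

# Sketch of the Poisson-process statements of Exercise 3 (rate, bin size and window
# length are illustrative assumptions): firing with probability nu0*dt in each bin,
# independently across bins.  Check that the expected count in a window of duration T
# is nu0*T and that the counts are Poisson-like (Fano factor = variance/mean close to 1).
rng = np.random.default_rng(3)
nu0, dt, T, n_windows = 20.0, 1e-4, 1.0, 1000

counts = (rng.random((n_windows, int(T / dt))) < nu0 * dt).sum(axis=1)
print("mean count  :", counts.mean(), " (expected nu0*T =", nu0 * T, ")")
print("Fano factor :", counts.var() / counts.mean(), " (Poisson: close to 1)")
```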

Quiz 1. The autocorrelation of a Poisson spike train (continuous time) has units
[ ] probability (unit-free)
[ ] probability squared (unit-free)
[ ] rate (1 over time)
[ ] (1 over time) squared

11.2 Autocorrelation of a Poisson spike train: math detour now! Probability of a spike in one time step; probability of a spike in step n AND in step k; autocorrelation in continuous time.
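For reference, the standard results behind this detour, for a homogeneous Poisson spike train S(t) = \sum_f \delta(t - t^f) with rate ν (consistent with the quiz above, since the delta function carries units of one over time):

\Pr[\text{spike in step } n] = \nu\,\Delta t, \qquad
\Pr[\text{spike in step } n \text{ and step } k] =
\begin{cases} \nu\,\Delta t, & n = k,\\ (\nu\,\Delta t)^2, & n \neq k, \end{cases}

and in continuous time

\langle S(t) \rangle = \nu, \qquad \langle S(t)\, S(t') \rangle = \nu\,\delta(t - t') + \nu^2 .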

11.2 Autocorrelation of Poisson: units. Assumption: stochastic spiking at a Poisson rate. Autocorrelation of the output from the autocorrelation of the input (Poisson): we integrate twice, once over the filter for each of the two time arguments.

Exercise 2, homework: stochastic spike arrival. Excitation at a total rate ν₀; synaptic current pulses drive the membrane potential u.
1. Assume that for t > 0 spikes arrive stochastically with rate ν₀. Calculate the mean voltage.
2. Assume the autocorrelation … ; calculate …

Biological Modeling of Neural Networks, Week 11 – Variability and Noise: Autocorrelation. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
11.1 Variation of membrane potential - white noise approximation
11.2 Autocorrelation of Poisson
11.3 Noisy integrate-and-fire - superthreshold and subthreshold
11.4 Escape noise - stochastic intensity
11.5 Renewal models

11.3 Noisy integrate-and-fire. For a passive membrane, we can analytically predict the mean of the membrane potential fluctuations. Passive membrane = leaky integrate-and-fire without threshold. ADD THRESHOLD → leaky integrate-and-fire.

11.3 Noisy integrate-and-fire (LIF). Input current I plus an effective noise current; membrane potential u(t). Noisy input / diffusive noise / stochastic spike arrival.

11.3 Noisy integrate-and-fire. Random spike arrival → fluctuating input current I(t) → fluctuating potential u(t).

11.3 Noisy integrate-and-fire (noisy input). Stochastic spike arrival in the integrate-and-fire model, approximated by white noise: interspike intervals and ISI distribution. Image: Gerstner et al. (2014), Neuronal Dynamics.
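A minimal simulation sketch of such a noisy integrate-and-fire neuron (diffusive white-noise input; all parameters are illustrative assumptions). Changing the mean drive moves the model between the superthreshold and subthreshold regimes discussed on the next slides:

```python
import numpy as np

# Sketch with illustrative parameter assumptions: leaky integrate-and-fire neuron with
# white-noise ("diffusive") input mimicking stochastic spike arrival.  Returns the
# interspike intervals; mu above threshold gives the superthreshold (regular) regime,
# mu below threshold the fluctuation-driven subthreshold regime with a broad ISI distribution.
rng = np.random.default_rng(4)

def lif_isi(mu, sigma, tau=10e-3, theta=1.0, u_reset=0.0, dt=1e-4, T=20.0):
    n = int(T / dt)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n)
    u, t_last, isi = u_reset, 0.0, []
    for k in range(n):
        u += (dt / tau) * (mu - u) + noise[k]       # leaky integration of drive mu plus noise
        if u >= theta:                              # threshold crossing: spike and reset
            isi.append(k * dt - t_last)
            t_last, u = k * dt, u_reset
    return np.array(isi)

for mu, label in [(1.2, "superthreshold"), (0.9, "subthreshold ")]:
    isi = lif_isi(mu=mu, sigma=1.0)
    print(f"{label}: mean ISI = {1e3 * isi.mean():.1f} ms, CV = {isi.std() / isi.mean():.2f}")
```

The subthreshold case yields a markedly larger coefficient of variation, i.e. a broader, more in-vivo-like ISI distribution, while the superthreshold case fires much more regularly.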

11.3 Noisy integrate-and-fire (noisy input): superthreshold vs. subthreshold regime. Image: Gerstner et al. (2014), Neuronal Dynamics, Cambridge Univ. Press; see Konig et al. (1996).

11.3 Noisy integrate-and-fire (noisy input): noisy input / diffusive noise / stochastic spike arrival. Subthreshold regime: - firing driven by fluctuations - broad ISI distribution - in vivo-like ISI distribution. Image: Gerstner et al. (2014), Neuronal Dynamics.

Review: variability in vivo. Spontaneous activity in vivo: awake mouse, freely whisking (Crochet et al., 2011). Variability of membrane potential? → subthreshold regime. Image: Gerstner et al. (2014), Neuronal Dynamics, Cambridge Univ. Press; courtesy of Crochet et al. (2011).

11.3 Noisy integrate-and-fire (noisy input): summary. Stochastic spike arrival → fluctuating potential. For a passive membrane we can analytically predict the amplitude of the membrane potential fluctuations. A leaky integrate-and-fire neuron in the subthreshold regime can explain the variations of membrane potential and the ISI distribution.

Biological Modeling of Neural Networks, Week 11 – Variability and Noise: Autocorrelation. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
11.1 Variation of membrane potential - white noise approximation
11.2 Autocorrelation of Poisson
11.3 Noisy integrate-and-fire - superthreshold and subthreshold
11.4 Escape noise - stochastic intensity
11.5 Renewal models

Review: sources of variability.
- Intrinsic noise (ion channels, Na+, K+): finite number of channels, finite temperature → small contribution!
- Network noise (background activity): spike arrival from other neurons, beyond control of the experimentalist → big contribution!
Noise models?

11.4 Noise models: escape noise vs. input noise. Escape process with a stochastic intensity (escape rate of u(t)) vs. stochastic spike arrival (diffusive noise) with noisy integration. Relation between the two models: see Ch. 9.4 of Neuronal Dynamics. Now: escape noise!

11.4 Escape noise. Escape process: the membrane potential u(t) determines an escape rate. Example: leaky integrate-and-fire model.
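A commonly used example of such an escape rate, e.g. in Neuronal Dynamics, is the exponential form (τ₀, β, and the threshold ϑ below are generic parameters):

\rho(t) = f\bigl(u(t) - \vartheta\bigr) = \frac{1}{\tau_0}\, \exp\!\bigl[\beta\,\bigl(u(t) - \vartheta\bigr)\bigr].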

11.4 Stochastic intensity: examples. Escape rate = stochastic intensity of a point process.
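A minimal sketch of spike generation with escape noise (the exponential escape function and all parameter values are illustrative assumptions): the membrane potential is integrated deterministically, and in each small time step the neuron fires with probability ρ(u) Δt.

```python
import numpy as np

# Sketch with illustrative assumptions (exponential escape function, generic parameters):
# leaky integrate-and-fire neuron with escape noise.  The membrane potential integrates
# the input deterministically; in each time step a spike is drawn with probability
# rho(u)*dt, where rho(u) = (1/tau0) * exp(beta * (u - theta)) is the stochastic intensity.
rng = np.random.default_rng(5)
tau, R, theta, u_reset = 10e-3, 1.0, 1.0, 0.0
tau0, beta = 1e-3, 10.0                   # escape-rate parameters (assumed)
dt, T, I0 = 1e-4, 5.0, 0.95               # time step [s], duration [s], constant input

def rho(u):
    return (1.0 / tau0) * np.exp(beta * (u - theta))   # escape rate / stochastic intensity

u, spike_times = u_reset, []
for k in range(int(T / dt)):
    u += (dt / tau) * (-u + R * I0)            # deterministic integration of the input
    if rng.random() < rho(u) * dt:             # escape: fire with probability rho(u)*dt
        spike_times.append(k * dt)
        u = u_reset                            # reset after the spike
print("spikes:", len(spike_times), "-> mean rate", len(spike_times) / T, "Hz")
```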

11.4 Mean waiting time. After a step in the input I(t), what is the mean waiting time until the next spike? Example: 1 ms. Blackboard, math detour.
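For the simplest case of a constant escape rate ρ after the switch, the waiting-time density is exponential, ρ e^{-ρ t}, so the mean waiting time is

\langle T \rangle = \int_0^\infty t\, \rho\, e^{-\rho t}\, dt = \frac{1}{\rho};

for example, a constant escape rate of 1 kHz corresponds to a mean waiting time of 1 ms.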

11.4 Escape noise / stochastic intensity. Escape rate = stochastic intensity of a point process.

Quiz 4. Escape rate / stochastic intensity in neuron models.
[ ] The escape rate of a neuron model has units of one over time.
[ ] The stochastic intensity of a point process has units of one over time.
[ ] For large voltages, the escape rate of a neuron model always saturates at some finite value.
[ ] After a step in the membrane potential, the mean waiting time until a spike is fired is proportional to the escape rate.
[ ] After a step in the membrane potential, the mean waiting time until a spike is fired is equal to the inverse of the escape rate.
[ ] The stochastic intensity of a leaky integrate-and-fire model with reset only depends on the external input current but not on the time of the last reset.
[ ] The stochastic intensity of a leaky integrate-and-fire model with reset depends on the external input current AND on the time of the last reset.

Biological Modeling of Neural Networks, Week 11 – Variability and Noise: Autocorrelation. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
11.1 Variation of membrane potential - white noise approximation
11.2 Autocorrelation of Poisson
11.3 Noisy integrate-and-fire - superthreshold and subthreshold
11.4 Escape noise - stochastic intensity
11.5 Renewal models

11.5 Interspike intervals for time-dependent input. Example: nonlinear integrate-and-fire model with a deterministic part of the input and a noisy part of the input (intrinsic noise) → escape rate. Example: exponential stochastic intensity.

11.5 Interspike interval distribution for time-dependent input. Escape process: membrane potential u(t), escape rate, survivor function, interval distribution. Blackboard.
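The standard renewal-theory relations behind this slide: with stochastic intensity (hazard) ρ(t | \hat t) after the last spike at \hat t,

S(t \mid \hat t) = \exp\!\Bigl[-\int_{\hat t}^{t} \rho(t' \mid \hat t)\, dt'\Bigr], \qquad
P(t \mid \hat t) = \rho(t \mid \hat t)\, S(t \mid \hat t),

and conversely the hazard is recovered as \rho = P / S.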

11.5 Interspike intervals: examples now. Escape process: escape rate, survivor function, interval distribution.

Renewal theory. Example: integrate-and-fire model with reset, constant input. Escape rate, survivor function, interval distribution.
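A numerical sketch of this example (the exponential escape rate and all parameter values are illustrative assumptions): starting from the reset value, the membrane potential relaxes towards its steady state, the hazard is the escape rate along this trajectory, and survivor function and interval distribution follow by integration.

```python
import numpy as np

# Sketch with illustrative assumptions (exponential escape rate, generic parameters):
# survivor function S(s) and interval distribution P(s) computed from the hazard rho(s)
# of an integrate-and-fire neuron with reset at s = 0 and constant input.
tau, R, I0, theta = 10e-3, 1.0, 0.95, 1.0
tau0, beta = 1e-3, 10.0
dt = 1e-4
s = np.arange(0.0, 0.2, dt)                         # time since the last spike, up to 200 ms

u = R * I0 * (1.0 - np.exp(-s / tau))               # membrane trajectory after reset (u_reset = 0)
rho = (1.0 / tau0) * np.exp(beta * (u - theta))     # hazard = escape rate along the trajectory

S = np.exp(-np.cumsum(rho) * dt)                    # survivor function  S(s) = exp(-int_0^s rho)
P = rho * S                                         # interval distribution  P(s) = rho(s) * S(s)

print("normalization of P (should be close to 1):", np.sum(P) * dt)
print("mean interspike interval:", np.sum(s * P) * dt, "s")
```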

Time-dependent renewal theory. Example: integrate-and-fire model with reset, time-dependent input. Escape rate, survivor function, interval distribution.

Homework assignment: Exercise 4. Neuron with relative refractoriness, constant input: escape rate, survivor function, interval distribution.

11.5 Renewal process, firing probability. Escape noise = stochastic intensity.
- Renewal theory: hazard function, survivor function, interval distribution
- Time-dependent renewal theory
- Discrete-time firing probability
- Link to experiments → basis for modern methods of neuron model fitting
THE END

Outlook: Helping Humans. Application: neuroprosthetics. Model of 'decoding': predict the intended arm movement, given spike times recorded in motor and frontal cortex. Many groups worldwide work on this problem!