CHARACTERIZATION OF NONLINEAR NEURON RESPONSES
AMSC 664 Final Presentation
Matt Whiteway
Dr. Daniel A. Butts
Neuroscience and Cognitive Science (NACS)
Applied Mathematics and Scientific Computation (AMSC)
Biological Sciences Graduate Program (BISI)

The Question
 What is the functional relationship between a neuron's stimulus and response?
(Figure: stimulus → response schematic.)

The Question
 What is the functional relationship between a neuron's stimulus and response?
 Divide the total interval into discrete time bins
 How do we model the firing rate r(t) in each bin?

The Models
 Moment-based estimators
 1. Spike Triggered Average (STA)
 2. Spike Triggered Covariance (STC)
 Maximum Likelihood estimators
 3. Generalized Linear Model (GLM)
 4. Generalized Quadratic Model (GQM)
 5. Nonlinear Input Model (NIM)

Maximum Likelihood Estimators  We can use parametric models, and take into account  Background firing rate  Stimulus covariates  History covariates  Generalized Linear Model  is a constant  is the stimulus and are the stimulus coefficients  is the spike history and are the history coefficients Introduction MLEs NIM Testing Conclusion

Maximum Likelihood Estimators  is a constant  is the stimulus and are the stimulus coefficients  is the spike history and are the history coefficients  Generalized Linear Model  Generalized Quadratic Model  Nonlinear Input Model Introduction MLEs NIM Testing Conclusion

Maximum Likelihood Estimators  For the likelihood function we assume the spiking probabilities for each bin are independent  Taking logs,  To include regularization, Introduction MLEs NIM Testing Conclusion

Maximum Likelihood Estimators  MLEs are the set of parameters that maximize this function  If F(u) is convex in u and log(F(u)) is concave in u, then there will only be a single global maximum (Paninski 2004)  Used MATLAB’s fminunc routine (faster than my own)  GLM has a single global maximum  GQM in practice has 2 maxima  NIM can have more, though in practice usually find 1 Introduction MLEs NIM Testing Conclusion

Nonlinear Input Model  is a constant  is the stimulus and are the stimulus coefficients  is the spike history and are the history coefficients  Focus on NIM  Take to be rectified linear functions  Constrain Introduction MLEs NIM Testing Conclusion

Model Selection  How to choose N + and N_?  Akaike’s Information Criterion (s is length of filter)  Bayesian Information Criterion (n is size of data set) Introduction MLEs NIM Testing Conclusion

Model Selection
(Figure: AIC and BIC scores across candidate models.)

Model Validation  MLEs are consistent – with large enough sample size, MLE will be arbitrarily close to true parameters ( )  Synthetically create data with pre-made filters, and estimate these filters using different sample sizes  MSE between estimates and original filters should go to zero Introduction MLEs NIM Testing Conclusion

Model Validation
 Results for the tested number of time bins (about 1800 spikes)
(Figure panels: Inhibitory Filter; History Dependence.)

Model Validation
(Figure: MSE of the estimated filters as a function of sample size.)

Model Testing  Relative Log-likelihood per spike  0 for the model that predicts the average firing rate  Can be as large as the single-spike information  Higher values indicate the model is preserving more of the information that is present in the actual spike (on average) Introduction MLEs NIM Testing Conclusion

Model Testing  Upsampling Factor of 1  Upsampling Factor of 2 Introduction MLEs NIM Testing Conclusion STA STC GLM GQM NIM NIMh

Model Testing  Fraction of Variance Explained  If model predicts average firing rate, FVE = 0  If model predicts exact firing rate, FVE = 1 Introduction MLEs NIM Testing Conclusion

Model Testing  Where does r obs come from? Introduction MLEs NIM Testing Conclusion

Model Testing
(Figure: fraction-of-variance-explained results.)

Model Testing  Model is able to explain more of the variation for large time bins Introduction MLEs NIM Testing Conclusion

Schedule  PHASE I (October-December)  Implement and validate STA (October)  Implement and validate GLM with regularization (November-December)  Complete mid-year progress report and presentation (December) Introduction MLEs NIM Testing Conclusion

Schedule  PHASE II (January-May)  Implement quasi-Newton method for gradient descent (January)  Implement and validate STC (January-February)  Implement and validate GQM with regularization (February)  Implement and validate NIM with regularization using rectified linear upstream functions (March)  Test all models (April)  Complete final report and presentation (April-May) Introduction MLEs NIM Testing Conclusion

Schedule  PHASE III (Other features to consider)  Adding history component  Representing the filters using basis functions  Representing the nonlinearities of the NIM using basis functions (more difficult – optimization has to be modified)  Networks…? Introduction MLEs NIM Testing Conclusion

Deliverables  All presentations  All reports  Commented code for all models, model validation and model testing  Dataset used for validation and testing, mat files that contain results of testing Introduction MLEs NIM Testing Conclusion

References  Chichilnisky, E.J. (2001) A simple white noise analysis of neuronal light responses. Network: Comput. Neural Syst., 12,  Schwartz, O., Chichilnisky, E. J., & Simoncelli, E. P. (2002). Characterizing neural gain control using spike-triggered covariance. Advances in neural information processing systems, 1,  Paninski, L. (2004) Maximum Likelihood estimation of cascade point-process neural encoding models. Network: Comput. Neural Syst.,15,  Schwartz, O. et al. (2006) Spike-triggered neural characterization. Journal of Vision, 6,  Paninski, L., Pillow, J., and Lewi, J. (2006) Statistical models for neural encoding, decoding, and optimal stimulus design.  Park, I., and Pillow, J. (2011) Bayesian Spike-Triggered Covariance Analysis. Adv. Neural Information Processing Systems,24,  Butts, D. A., Weng, C., Jin, J., Alonso, J. M., & Paninski, L. (2011). Temporal precision in the visual pathway through the interplay of excitation and stimulus-driven suppression. The Journal of Neuroscience, 31(31),  McFarland, J.M., Cui, Y., and Butts, D.A. (2013) Inferring nonlinear neuronal computation based on physiologically plausible inputs. PLoS Computational Biology. Introduction MLEs NIM Testing Conclusion