HST 583 fMRI DATA ANALYSIS AND ACQUISITION

HST 583 fMRI DATA ANALYSIS AND ACQUISITION: A Review of Statistics for fMRI Data Analysis. Emery N. Brown, Massachusetts General Hospital, Harvard Medical School/MIT Division of Health Sciences and Technology. December 2, 2002

Outline
What Makes Up an fMRI Signal?
Statistical Modeling of an fMRI Signal
Maximum Likelihood Estimation for fMRI Data Analysis
Conclusions

THE STATISTICAL PARADIGM (Box, Tukey)
Question → Preliminary Data (Exploratory Data Analysis) → Models → Experiment (Confirmatory Analysis) → Model Fit → Goodness-of-Fit Assessment. If the fit is not satisfactory, return to the models; if satisfactory, make an inference and make a decision.

Case 3: fMRI Data Analysis Question: Can we construct an accurate statistical model to describe the spatiotemporal patterns of activation in fMRI images from visual and motor cortices during combined motor and visual tasks? (Purdon et al., 2001; Solo et al., 2001) A STIMULUS-RESPONSE EXPERIMENT Acknowledgements: Chris Long and Brenda Marshall

What Makes Up An fMRI Signal?
Hemodynamic Response/MR Physics:
i) stimulus paradigm: a) event-related, b) block
ii) blood flow
iii) blood volume
iv) hemoglobin and deoxyhemoglobin content
Noise:
Stochastic: i) physiologic, ii) scanner noise
Systematic: i) motion artifact, ii) drift, iii) [distortion], iv) [registration], [susceptibility]

Physiologic Response Model: Block Design

Gamma Hemodynamic Response Model

Physiologic Model: Event-Related Design

Physiologic Model: Flow, Volume and Interaction Terms

Scanner and Physiologic Noise Models

DATA: The sequence of image intensity measurements on a single pixel.

fMRI Signal and Noise Model
y_t = β f_t + ε_t, where y_t is the measurement on a single pixel at time t, f_t is the physiologic response, β is the activation coefficient, and ε_t is the physiologic and scanner noise, for t = 1, …, T. We assume the ε_t are independent, identically distributed Gaussian random variables.

fMRI Signal Model
Physiologic response: f(t) = (g * s)(t), the convolution of the hemodynamic response g(t) with the input stimulus s(t), where g(t) is a gamma model of the hemodynamic response. Assume we know the parameters of g(t).
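As an illustration of this signal model, the sketch below builds a gamma-form hemodynamic response and convolves it with a block-design stimulus. The gamma parameters and scan timing are illustrative assumptions, not the values used in the lecture; this and the later examples use NumPy/SciPy.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

def gamma_hrf(t, alpha=6.0, beta=1.0):
    """Gamma model of the hemodynamic response g(t) (illustrative parameters)."""
    return (beta ** alpha) * t ** (alpha - 1) * np.exp(-beta * t) / gamma_fn(alpha)

def physiologic_response(stimulus, dt, alpha=6.0, beta=1.0):
    """f(t) = (g * s)(t): convolution of the hemodynamic response with the input stimulus."""
    t_kernel = np.arange(0.0, 30.0, dt)            # 30 s of kernel support
    g = gamma_hrf(t_kernel, alpha, beta)
    return np.convolve(stimulus, g)[: len(stimulus)] * dt

# Block design: four 30 s off / 30 s on cycles sampled every 2 s (TR = 2 s)
dt = 2.0
s = np.tile(np.r_[np.zeros(15), np.ones(15)], 4)
f = physiologic_response(s, dt)
```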

MAXIMUM LIKELIHOOD
Define the likelihood function L(θ | y), the joint probability density p(y | θ) viewed as a function of the parameter θ with the data y fixed. The maximum likelihood estimate of θ is θ̂ = argmax_θ L(θ | y). That is, θ̂ is a parameter value for which L(θ | y) attains a maximum as a function of θ for fixed y.

ESTIMATION
Joint distribution: p(y | β, σ²) = (2πσ²)^(−T/2) exp{ −(1/(2σ²)) Σ_t (y_t − β f_t)² }
Log likelihood: log L(β, σ² | y) = −(T/2) log(2πσ²) − (1/(2σ²)) Σ_t (y_t − β f_t)²
Maximum likelihood: β̂ and σ̂² are the values that jointly maximize the log likelihood.
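A minimal sketch of the maximum likelihood fit for the white-noise model above: with iid Gaussian noise, maximizing the log likelihood in β reduces to least squares, and σ² has a closed form. The function name and symbols are assumptions made for illustration, not the lecture's code.

```python
import numpy as np

def ml_fit_white_noise(y, f):
    """ML estimates for y_t = beta * f_t + eps_t, eps_t ~ iid N(0, sigma2)."""
    beta_hat = (f @ y) / (f @ f)               # least squares = ML under Gaussian noise
    resid = y - beta_hat * f
    T = len(y)
    sigma2_hat = (resid @ resid) / T           # ML variance estimate (divides by T, not T-1)
    loglik = -0.5 * T * np.log(2.0 * np.pi * sigma2_hat) - 0.5 * T
    return beta_hat, sigma2_hat, loglik, resid
```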

GOODNESS-OF-FIT/MODEL SELECTION
An essential step, if not the most essential step, in a data analysis is to measure how well the model describes the data. This should be assessed before the model is used to make inferences about the data.
Akaike's Information Criterion: AIC = −2 log L(θ̂) + 2p. For maximum likelihood estimates it measures the trade-off between maximizing the likelihood (minimizing −2 log L) and the number of parameters p the model requires.
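The criterion itself is a one-line computation; the commented comparison between a white-noise fit and an AR(1) fit is hypothetical and only illustrates how the penalty enters.

```python
def aic(loglik, n_params):
    """Akaike's Information Criterion: -2 * log likelihood + 2 * number of parameters."""
    return -2.0 * loglik + 2.0 * n_params

# Hypothetical comparison: the white-noise fit has 2 parameters (beta, sigma2),
# an AR(1) correlated-noise fit has 3 (beta, sigma2, rho).  Smaller AIC is better.
# delta = aic(loglik_ar1, 3) - aic(loglik_white, 2)
```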

GOODNESS-OF-FIT
Residual plots.
K-S plots: we can check the Gaussian assumption with Kolmogorov-Smirnov (K-S) plots.
Measure correlation in the residuals to assess independence.
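One way to implement these checks (a sketch, not the lecture's code): compare the empirical distribution of the standardized residuals with the Gaussian CDF for the K-S plot, and compute the lag-1 autocorrelation of the residuals to assess independence.

```python
import numpy as np
from scipy import stats

def ks_plot_points(resid):
    """Model (Gaussian) CDF vs. empirical CDF of the standardized residuals.
    Points near the 45-degree line indicate the Gaussian assumption is adequate."""
    z = np.sort((resid - resid.mean()) / resid.std())
    empirical = (np.arange(1, len(z) + 1) - 0.5) / len(z)
    return stats.norm.cdf(z), empirical

def lag1_autocorrelation(resid):
    """Serial correlation of the residuals; values near zero support independence."""
    r = resid - resid.mean()
    return np.sum(r[1:] * r[:-1]) / np.sum(r * r)
```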

EVALUATION OF ESTIMATORS
Given an estimator θ̂ of θ based on the data y:
Mean-squared error: MSE(θ̂) = E[(θ̂ − θ)²]
Bias: B(θ̂) = E[θ̂] − θ
Consistency: θ̂ converges to θ as the sample size grows.
Efficiency: θ̂ achieves a minimum variance (Cramer-Rao Lower Bound).
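These quantities can also be checked by simulation. The sketch below is generic: the `estimator` and `simulate` arguments are hypothetical stand-ins for an estimation routine and a data generator, and it uses the identity MSE = bias² + variance.

```python
import numpy as np

def evaluate_estimator(estimator, true_value, simulate, n_sims=1000):
    """Monte Carlo estimates of bias, variance, and mean-squared error."""
    estimates = np.array([estimator(simulate()) for _ in range(n_sims)])
    bias = estimates.mean() - true_value
    variance = estimates.var()
    mse = np.mean((estimates - true_value) ** 2)   # approximately bias**2 + variance
    return bias, variance, mse
```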

FACTOIDS ABOUT MAXIMUM LIKELIHOOD ESTIMATES
Generally biased.
Consistent, hence asymptotically unbiased.
Asymptotically efficient.
The variance can be approximated by the inverse of the Fisher information matrix (equivalently, minus the inverse of the Hessian of the log likelihood).
Invariance: if θ̂ is the maximum likelihood estimate of θ, then g(θ̂) is the maximum likelihood estimate of g(θ).

Cramer-Rao Lower Bound: the CRLB gives a lower bound on the variance of an estimator; for an unbiased estimator, Var(θ̂) ≥ I(θ)^(−1), the inverse of the Fisher information.

CONFIDENCE INTERVALS
The approximate probability density of the maximum likelihood estimate θ̂ is the Gaussian probability density with mean θ and variance I(θ)^(−1), where I(θ) is the Fisher information matrix. An approximate (1 − α) confidence interval for a component θ_k of θ is θ̂_k ± z_(1−α/2) · sqrt([I(θ̂)^(−1)]_kk).
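For the white-noise model, the Fisher information for β is f'f / σ², so the approximate interval can be computed directly; a sketch under that assumption:

```python
import numpy as np
from scipy import stats

def beta_confidence_interval(beta_hat, sigma2_hat, f, level=0.95):
    """Approximate CI for beta: inverse Fisher information gives var(beta_hat) ~ sigma2 / (f'f)."""
    se = np.sqrt(sigma2_hat / (f @ f))
    z = stats.norm.ppf(0.5 + level / 2.0)     # e.g. 1.96 for a 95% interval
    return beta_hat - z * se, beta_hat + z * se
```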

THE INFORMATION MATRIX

CONFIDENCE INTERVAL

Kolmogorov-Smirnov Test White Noise Model

Pixelwise Confidence Intervals for the Slice: White Noise Model

fMRI Signal and Noise Model 2
y_t = β f_t + ε_t, where y_t is the measurement on a single pixel at time t, f_t is the physiologic response, β is the activation coefficient, and ε_t is the physiologic and scanner noise, for t = 1, …, T. We assume the ε_t are correlated AR(1) Gaussian random variables.
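One simple way to fit this model (a sketch; the lecture's exact estimation procedure is not reproduced here) is to iterate between estimating the AR(1) coefficient from the residuals and re-estimating β on whitened data, in the spirit of Cochrane-Orcutt. This approximates the maximum likelihood fit.

```python
import numpy as np

def fit_ar1_model(y, f, n_iter=20):
    """Approximate ML for y_t = beta*f_t + e_t, e_t = rho*e_{t-1} + w_t, w_t ~ iid N(0, sigma2_w)."""
    beta_hat = (f @ y) / (f @ f)                              # start from the white-noise fit
    for _ in range(n_iter):
        e = y - beta_hat * f
        rho = np.sum(e[1:] * e[:-1]) / np.sum(e[:-1] ** 2)    # AR(1) coefficient from residuals
        y_star = y[1:] - rho * y[:-1]                         # whiten the data ...
        f_star = f[1:] - rho * f[:-1]                         # ... and the regressor
        beta_hat = (f_star @ y_star) / (f_star @ f_star)
    w = y_star - beta_hat * f_star
    return beta_hat, rho, (w @ w) / len(w)
```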

Simple Convolution Plus Correlated Noise

Kolmogorov-Smirnov Test Correlated Noise Model

Pixelwise Confidence Intervals for the Slice: Correlated Noise Model

AIC Difference = AIC(Colored Noise) − AIC(White Noise)

fMRI Signal and Noise Model 3
y_t = f_t + ε_t, where y_t is the measurement on a single pixel at time t, f_t is the physiologic response (modeled here by harmonic regression), and ε_t is the physiologic and scanner noise, for t = 1, …, T. We assume the ε_t are independent, identically distributed Gaussian random variables.

Harmonic Regression Plus White Noise Model
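The slide's exact parameterization is not shown; the sketch below assumes the response is represented by a mean plus sinusoids at the stimulus fundamental frequency and its harmonics, fit by least squares (which is ML under white noise). The `period` argument is the stimulus cycle length in scans.

```python
import numpy as np

def harmonic_regression(y, period, n_harmonics=2):
    """Fit y_t = mu + sum_r [a_r*cos(2*pi*r*t/period) + b_r*sin(2*pi*r*t/period)] + eps_t."""
    t = np.arange(len(y), dtype=float)
    cols = [np.ones_like(t)]                       # mean term
    for r in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * r / period
        cols += [np.cos(w * t), np.sin(w * t)]     # r-th harmonic of the stimulus frequency
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # least squares = ML under iid Gaussian noise
    resid = y - X @ coef
    return coef, (resid @ resid) / len(y), resid
```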

AIC Difference Map = AIC(Correlated Noise) − AIC(Harmonic Regression)

Conclusions
The white noise model gives a good description of the hemodynamic response.
The correlated noise model incorporates known physiologic and biophysical properties and hence yields a better fit.
The likelihood approach offers a unified way to formulate a model, compute confidence intervals, measure goodness of fit and, most importantly, make inferences.