
1 The General Linear Model (GLM). Klaas Enno Stephan, Wellcome Trust Centre for Neuroimaging, University College London. With many thanks to my colleagues in the FIL methods group, particularly Stefan Kiebel, for useful slides. SPM Course, ICN, May 2008

2 Overview of SPM: image time-series → realignment → smoothing (Gaussian kernel) → general linear model (design matrix) → parameter estimates → statistical parametric map (SPM) → statistical inference (Gaussian field theory, p < 0.05); spatial normalisation to a template.

3 A very simple fMRI experiment: passive word listening versus rest; one session with 7 cycles of rest and listening; blocks of 6 scans with a 7 s TR; the stimulus function encodes the listening blocks. Question: Is there a change in the BOLD response between listening and rest?

4 Modelling the measured data. Why? To make inferences about effects of interest. How? 1. Decompose the data into effects and error, using a linear model of the stimulus function. 2. Form a statistic from the effects estimate and the error estimate.

5 Voxel-wise time series analysis: single voxel BOLD time series → model specification → parameter estimation → hypothesis → statistic → SPM.

6 Single voxel regression model: the BOLD signal over time is modelled as a linear combination of regressors plus error, y = x1 β1 + x2 β2 + e.

7 Mass-univariate analysis: voxel-wise GLM. The same model y = Xβ + e is fitted at every voxel, where y is the N×1 data vector, X the N×p design matrix, β the p×1 parameter vector and e the error (N: number of scans, p: number of regressors). The model is specified by 1. the design matrix X and 2. the assumptions about e. The design matrix embodies all available knowledge about experimentally controlled factors and potential confounds.
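To make the mass-univariate setup concrete, here is a minimal NumPy sketch (all dimensions and variable names are illustrative assumptions, not SPM code) that builds a design matrix with a boxcar regressor and a constant term and fits y = Xβ + e at every voxel:

```python
import numpy as np

# Hypothetical dimensions: N scans, p regressors, V voxels
N, p, V = 84, 2, 1000
rng = np.random.default_rng(0)

# Design matrix X: a boxcar regressor (listening vs. rest) plus a constant term
boxcar = np.tile(np.r_[np.zeros(6), np.ones(6)], N // 12)   # 7 cycles of 6 rest + 6 listening scans
X = np.column_stack([boxcar, np.ones(N)])                   # shape (N, p)

# Simulated data Y: one column per voxel, generated as y = X beta + e
beta_true = rng.normal(size=(p, V))
Y = X @ beta_true + rng.normal(scale=0.5, size=(N, V))

# Mass-univariate fit: the same GLM at every voxel, solved for all voxels at once
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)            # shape (p, V)
print(beta_hat.shape)
```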

8 The GLM assumes Gaussian spherical (i.i.d.) errors. Sphericity (i.i.d.) means the error covariance is a scalar multiple of the identity matrix: Cov(e) = σ²I. Examples of non-sphericity: non-identity (unequal error variances) and non-independence (correlated errors).
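For concreteness, a two-scan toy illustration (my own example, not from the slides) of the spherical error covariance and the two violations listed above:

```latex
\underbrace{\mathrm{Cov}(e) = \sigma^2
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}}_{\text{sphericity (i.i.d.)}}
\qquad
\underbrace{\begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{pmatrix}}_{\text{non-identity}}
\qquad
\underbrace{\sigma^2 \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}}_{\text{non-independence}}
```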

9 Parameter estimation for y = Xβ + e: estimate the parameters β such that the sum of squared errors e^T e is minimal. Assuming i.i.d. error, this gives the least-squares parameter estimate β̂ = (X^T X)^{-1} X^T y.
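A minimal sketch (hypothetical variable names, assuming i.i.d. errors) of this least-squares estimate, obtained by solving the normal equations X^T X β̂ = X^T y:

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares: beta_hat = (X'X)^{-1} X'y, minimizing the sum of squared errors."""
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # solve the normal equations
    residuals = y - X @ beta_hat                   # e_hat = y - X beta_hat
    return beta_hat, residuals

# Toy check on simulated data
rng = np.random.default_rng(1)
X = np.column_stack([rng.normal(size=50), np.ones(50)])
y = X @ np.array([2.0, 1.0]) + rng.normal(scale=0.1, size=50)
beta_hat, e = ols_fit(X, y)
print(beta_hat)                                    # close to [2.0, 1.0]
```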

10 A geometric perspective: the regressors x1 and x2 span the design space defined by X; the fitted values Xβ̂ are the orthogonal projection of the data y onto this space, and the residual e = y − Xβ̂ is orthogonal to it.
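A small numerical check of this geometric picture (toy data, hypothetical names): the fitted values are the projection of y onto the space spanned by the columns of X, so the residual is orthogonal to every regressor.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20
x1, x2 = rng.normal(size=N), rng.normal(size=N)
X = np.column_stack([x1, x2])
y = 1.5 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=N)

P = X @ np.linalg.inv(X.T @ X) @ X.T    # projection onto the design space
y_hat = P @ y                           # fitted values, X @ beta_hat
e = y - y_hat                           # residual

print(np.allclose(X.T @ e, 0))          # True: residual is orthogonal to x1 and x2
```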

11 What are the problems of this model? 1. BOLD responses have a delayed and dispersed form (the HRF). 2. The BOLD signal includes substantial amounts of low-frequency noise. 3. The data are serially correlated (temporally autocorrelated); this violates the assumptions of the noise model in the GLM.

12 Problem 1: shape of the BOLD response. Solution: a convolution model. The response of a linear time-invariant (LTI) system is the convolution of the input with the system's response to an impulse (delta function): expected BOLD response = input function ⊗ impulse response function, where the impulse response function is the hemodynamic response function (HRF).

13 Convolution model of the BOLD response: convolve the stimulus function with a canonical hemodynamic response function (HRF).
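A sketch of this convolution step. The double-gamma shape and its parameters below are illustrative assumptions, not the exact SPM canonical HRF, and the stimulus timing follows the example experiment (7 cycles of 6 rest and 6 listening scans, TR = 7 s):

```python
import numpy as np
from scipy.stats import gamma

TR = 7.0                                    # repetition time in seconds
t = np.arange(0, 32, TR)                    # HRF sampled at scan times
# Double-gamma HRF: a positive peak minus a smaller, later undershoot (illustrative parameters)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)
hrf /= hrf.sum()

# Boxcar stimulus function: 7 cycles of 6 scans rest, 6 scans listening
u = np.tile(np.r_[np.zeros(6), np.ones(6)], 7)

# Expected BOLD response = stimulus function convolved with the HRF
x = np.convolve(u, hrf)[: len(u)]
```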

14 Problem 2: low-frequency noise. Solution: high-pass filtering using a discrete cosine transform (DCT) basis set; S = residual-forming matrix of the DCT set (applying S to the data removes the low-frequency drifts).
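A sketch of the high-pass filter as the residual-forming matrix of a low-frequency DCT basis set X0, i.e. S = I − X0 X0⁺. The cutoff period and the formula for the number of basis functions are illustrative assumptions:

```python
import numpy as np

def dct_highpass_matrix(n_scans, TR, cutoff=128.0):
    """Residual-forming matrix S = I - X0 X0+ of a low-frequency DCT basis set X0.

    X0 contains discrete cosine functions with periods longer than `cutoff` seconds;
    multiplying a voxel's time series by S removes these slow drifts.
    """
    n = np.arange(n_scans)
    order = int(np.floor(2 * n_scans * TR / cutoff)) + 1          # number of DCT regressors
    X0 = np.column_stack(
        [np.cos(np.pi * k * (2 * n + 1) / (2 * n_scans)) for k in range(order)]
    )
    return np.eye(n_scans) - X0 @ np.linalg.pinv(X0)

S = dct_highpass_matrix(n_scans=84, TR=7.0)
# filtered = S @ y   # applied to each voxel's time series y
```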

15 High pass filtering: example blue = data black = mean + low-frequency drift green = predicted response, taking into account low-frequency drift red = predicted response, NOT taking into account low-frequency drift

16 Problem 3: serial correlations. The error can be modelled as a first-order autoregressive process, AR(1): e_t = a·e_{t−1} + ε_t with white-noise innovations ε_t, giving an autocovariance function that decays with the lag.
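A small simulation sketch of AR(1) errors e_t = a·e_{t−1} + ε_t (the coefficient a = 0.4 is just an assumption), showing that the empirical autocovariance decays geometrically with the lag, roughly as a^lag:

```python
import numpy as np

rng = np.random.default_rng(3)
a, n = 0.4, 100_000                         # AR(1) coefficient and number of samples
eps = rng.normal(size=n)                    # white-noise innovations

e = np.zeros(n)
for t in range(1, n):
    e[t] = a * e[t - 1] + eps[t]            # e_t = a * e_{t-1} + eps_t

# Empirical autocovariance at lags 0..4: decays roughly as a**lag
for lag in range(5):
    print(lag, round(np.mean(e[: n - lag] * e[lag:]), 3))
```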

17 Dealing with serial correlations. Pre-colouring: impose a known autocorrelation structure on the data (filtering with a matrix W) and use a Satterthwaite correction for the degrees of freedom. Pre-whitening: 1. Use an enhanced noise model with hyperparameters for multiple error covariance components. 2. Use the estimated autocorrelation to specify a filter matrix W that whitens the data.

18 How do we define W? Enhanced noise model: e ~ N(0, σ²V). Remember how Gaussians transform under a linear map: We ~ N(0, σ²W V W^T). Choose W such that the error covariance becomes spherical, i.e. W V W^T = I, which holds for W = V^{-1/2}. Conclusion: W is a function of V, so how do we estimate V?
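A sketch of this whitening step under the assumption Cov(e) = σ²V: choosing W = V^{-1/2} makes W V W^T = I, so the transformed errors are spherical. V below is a toy AR(1)-shaped covariance, not a ReML estimate:

```python
import numpy as np

# Toy error correlation matrix V with AR(1) structure: V_ij = a**|i - j|
n, a = 50, 0.4
idx = np.arange(n)
V = a ** np.abs(idx[:, None] - idx[None, :])

# W = V^{-1/2} via the eigendecomposition of the symmetric matrix V
evals, evecs = np.linalg.eigh(V)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T

print(np.allclose(W @ V @ W.T, np.eye(n)))    # True: W V W' = I, errors are whitened
# Pre-whitening: fit the GLM to W @ y with design matrix W @ X
```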

19 Multiple covariance components: the enhanced noise model expresses V as a linear combination of covariance components, V = λ1 Q1 + λ2 Q2 (in general V = Σ_i λ_i Q_i). The hyperparameters λ_i are estimated with ReML (restricted maximum likelihood).
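A sketch of what such covariance components might look like: Q1 for white noise and Q2 for a serial-correlation component, combined with hand-picked hyperparameters (in practice the λ's would be estimated with ReML; the shapes and values here are illustrative assumptions):

```python
import numpy as np

n = 50
idx = np.arange(n)

Q1 = np.eye(n)                                      # white-noise component
Q2 = 0.4 ** np.abs(idx[:, None] - idx[None, :])     # serial-correlation (AR-like) component

lam1, lam2 = 1.0, 0.5                               # hyperparameters (here fixed by hand)
V = lam1 * Q1 + lam2 * Q2                           # enhanced noise model: V = sum_i lambda_i * Q_i
```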

20 Contrasts & statistical parametric maps. Q: activation during listening? A contrast vector c = [1 0 ... 0]^T picks out the listening regressor; null hypothesis: c^T β = 0.

21 t-statistic based on ML estimates: with the parameters estimated from the whitened model (using the ReML estimate of V), the t-statistic for a contrast c is t = c^T β̂ / SE(c^T β̂), i.e. the contrast of parameter estimates divided by its estimated standard error.
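A sketch of the contrast t-statistic in the simplest case (ordinary least squares with i.i.d. errors), where t = c^T β̂ / sqrt(σ̂² c^T (X^T X)^{-1} c); with pre-whitening the same formula would be applied to the whitened X and y. Data and names are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
N = 84
boxcar = np.tile(np.r_[np.zeros(6), np.ones(6)], 7)    # listening vs. rest regressor
X = np.column_stack([boxcar, np.ones(N)])              # design matrix: [boxcar, constant]
y = X @ np.array([1.0, 100.0]) + rng.normal(size=N)    # simulated voxel time series

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta_hat
df = N - np.linalg.matrix_rank(X)
sigma2 = e @ e / df                                    # estimated error variance

c = np.array([1.0, 0.0])                               # contrast: effect of listening
t = c @ beta_hat / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
print(t, stats.t.sf(t, df))                            # t-value and one-sided p-value
```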

22 Physiological confounds: head movements, arterial pulsations, breathing, eye blinks, adaptation effects, fatigue, fluctuations in concentration, etc.

23 Correlated and orthogonal regressors. With correlated regressors, the explained variance is shared between the regressors. When x2 is orthogonalized with respect to x1, only the parameter estimate for x1 changes, not that for x2!
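A numerical check of this statement (toy data, hypothetical names): after orthogonalizing x2 with respect to x1, the estimate for x2 is unchanged while the estimate for x1 changes, because it now absorbs the variance the correlated regressors shared:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200
x1 = rng.normal(size=N)
x2 = 0.7 * x1 + rng.normal(scale=0.5, size=N)        # x2 is correlated with x1
y = 1.0 * x1 + 2.0 * x2 + rng.normal(scale=0.3, size=N)

def fit(X):
    return np.linalg.lstsq(X, y, rcond=None)[0]

b = fit(np.column_stack([x1, x2]))                   # original (correlated) regressors

x2_star = x2 - (x1 @ x2 / (x1 @ x1)) * x1            # x2 orthogonalized with respect to x1
b_star = fit(np.column_stack([x1, x2_star]))

print(b, b_star)   # second estimate (x2) unchanged; first estimate (x1) changes
```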

24 Outlook: further challenges: correction for multiple comparisons; variability in the HRF across voxels; slice timing; limitations of frequentist statistics (→ Bayesian analyses); the GLM ignores interactions among voxels (→ models of effective connectivity).

25 Correction for multiple comparisons. Mass-univariate approach: we apply the GLM to each of a huge number of voxels (usually > 100,000). At a threshold of p < 0.05, more than 5,000 voxels would be significant by chance alone: a massive problem with multiple comparisons! Solution: Gaussian random field theory.
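The arithmetic behind that figure, taking the illustrative voxel count from the slide:

```latex
E[\text{false positives}] \;=\; N_{\text{voxels}} \times \alpha \;=\; 100{,}000 \times 0.05 \;=\; 5{,}000
```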

26 Variability in the HRF. The HRF varies substantially across voxels and subjects; for example, its latency can differ by ± 1 second. Solution: use multiple basis functions (see the talk on event-related fMRI).

27 Summary. Mass-univariate approach: the same GLM for each voxel; the GLM includes all known experimental effects and confounds; convolution with a canonical HRF; high-pass filtering to account for low-frequency drifts; estimation of multiple variance components (e.g. to account for serial correlations); parametric statistics.

