
1 SPM short course – May 2007 Linear Models and Contrasts Jean-Baptiste Poline Neurospin, I2BM, CEA Saclay, France

2 [Overview figure: the SPM pipeline. Images undergo realignment & coregistration, smoothing and normalisation (to an anatomical reference); the General Linear Model (design matrix, spatial filter) gives a linear fit → statistical image and adjusted data; your question: a contrast → Statistical Map; Random Field Theory yields corrected p-values (alongside uncorrected p-values).]

3 Plan
- Make sure we know all about the estimation (fitting) part...
- Make sure we understand the testing procedures: t- and F-tests
- But what do we test exactly?
- An example – almost real

4 [Figure: an fMRI temporal series and one voxel's time course (amplitude over time).] One voxel = one test (t, F, ...). General Linear Model → fitting → statistical image (SPM).

5 Regression example... [Figure: voxel time series = amplitude × box-car reference function + mean value + error.] Fit the GLM.

7 ... revisited: matrix form. Y = β₁·f(t) + β₂·1 + error: the time series Y is written as a weighted sum of the reference functions plus an error term.

8 Box-car regression: design matrix... Y = Xβ + ε, with Y the data vector (voxel time series), X the design matrix, β the parameters, ε the error vector.
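As a concrete illustration of Y = Xβ + ε, here is a minimal numpy sketch of a box-car design and its OLS fit; the scan count, block length, noise level and true betas are invented for the example, not taken from the slides.

```python
import numpy as np

n = 84                                    # number of scans (invented)
box = (np.arange(n) // 7) % 2             # box-car: blocks of 7 scans off/on
X = np.column_stack([box, np.ones(n)])    # design matrix: box-car + mean

rng = np.random.default_rng(0)
beta_true = np.array([2.0, 100.0])        # true box-car amplitude and mean
Y = X @ beta_true + rng.normal(0, 1, n)   # simulated voxel time series

b, *_ = np.linalg.lstsq(X, Y, rcond=None) # OLS estimate of the betas
e = Y - X @ b                             # residual error vector
print(b)                                  # close to [2.0, 100.0]
```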

9 Add more reference functions... Discrete cosine transform basis functions
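A sketch of how such a set of discrete cosine basis functions can be built, to be appended as extra columns of the design matrix; the number of scans and the basis order are arbitrary here, and the scaling differs slightly from SPM's own convention.

```python
import numpy as np

def dct_basis(n_scans: int, order: int) -> np.ndarray:
    """(n_scans x order) matrix of low-frequency cosine regressors."""
    t = np.arange(n_scans)
    cols = [np.cos(np.pi * k * (2 * t + 1) / (2 * n_scans))
            for k in range(1, order + 1)]
    return np.column_stack(cols)

drifts = dct_basis(84, 7)   # 7 slow cosines to add to the design matrix
```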

10 ... design matrix. Y = Xβ + ε, with Y the data vector, X the design matrix, β the parameters = the betas (here β₁ to β₉), ε the error vector.

11 Fitting the model = finding some estimate of the betas = minimising the sum of squares of the residuals, S². [Figure: raw fMRI time series; fitted box-car; fitted high-pass filter; data adjusted for low-frequency effects; residuals.] s² = S² (the summed squared residuals) / (number of time points minus the number of estimated betas).

12 Take home...
- We put in our model regressors (or covariates) that represent how we think the signal is varying (of interest and of no interest alike). THE QUESTION IS: WHICH ONES TO INCLUDE?
- Coefficients (= parameters) are estimated by Ordinary Least Squares (OLS), minimising the fluctuations - variability, variance - of the noise, i.e. of the residuals.
- Because the parameters depend on the scaling of the regressors included in the model, one should be careful when comparing manually entered regressors.
- The residuals, their sum of squares and the resulting tests (t, F) do not depend on the scaling of the regressors.

13 Plan
- Make sure we all know about the estimation (fitting) part...
- Make sure we understand t and F tests
- But what do we test exactly?
- An example – almost real

14 T-test - one-dimensional contrasts - SPM{t}. A contrast = a linear combination of parameters: c'β. Box-car amplitude > 0? i.e. β₁ > 0? ⇒ compute c'b = 1×b₁ + 0×b₂ + 0×b₃ + 0×b₄ + 0×b₅ + ... and divide by its estimated standard deviation:
T = contrast of estimated parameters / variance estimate = c'b / sqrt(s² c'(X'X)⁺c)

15 How is this computed? (t-test)
Y = Xβ + ε, ε ~ N(0, σ²I) (Y: the data at one position/voxel)
Estimation [Y, X] → [b, s]:
b = (X'X)⁺X'Y (b: estimate of β) → beta??? images
e = Y − Xb (e: estimate of ε)
s² = e'e / (n − p) (s: estimate of σ; n: time points, p: parameters) → 1 image: ResMS
Test [b, s², c] → [c'b, t]:
Var(c'b) = s² c'(X'X)⁺c (computed for each contrast c, proportional to s²)
t = c'b / sqrt(s² c'(X'X)⁺c) (c'b → images spm_con???; the t images → images spm_t???)
Under the null hypothesis H₀: t ~ Student-t(df), df = n − p
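The recipe above translates almost line for line into numpy; a minimal sketch (design matrix, data and contrast vector are placeholders to be supplied by the caller):

```python
import numpy as np

def t_contrast(Y, X, c):
    """t = c'b / sqrt(s2 c'(X'X)+ c) with df = n - p, as on the slide."""
    n, p = X.shape
    pinv_XtX = np.linalg.pinv(X.T @ X)       # (X'X)+
    b = pinv_XtX @ X.T @ Y                   # b = (X'X)+ X'Y, the betas
    e = Y - X @ b                            # residuals, estimate of the error
    s2 = (e @ e) / (n - p)                   # variance estimate (the ResMS image)
    var_cb = s2 * (c @ pinv_XtX @ c)         # Var(c'b), proportional to s2
    return (c @ b) / np.sqrt(var_cb), n - p  # t value and its degrees of freedom
```

Under H₀ the returned t follows a Student-t distribution with n − p degrees of freedom, so a one-sided p-value is e.g. scipy.stats.t.sf(t, df).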

16 F-test: a reduced model or...? Tests multiple linear hypotheses: does X₁ model anything?
H₀: the true (reduced) model is X₀.
[Figure: this (full) model X = [X₀ X₁], with error sum of squares S²; or this one, X₀ alone, with error sum of squares S₀²?]
F = additional variance accounted for by the tested effects / error variance estimate
F ~ (S₀² − S²) / S²

17 F-test: a reduced model or... multi-dimensional contrasts? SPM{F} tests multiple linear hypotheses. Example: does the DCT set model anything?
Test H₀: c'β = 0? Here H₀: β₃₋₉ = (0 0 0 0 0 0 0).
[Figure: this model, X = [X₁ (the β₃₋₉ columns) X₀], or this one, X₀ alone? H₀: the true model is X₀; the contrast matrix c picks out the DCT regressors.]

18 How is this computed? (F-test)
Full model: Y = Xβ + ε, ε ~ N(0, σ²I); reduced model: Y = X₀β₀ + ε₀, ε₀ ~ N(0, σ₀²I); X₀: the reduced design matrix.
Estimation [Y, X] → [b, s]; Estimation [Y, X₀] → [b₀, s₀]:
b₀ = (X₀'X₀)⁺X₀'Y
e₀ = Y − X₀b₀ (e₀: estimate of ε₀)
s₀² = e₀'e₀ / (n − p₀) (s₀: estimate of σ₀; n: time points, p₀: parameters of the reduced model)
Test [b, s, c] → [ess, F] (→ image spm_ess???; → image of F: spm_F???):
F ~ (s₀² − s²) / s² = additional variance accounted for by the tested effects / error variance estimate
Under the null hypothesis: F ~ F(p − p₀, n − p)
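The slide's F ~ (S₀² − S²)/S² is shorthand; with the degree-of-freedom normalisation written out it becomes the standard F statistic below. A sketch comparing a reduced design X0 against the full design X (inputs are placeholders, and p is taken as the number of columns, assuming full-rank designs):

```python
import numpy as np

def f_test(Y, X, X0):
    """F = ((e0'e0 - e'e) / (p - p0)) / (e'e / (n - p)) ~ F(p - p0, n - p) under H0."""
    def rss(M):                                    # residual sum of squares
        b, *_ = np.linalg.lstsq(M, Y, rcond=None)
        return np.sum((Y - M @ b) ** 2)
    n, p = X.shape
    p0 = X0.shape[1]
    ess = rss(X0) - rss(X)                         # extra sum of squares (spm_ess)
    F = (ess / (p - p0)) / (rss(X) / (n - p))
    return F, (p - p0, n - p)                      # F value and its two dfs
```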

19 t and F tests: take home...
- T-tests are simple combinations of the betas; they are either positive or negative (β₁ − β₂ is different from β₂ − β₁).
- F-tests can be viewed as testing for the additional variance explained by a larger model w.r.t. a simpler model, or
- F-tests test the sum of the squares of one or several combinations of the betas.
- When testing a single contrast with an F-test, e.g. β₁ − β₂, the result is the same as for β₂ − β₁: it is exactly the square of the t-test, testing for both positive and negative effects.

20 Plan
- Make sure we all know about the estimation (fitting) part...
- Make sure we understand t and F tests
- But what do we test exactly? Correlation between regressors
- An example – almost real

21 « Additional variance »: again. Independent contrasts.

22 « Additional variance »: again. Correlated regressors - for example, green: subject age; yellow: subject score. Testing for the green.

23 « Additional variance »: again. Correlated contrasts. Testing for the red.

24 « Additional variance »: again. Entirely correlated contrasts? Non estimable! Testing for the green.

25 « Additional variance »: again. Entirely correlated contrasts? Non estimable! Testing for the green and yellow: if significant, it could be G or Y!

26 Less simple contrasts. THE SAME PRINCIPLE APPLIES: think of it as constructing 3 regressors from the 3 differences and complementing this new design matrix such that the data can be fitted in the exact same way (same error, same fitted data).

27 Plan
- Make sure we all know about the estimation (fitting) part...
- Make sure we understand t and F tests
- But what do we test exactly? Correlation between regressors
- An example – almost real

28 A real example (almost!). Factorial design with 2 factors: modality and category; 2 levels for modality (e.g. Visual/Auditory), 3 levels for category (e.g. 3 categories of words). [Figure: experimental design and design matrix, columns V A C1 C2 C3.]

29 Asking ourselves some questions... (design matrix columns: V A C1 C2 C3)
- Design matrix not orthogonal
- Many contrasts are non estimable
- Interactions M×C are not modelled
Test C1 > C2: c = [...]. Test V > A: c = [...]. Test C1, C2, C3? (F): c = [...]. Test the interaction M×C? (Plausible contrast values are sketched below.)
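The contrast values were lost in this transcript (the empty brackets above); assuming the column order V A C1 C2 C3 shown on the slide, plausible reconstructions are sketched below - the exact vectors on the original slide may differ.

```python
import numpy as np

# Assumed column order:   V   A  C1  C2  C3
c_C1_gt_C2 = np.array([  0,  0,  1, -1,  0])   # t-contrast: C1 > C2
c_V_gt_A   = np.array([  1, -1,  0,  0,  0])   # t-contrast: V > A
c_category = np.array([[ 0,  0,  1, -1,  0],   # F-contrast: any difference
                       [ 0,  0,  0,  1, -1]])  # among C1, C2, C3
# The interaction M x C cannot be tested here: it is not modelled in this design.
```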

30 Modelling the interactions

31 Asking ourselves some questions... (design matrix columns: V A V A V A, one per modality × category cell C1 C1 C2 C2 C3 C3)
- Design matrix orthogonal
- All contrasts are estimable
- Interactions M×C modelled
- If no interaction...? The model is too big!
Test C1 > C2: c = [...]. Test V > A: c = [...]. Test the category effect: c = [...]. Test the interaction M×C: c = [...].

32 Asking ourselves some questions... with a more flexible model: each modality × category cell (columns V A V A V A / C1 C1 C2 C2 C3 C3) is now modelled with more than one regressor.
Test C1 > C2? It becomes "test C1 different from C2?": from a one-row contrast c = [...] to a multi-row contrast c = [...] - it becomes an F-test!
Test V > A? A contrast c = [...] is possible, but is OK only if the regressors coding for the delay are all equal.

33 Convolution model. [Figure: design and contrast; SPM(t) or SPM(F); fitted and adjusted data.]

34 Toy example: take home...
- F-tests have to be used when:
  - testing for both > 0 and < 0 effects
  - testing for more than 2 levels
  - conditions are modelled with more than one regressor
- F-tests can be viewed as testing for:
  - the additional variance explained by a larger model w.r.t. a simpler model, or
  - the sum of the squares of one or several combinations of the betas (here the F-test of β₁ − β₂ is the same as that of β₂ − β₁, but two-tailed compared to a t-test).

35 Plan
- Make sure we all know about the estimation (fitting) part...
- Make sure we understand t and F tests
- But what do we test exactly? Correlation between regressors
- A (nearly) real example
- A bad model... And a better one

36 A bad model... [Figure: true signal and observed signal (---); model (green, peak at 6 s); TRUE signal (blue, peak at 3 s); residual (still contains some signal).] Fitting (β₁ = 0.2, mean = 0.11) ⇒ the test for the green regressor is not significant.

37 A bad model... Y = Xβ + ε; β₁ = 0.2, β₂ = 0.11 (mean). Residual variance = 0.3. P(Y | β₁ = 0) ⇒ p-value = 0.1 (t-test); P(Y | β₁ = 0) ⇒ p-value = 0.2 (F-test).

38 A « better » model... The red regressor: the temporal derivative of the green regressor. [Figure: true signal + observed signal; model (green and red) and true signal (blue ---); global fit (blue) and partial fit (green & red); adjusted and fitted signal; residual (a smaller variance).] ⇒ t-test of the green regressor significant; t-test of the red regressor very significant; F-test very significant.

39 A better model... Y = Xβ + ε; β₁ = ..., β₂ = ..., β₃ = 0.11 (mean). Residual variance = 0.2. P(Y | β₁ = 0) ⇒ p-value = 0.07 (t-test); P(Y | β₁ = 0, β₂ = 0) ⇒ p-value = ... (F-test).

40 Flexible models: Gamma basis functions. [Figure.]
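A sketch of a gamma basis set of the kind this slide illustrates: a few gamma probability density functions with increasing delays, usable as flexible HRF regressors. The shape parameters and time grid are illustrative guesses, not SPM's defaults.

```python
import numpy as np
from scipy.stats import gamma

t = np.arange(0, 30, 0.5)           # peristimulus time in seconds
shapes = [2, 4, 8]                  # increasing peak delays (illustrative)
basis = np.column_stack([gamma.pdf(t, a) for a in shapes])
basis /= basis.max(axis=0)          # scale each function to unit peak
```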

41 Summary... (2)
- Test flexible models if there is little a priori information.
- The residuals should be looked at...!
- In general, use F-tests to look for an overall effect, then look at the response shape.
- Interpreting the test on a single parameter (one regressor) can be difficult: cf. the delay-or-magnitude situation.
- BRING ALL PARAMETERS TO THE 2nd LEVEL.

42 Lunch?

43 Plan
- Make sure we all know about the estimation (fitting) part...
- Make sure we understand t and F tests
- Correlation in our model: do we mind?
- A (nearly) real example?
- A bad model... And a better one

44 Correlation between regressors. [Figure: true signal; model (green and red); fit (blue: global fit); residual.]

45 Correlation between regressors. Y = Xβ + ε; β₁ = ..., β₂ = ..., β₃ = 0.06. Residual var. = 0.3. P(Y | β₁ = 0) ⇒ p-value = 0.08 (t-test); P(Y | β₂ = 0) ⇒ p-value = 0.07 (t-test); P(Y | β₁ = 0, β₂ = 0) ⇒ p-value = ... (F-test).

46 Correlation between regressors - 2. [Figure: true signal; model (green and red); fit; residual.] The red regressor has been orthogonalised with respect to the green one: remove from it everything that correlates with the green regressor.
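Orthogonalising one regressor with respect to another is a single projection; a minimal sketch:

```python
import numpy as np

def orthogonalise(red, green):
    """Remove from `red` everything that correlates with `green`:
    red_perp = red - green (green'red / green'green)."""
    return red - green * (green @ red) / (green @ green)
```

Note the asymmetry that slides 45 and 47 demonstrate: the test on the orthogonalised (red) regressor is unchanged (p = 0.07 in both), while the green regressor now absorbs the shared variance, so its parameter and test change.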

47 Correlation between regressors - 2. Y = Xβ + ε; β₁ = 1.47, β₂ = 0.85, β₃ = 0.06. Residual var. = 0.3. P(Y | β₁ = 0) ⇒ p-value = ... (t-test); P(Y | β₂ = 0) ⇒ p-value = 0.07 (t-test); P(Y | β₁ = 0, β₂ = 0) ⇒ p-value = ... (F-test). See « explore design ».

48 Design orthogonality: « explore design ». [Figure: orthogonality matrix, e.g. cells Corr(1,1), Corr(1,2); black = completely correlated, white = completely orthogonal.] Beware: when there are more than 2 regressors (C1, C2, C3, ...), you may think that there is little correlation (light grey) between them, but C1 + C2 + C3 may still be correlated with C4 + C5.

49 Summary
- We implicitly test for an additional effect only, so be careful if there is correlation:
  - orthogonalisation = decorrelation: not generally needed;
  - the parameters and the test on the non-modified regressor change.
- It is always simpler to have orthogonal regressors, and therefore orthogonal designs!
- In case of correlation, use F-tests to see the overall significance. There is generally no way to decide to which regressor the « common » part should be attributed.
- The original regressors may not matter: it's the contrast you are testing that should be as decorrelated as possible from the rest of the design matrix.

50 Conclusion: check your models!
- Check your residuals/model - multivariate toolbox
- Check group homogeneity - Distance toolbox
- Check your HRF form - HRF toolbox

51

52 Implicit or explicit decorrelation (or orthogonalisation). [Figure: geometric view in the space of X; data Y, fit Xb, error e; regressors C1 and C2; the test directions L_C1 and L_C2 give the test of C2 in the implicit model and the test of C1 in the explicit model, with C2 replaced by its orthogonalised version.] This generalises when testing several regressors (F-tests). Cf. Andrade et al., NeuroImage, 1999.

53 Completely correlated regressors... X = [Cond 1, Cond 2, Mean] with Mean = C1 + C2; Y = Xb + e. [Figure: geometric view; Y, Xb, e in the space of X, spanned by C1 and C2.] The parameters are not unique in general! Some contrasts have no meaning: NON ESTIMABLE. c = [1 0 0] is not estimable (no specific information in the first regressor alone); c = [1 -1 0] is estimable.
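Estimability can be checked numerically: a contrast c'β is estimable exactly when c lies in the row space of X, i.e. c'(X'X)⁺(X'X) = c'. A sketch for this slide's rank-deficient design (columns ordered Cond 1, Cond 2, Mean, as above):

```python
import numpy as np

def is_estimable(c, X, tol=1e-8):
    """True iff c lies in the row space of X (so c'b has a unique value)."""
    XtX = X.T @ X
    return np.allclose(c @ np.linalg.pinv(XtX) @ XtX, c, atol=tol)

n = 20
c1 = np.r_[np.ones(n // 2), np.zeros(n // 2)]   # condition 1 indicator
c2 = 1 - c1                                     # condition 2 indicator
X = np.column_stack([c1, c2, np.ones(n)])       # Mean = C1 + C2: rank deficient

print(is_estimable(np.array([1, 0, 0]), X))     # False: beta1 alone
print(is_estimable(np.array([1, -1, 0]), X))    # True: beta1 - beta2
```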

