
1 July 31, 2013 Jason Su

2 Background and Tools
– Cramér-Rao Lower Bound (CRLB)
– Automatic Differentiation (AD)
Applications in Parameter Mapping
– Evaluating methods
– Protocol optimization

3 How precisely can I measure something with this pulse sequence?

4 A lower limit on the variance of an estimator of a parameter.
– The best you can do at estimating, say, T1 with a given pulse sequence and signal equation g(T1)
Estimators that achieve the bound are called "efficient"
– The minimum variance unbiased estimator (MVUE) is efficient
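A sketch of the bound in its standard form, assuming a signal model g_n(θ) with additive Gaussian noise of variance σ²:

\[
\operatorname{Var}(\hat{\theta}_i) \;\ge\; \left[F(\theta)^{-1}\right]_{ii},
\qquad
F_{jk}(\theta) = \frac{1}{\sigma^2} \sum_n \frac{\partial g_n(\theta)}{\partial \theta_j}\,\frac{\partial g_n(\theta)}{\partial \theta_k}
\]

The Fisher information matrix F is built entirely from the partial derivatives of the signal equation with respect to the parameters, which is exactly what automatic differentiation provides.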


8 Numeric differentiation
– Questionable accuracy
– Has limited the application of CRLB
Symbolic or analytic differentiation
– Difficult, tedious, and slow for multiple inputs, multiple outputs
Automatic differentiation
– Solves all these problems
– Calculation time comparable to numeric, but 10^8 times more accurate
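A quick illustration of the accuracy gap (not from the slides; the test point and step sizes are arbitrary). The forward difference is at best accurate to roughly half the floating-point digits, no matter which step size is chosen:

% Forward-difference error vs. the exact derivative of f = 1/(1 + exp(-x1/x2)),
% evaluated at the (arbitrary) point (x1, x2) = (1, 2).
f  = @(x1, x2) 1./(1 + exp(-x1./x2));
x1 = 1; x2 = 2;
exact = exp(-x1/x2) / (x2*(1 + exp(-x1/x2))^2);   % analytic df/dx1
for h = 10.^(-4:-4:-12)
    fd = (f(x1 + h, x2) - f(x1, x2)) / h;
    fprintf('h = %g: relative error = %g\n', h, abs(fd - exact)/abs(exact));
end

An AD evaluation of the same derivative would match exact to machine precision.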

9 The most criminally underused tool in your computational toolbox?

10 Automatic differentiation is NOT: – Analytic differentiation

11 Automatic differentiation is NOT:
– Analytic differentiation
– Symbolic differentiation
syms x1 x2;
f = 1/(1 + exp(-x1/x2));
df_dx1 = diff(f, x1)
>> 1/(x2*exp(x1/x2)*(1/exp(x1/x2) + 1)^2)

12 Automatic differentiation is NOT:
– Analytic differentiation
– Symbolic differentiation
– Numeric differentiation (finite difference)
f = @(x1, x2) 1/(1 + exp(-x1/x2));
h = 1e-10;                               % finite-difference step size
df_dx1 = (f(x1 + h, x2) - f(x1, x2))/h;

13 Automatic differentiation IS:
– Fast, esp. for many input partial derivatives
   Symbolic requires substitution of symbolic objects
   Numeric requires multiple function calls for each partial

14 Automatic differentiation IS:
– Fast, esp. for many input partial derivatives
– Effective for computing higher derivatives
   Symbolic generates huge expressions
   Numeric becomes even more inaccurate

15 Automatic differentiation IS:
– Fast, esp. for many input partial derivatives
– Effective for computing higher derivatives
– Adept at analyzing complex algorithms
   Bloch simulations
   Loops and conditional statements
   1.6 million-line FEM model

16 Automatic differentiation IS:
– Fast, esp. for many input partial derivatives
– Effective for computing higher derivatives
– Adept at analyzing complex algorithms
– Accurate to machine precision

17 Some disadvantages:
– Exact details of the implementation are hidden
– Hard to accelerate

18 Numeric: implements the definition of the derivative
Symbolic: N-line function -> single-line expression
Automatic: N-line function -> M-line function
– A technology to automatically augment programs with statements that compute derivatives


20 f = @(x1, x2) 1/(1 + exp(-x1/x2)); Find the subroutine: df_dx(x1, x2)
Original code              | Added statements for derivatives
Start with the inputs:
x1                         | x1' = 1
x2                         | x2' = 0

21 f = @(x1, x2) 1/(1 + exp(-x1/x2)); Find the subroutine: df_dx(x1, x2)
Original code              | Added statements for derivatives
Start with the inputs:
x1                         | x1' = 1
x2                         | x2' = 0
Define intermediate vars and apply chain rule:
w3 = -x1                   | w3' = -x1' = -1
w4 = 1/x2                  | w4' = -x2'/x2^2 = 0

22 f = @(x1, x2) 1/(1 + exp(-x1/x2)); Find the subroutine: df_dx(x1, x2)
Original code              | Added statements for derivatives
Start with the inputs:
x1                         | x1' = 1
x2                         | x2' = 0
Define intermediate vars and apply chain rule:
w3 = -x1                   | w3' = -x1' = -1
w4 = 1/x2                  | w4' = -x2'/x2^2 = 0
w5 = w3*w4 = -x1/x2        | w5' = w3*w4' + w3'*w4 = -w4 = -1/x2

23 f = @(x1, x2) 1/(1 + exp(-x1/x2)); Find the subroutine: df_dx(x1, x2)
Original code              | Added statements for derivatives
Start with the inputs:
x1                         | x1' = 1
x2                         | x2' = 0
Define intermediate vars and apply chain rule:
w3 = -x1                   | w3' = -x1' = -1
w4 = 1/x2                  | w4' = -x2'/x2^2 = 0
w5 = w3*w4 = -x1/x2        | w5' = w3*w4' + w3'*w4 = -w4 = -1/x2
w6 = 1 + exp(w5)           | w6' = w5'*exp(w5)
w7 = 1/w6                  | w7' = -w6'/w6^2

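Collected into a runnable subroutine, the augmented program that the trace above derives looks like this (a sketch; the wN names follow the slides, the dwN variables hold the added derivative statements):

function [f, df_dx1] = df_dx(x1, x2)
% Forward-mode AD of f = 1/(1 + exp(-x1/x2)),
% seeded to differentiate with respect to x1: x1' = 1, x2' = 0.
dx1 = 1;  dx2 = 0;
w3 = -x1;           dw3 = -dx1;               % = -1
w4 = 1/x2;          dw4 = -dx2/x2^2;          % = 0
w5 = w3*w4;         dw5 = w3*dw4 + dw3*w4;    % = -1/x2
w6 = 1 + exp(w5);   dw6 = dw5*exp(w5);
w7 = 1/w6;          dw7 = -dw6/w6^2;
f = w7;  df_dx1 = dw7;
end

Evaluated at any (x1, x2), df_dx1 agrees with the symbolic result from slide 11 to machine precision.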

26 Applications
– Gradient-based optimization methods
– Uncertainty propagation
– Transparent calculation of the Jacobian of a multiple-input, multiple-output function
Packages
– MATLAB: ADiMat, AD for MATLAB, Adiff
– Python: pyautodiff, uncertainties, algopy, CasADi


28 1. Start with a signal model for your data
2. Collect a series of scans, typically with only 1 or 2 sequence variables changing
3. Fit the model to the data (a sketch follows below)
Motivation
– Reveals quantifiable physical properties of tissue, unlike conventional imaging
– Maps are ideally scanner independent
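A minimal sketch of steps 1-3 for DESPOT1-style T1 mapping, assuming the standard SPGR signal equation; the TR, flip angles, and synthetic data are illustrative, not from the talk:

% Signal model: S(a) = M0*sin(a)*(1 - E1)/(1 - E1*cos(a)), E1 = exp(-TR/T1)
TR    = 5e-3;                          % repetition time [s] (illustrative)
alpha = [2 5 10 15 20]*pi/180;         % acquired flip angles [rad]
spgr  = @(p, a) p(1)*sin(a)*(1 - exp(-TR/p(2))) ./ (1 - exp(-TR/p(2))*cos(a));
truth = [1000, 0.9];                   % [M0, T1 in s] used to simulate data
data  = spgr(truth, alpha);            % stand-in for measured signals
cost  = @(p) sum((spgr(p, alpha) - data).^2);
p_hat = fminsearch(cost, [500, 0.5]);  % nonlinear least-squares fit
fprintf('M0 = %.1f, T1 = %.3f s\n', p_hat(1), p_hat(2));

In practice the data carry noise, and the precision of p_hat is exactly what the CRLB machinery above bounds.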

29 Some examples
– FA/MD mapping with DTI – the most widely known mapping sequence
– T1 mapping – relevant in the study of contrast agent relaxivity and diseases
– B1 mapping – important for high-field applications

30 T1 mapping
– IR SE – gold standard, vary TI
– Look-Locker – use multiple readout pulses to collect many TIs
– DESPOT1 – vary flip angle
T2 mapping
– Dual SE – vary TE
– CPMG – use multiple spin echoes to collect many TEs
– DESPOT2 – vary flip angle
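For concreteness, the standard signal equations behind "vary TI" and "vary flip angle" (textbook forms, assuming TR much greater than TE for the inversion-recovery spin echo; they are not printed on the slide):

\[
S_{\mathrm{IR}}(TI) = M_0\left(1 - 2e^{-TI/T_1} + e^{-TR/T_1}\right),
\qquad
S_{\mathrm{SPGR}}(\alpha) = M_0 \sin\alpha \,\frac{1 - e^{-TR/T_1}}{1 - \cos\alpha\, e^{-TR/T_1}}
\]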

31 T1 mapping methods
– Spin-echo inversion recovery
– Look-Locker
– DESPOT1/VFA
– MPnRAGE family


33 Protocol optimization
– What acquisition protocol maximizes our T1 precision? (A sketch follows below.)
Christensen 1974, Homer 1984, Wang 1987, Deoni 2003
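A minimal sketch of the idea for DESPOT1, assuming a two-angle SPGR protocol, unit-variance Gaussian noise, and a nominal (M0, T1); central differences stand in here for the AD derivatives the talk advocates. It grid-searches flip-angle pairs for the smallest CRLB on T1:

TR = 5e-3; p = [1000; 0.9];            % nominal [M0; T1 in s] (illustrative)
spgr = @(p, a) p(1)*sin(a)*(1 - exp(-TR/p(2))) ./ (1 - exp(-TR/p(2))*cos(a));
angles = (1:30)*pi/180;                % candidate flip angles, 1..30 deg
best = inf; a_opt = [0 0];
for i = 1:numel(angles)
    for j = i+1:numel(angles)
        a = [angles(i) angles(j)];
        J = zeros(2, 2);               % Jacobian: d(signal)/d[M0, T1]
        for k = 1:2
            h = 1e-6*p(k); dp = zeros(2, 1); dp(k) = h;
            J(:, k) = (spgr(p + dp, a) - spgr(p - dp, a)).'/(2*h);
        end
        C = inv(J.'*J);                % inverse Fisher information (sigma = 1)
        if C(2, 2) < best, best = C(2, 2); a_opt = a; end
    end
end
fprintf('Optimal angles: %.0f and %.0f deg, CRLB(T1) = %.2e\n', ...
        a_opt*180/pi, best);

A search of this kind is consistent with Deoni 2003's result that the optimal pair of flip angles straddles the Ernst angle.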


39 More protocol optimization
– DESPOT2-FM: free parameters incl. SPGR or bSSFP, αs, phase-cycle
– mcDESPOT: precision of MWF has recently been called into question (Lankford 2012)
Exploration of other pulse sequences
Comparison of competing methods

40 Cramér-Rao Lower Bound
Automatic Differentiation
Protocol optimization of DESPOT1

41 Slides available at http://mr.jason.su/
Python source code available soon

