
Slide 1: Estimation and optimization in PK modeling
Est+Opt, CIRM, 18/8/2009, A. Iliadis

- Introduction to modeling
- Estimation criteria
- Numerical optimization, examples

Slide 2: Real process and mathematical model

[Diagram: the real process and the mathematical model, linked through the fitted model.]

Slide 3: Functional scheme

[Block diagram: the administration protocol drives both the PK process and the PK model. Measurement noise is added to the process output to give the observation; the model produces the prediction. An equivalence criterion compares observation and prediction, and nonlinear programming, using a priori information, adjusts the model.]

Slide 4: Mathematical modeling

Models are defined by:
- their structure (number and connectivity of compartments, etc.), expressed by mathematical operations involving adjustable parameters. Ex: 1-cpt model with an exponential structure;
- the numerical values of the parameters used.

Modeling (system identification) = characterization (of the structure) + estimation (of the parameters).
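As a minimal sketch of the 1-cpt exponential structure (assuming an IV bolus; the names `dose`, `V`, and `k` are illustrative, not taken from the slides):

```python
import numpy as np

# Minimal sketch of a 1-compartment IV-bolus model with exponential
# structure. The adjustable parameters are V (volume of distribution)
# and k (elimination rate constant); names are hypothetical.
def one_cpt(t, dose, V, k):
    """Concentration C(t) = (dose / V) * exp(-k * t)."""
    return (dose / V) * np.exp(-k * t)

t = np.array([0.0, 1.0, 2.0])               # sampling times
c = one_cpt(t, dose=100.0, V=10.0, k=0.1)   # predicted concentrations
```

The structure (a single exponential) and the parameter values are the two ingredients the slide distinguishes: the function above is the characterization, the numbers passed to it are the estimation targets.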

Slide 5: Checking identifiability

Structural identifiability: given a hypothetical structure with unknown parameters and a set of proposed experiments (not measurements!), would the unknown parameters be uniquely determinable from these experiments?

Parametric identifiability: estimating the parameters from measurements with errors, and optimizing the sampling designs. A structurally non-identifiable model yields a non-consistent estimate.

Slide 6: Structural identifiability

Identifiability depends on the observation site! When parameters are not individually identifiable, two solutions exist:
- Grouping: estimate combinations of parameters, but ONLY identifiable ones.
- Setting: fix some parameters to assumed values.

Slide 7: Functional scheme (dynamic)

- Model linear in the parameters: no loop, one-stage estimation.
- Model nonlinear in the parameters: many loops until convergence, starting from arbitrary initial values.

[Block diagram: the same functional scheme as before, with arbitrary initial values feeding the nonlinear programming loop.]

Slide 8: Iterations, parameter convergence

Ex: fotemustine neutrophil toxicity, nonlinear modeling. The iterations drive the parameters from arbitrary initial values to optimized final values.
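The loop from arbitrary initial values to optimized final values can be sketched with a generic nonlinear least-squares fit. The model, "true" values, and noise level below are made up for the demonstration; this is not the fotemustine example:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative 1-cpt model with a fixed dose of 100; V and k are the
# parameters to be estimated (hypothetical values).
def model(t, V, k):
    return (100.0 / V) * np.exp(-k * t)

# Simulate noisy observations from "true" values V=10, k=0.3.
rng = np.random.default_rng(0)
t = np.linspace(0.5, 12.0, 10)
y = model(t, 10.0, 0.3) * (1.0 + 0.02 * rng.standard_normal(t.size))

p0 = [5.0, 1.0]                              # arbitrary initial values
popt, pcov = curve_fit(model, t, y, p0=p0)   # optimized final values
```

The optimizer iterates internally until convergence, exactly the many-loops case of the nonlinear functional scheme.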

Slide 9: Errors in the functional scheme

The existing errors are experimental, structural, and parametric:
- Residual error: experimental and structural (model misspecification).
- Initial parametric error: canceled at convergence.

Slide 10: Parametric and output spaces

[Diagram: the PK process (real process) and the PK model (artificial mechanism) map the parametric space into the output space, where the prediction is compared with the observation. The random component appears as measurement error in the output space and as the precision of the estimates in the parametric space.]

Slide 11: Optimal estimation

Estimation is the operation of assigning numerical values to unknown parameters, based on noise-corrupted observations.

Organization of the variables:
- y: the observed drug concentrations over time (an m-dimensional vector);
- p: the random parameters to be estimated (an n-dimensional vector).

Consider the joint pdf f(p, y); then:
- the marginal f(p) is called the prior pdf [the marginal f(y) is not of interest];
- the conditional f(p | y) is called the posterior pdf; its maximization gives the MAP estimator;
- the conditional f(y | p) leads to the likelihood function; its maximization gives the MLE.

Slide 12: Maximum a posteriori (MAP)

A reasonable estimate of p is the mode of the posterior density for the given observation y:

  p_MAP = arg max_p f(p | y)

Ex: the Gaussian case, which illustrates the role of the prior dispersion.
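For the Gaussian example, the role of the dispersion can be sketched in closed form (scalar case; the notation m, tau2, sigma2 is illustrative):

```python
# Sketch of the scalar Gaussian case: prior p ~ N(m, tau2) and a single
# observation y ~ N(p, sigma2). The posterior is Gaussian, so its mode
# (the MAP estimate) is the precision-weighted mean below.
def map_estimate(y, m, tau2, sigma2):
    w = tau2 / (tau2 + sigma2)   # weight given to the data
    return w * y + (1.0 - w) * m

# Role of the dispersion: a diffuse prior (large tau2) pulls the MAP
# estimate toward the observation; a tight prior pulls it toward m.
balanced = map_estimate(2.0, 0.0, 1.0, 1.0)   # equal dispersions
diffuse = map_estimate(2.0, 0.0, 1e6, 1.0)    # nearly uninformative prior
```

In the diffuse-prior limit the MAP estimate coincides with the MLE, which is the next slide's topic.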

Slide 13: Maximum likelihood (MLE)

After the observation y has been obtained, a reasonable estimate of p is the value which gives the particular observation the highest probability of occurrence:

  p_MLE = arg max_p f(y | p)

Ex: the Gaussian case, which illustrates the role of the measurement precision.

Slide 14: The influence of x on the conditional

[Figure only.]

Slide 15: The influence of y on the likelihood

[Figure only.]

Slide 16: MLE criterion for single output

Initial form: maximize the likelihood f(y | p) of the observed data, under the following hypotheses:
- H0: the model is an exact description of the real process (error = observation - model output).
- H1: additive error: y_i = f(t_i, p) + e_i.
- H2: normal error: e_i ~ N(0, var_i).
- H3: independence: the e_i are mutually independent.

Slide 17: Variance heterogeneity

The regression model assumes a constant error variance (homoscedasticity). This assumption needs to be relaxed, particularly when the model is highly nonlinear.

Transformed models: find a transformation function h under which the error assumptions hold, i.e., h(y_i) = h(f(t_i, p)) + e_i with e_i ~ N(0, sigma^2).
- Box-Cox transformations: h(y) = (y^lambda - 1) / lambda for lambda != 0, h(y) = log(y) for lambda = 0. The parameter lambda must be estimated!
- Other transformations: John-Draper, Carroll, Huber, etc.
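The Box-Cox family above can be sketched as:

```python
import numpy as np

# Box-Cox transformation: h(y) = (y**lam - 1) / lam for lam != 0,
# and h(y) = log(y) in the limit lam -> 0. Requires positive y.
def box_cox(y, lam):
    y = np.asarray(y, dtype=float)
    if lam == 0.0:
        return np.log(y)
    return (y ** lam - 1.0) / lam

h = box_cox([1.0, 2.0, 4.0], 0.5)   # square-root-like transform
```

lam = 1 leaves the data essentially untransformed (up to a shift), while lam = 0 gives the log transform commonly used for multiplicative PK error.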

Slide 18: General form of the MLE criterion

For the available observed data, and under the H3 hypothesis, the estimator minimizes the criterion function

  J(p) = sum_i [ log var_i(p) + (y_i - f(t_i, p))^2 / var_i(p) ]

- The 1st term is known as the extended SE term.
- The 2nd term is called the weighted SE term. It is the only one involving the observed data, and it is weighted by the uncertainty of the experiment.
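The extended least-squares criterion can be sketched directly (the argument names are hypothetical; y are observations, f model predictions, var the error variances):

```python
import numpy as np

# Extended least-squares criterion under independent normal errors:
# J = sum_i [ log var_i + (y_i - f_i)**2 / var_i ].
# The log term is the extended-SE part; the ratio is the weighted-SE
# part, the only one involving the observed data.
def els_criterion(y, f, var):
    y, f, var = (np.asarray(a, dtype=float) for a in (y, f, var))
    return float(np.sum(np.log(var) + (y - f) ** 2 / var))

J_perfect = els_criterion(y=[1.0, 2.0], f=[1.0, 2.0], var=[1.0, 1.0])
```

With unit variances and a perfect fit both terms vanish, which is a convenient sanity check when implementing the criterion.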

Slide 19: Criterion and error variance model

After introducing the error variance model var_i(p) = sigma^2 * g_i(p), the criterion is minimum along the sigma^2 direction when sigma^2 equals the mean weighted squared residual; substituting this value back concentrates the criterion on p alone.

Nonlinear unconstrained optimization. Find: p* = arg min_p J(p). Assumptions: J(p) is computable for all p, and an analytic solution does not exist.

Slide 20: Iterative solutions

Solution of the nonlinear optimization problem: sequentially approximate the minimizer, starting from an initial value and converging towards a stationary point, by designing a routine algorithm that generates the converging sequence of iterates.

Terminology: assigning the initial value is the initialization; obtaining the next iterate from the current one is an iteration.
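The generic converging sequence can be sketched as follows (the step rule, tolerance, and iteration cap are illustrative choices):

```python
import numpy as np

# Generic iterative scheme: initialize at p0, then repeat the iteration
# p_{k+1} = step(p_k) until the update is below tol (a stationary point
# within tolerance) or max_iter is reached.
def iterate(step, p0, tol=1e-8, max_iter=1000):
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        p_next = step(p)
        if np.linalg.norm(p_next - p) < tol:
            return p_next
        p = p_next
    return p

# Ex: a fixed-step gradient iteration on the toy criterion J(p) = p^2,
# whose gradient is 2p; the sequence converges to the minimizer 0.
p_star = iterate(lambda p: p - 0.1 * 2.0 * p, [1.0])
```

Any of the algorithms on the later slides fits this template; they differ only in how `step` is built from derivatives of the criterion.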

Slide 21: Approximation of functions

Taylor's expansion for a smooth multivariate function constructs simple approximations of a general function in a neighborhood of the reference point. With d a vector of unit length supplying the direction of search, and alpha a scalar specifying the step length:

  J(p + alpha d) ≈ J(p) + alpha g(p)' d + (alpha^2 / 2) d' H(p) d

(g: gradient, H: Hessian). Successive approximations are associated to iterations.

Slide 22: Direction of search

Linear approximation of J: the scalar g(p)' d gives the rate of change of J at the point p along the direction d. To reduce J, move along the direction opposite to the gradient: d = -g(p), the descent direction.

Quadratic approximation of J: the term d' H(p) d involves the second derivative of J and characterizes an ellipse. To reduce J, move along the direction targeting the center of the ellipse: d = -H(p)^(-1) g(p), the Newton direction.
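On a quadratic criterion the two directions can be compared directly (the matrix and starting point below are illustrative):

```python
import numpy as np

# Quadratic criterion J(p) = 0.5 p'Ap - b'p, with gradient g(p) = Ap - b
# and constant Hessian A (chosen positive definite for the sketch).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda p: A @ p - b

p = np.array([2.0, 2.0])
d_descent = -grad(p)                      # steepest-descent direction
d_newton = -np.linalg.solve(A, grad(p))   # Newton direction -H^{-1} g
p_new = p + d_newton                      # one full Newton step
# For a quadratic, one Newton step lands on the center of the ellipse,
# i.e. the stationary point where g(p) = 0.
```

The descent direction only guarantees local decrease; the Newton direction uses curvature to aim at the center directly, which is why it dominates near the minimum.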

Slide 23: Line search

Minimization directions: move along the Newton direction for quadratic surfaces, near the minimum; elsewhere, move along the descent direction.

Line search: to complete the iteration, search for the step length alpha_k along the direction of search d_k:

  p_(k+1) = p_(k) + alpha_k d_k
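One standard way to choose the step length along the search direction is a backtracking (Armijo) line search; a sketch, with conventional default constants:

```python
import numpy as np

# Backtracking line search: shrink the trial step alpha until the
# sufficient-decrease (Armijo) condition holds along direction d.
def backtracking(J, grad, p, d, alpha=1.0, beta=0.5, c=1e-4):
    g_dot_d = float(grad(p) @ d)   # directional derivative (< 0 for descent)
    while J(p + alpha * d) > J(p) + c * alpha * g_dot_d:
        alpha *= beta              # halve the step and try again
    return alpha

J = lambda p: float(p @ p)         # toy criterion J(p) = ||p||^2
grad = lambda p: 2.0 * p
p = np.array([1.0, 0.0])
d = -grad(p)                       # descent direction
alpha = backtracking(J, grad, p, d)
```

This completes the iteration p_(k+1) = p_(k) + alpha_k d_k with a step guaranteed to decrease the criterion.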

Slide 24: Families of algorithms

Practical note: approximate derivatives by finite differences instead of analytical calculation.

Classification:
- Twice-continuously differentiable criteria:
  - Second-derivative methods: quadratic model of J; compute and invert the Hessian (not numerically stable, time-consuming processing).
  - First-derivative methods: quadratic model of J; approximate the inverse Hessian directly from finite differences of the gradient, without inverting. Quasi-Newton methods (appropriate in many circumstances): BFGS, DFP, ...
- Non-derivative methods: linear model of J; approximate the gradient by finite differences of J (for smooth functions): Powell, Brent, ...
- No assumptions on differentiability: heuristic algorithms: NMS, Hooke-Jeeves, ...
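With SciPy, a quasi-Newton method (BFGS) and a heuristic simplex method (Nelder-Mead, the NMS above) can be run side by side on a standard test function (the Rosenbrock function, not from the slides):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock test function, a classic optimization benchmark with a
# curved valley; its minimum is at (1, 1).
rosen = lambda p: (1.0 - p[0])**2 + 100.0 * (p[1] - p[0]**2)**2
x0 = np.array([-1.2, 1.0])   # conventional starting point

res_bfgs = minimize(rosen, x0, method="BFGS")          # quasi-Newton
res_nms = minimize(rosen, x0, method="Nelder-Mead")    # heuristic simplex
```

BFGS builds its own gradient estimates by finite differences when none are supplied, illustrating the "practical" note above; Nelder-Mead never touches derivatives at all.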

Slide 25: The information matrix

The Fisher information matrix: for MLE estimation, the Cramér-Rao inequality states that the inverse of the information matrix is a lower bound of the covariance matrix, evaluating the precision of the estimates.

In practice: with t the vector of the sampling times, obtain the sensitivity matrix S with elements df(t_i, p)/dp_j, and set the diagonal weighting matrices built from the error variance model.
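For independent normal errors, the information matrix reduces to a weighted product of sensitivities; a sketch with a made-up sensitivity matrix:

```python
import numpy as np

# Fisher information for independent normal errors: M = S' W S, with S
# the sensitivity matrix (df(t_i, p)/dp_j evaluated at the sampling
# times) and W = diag(1 / var_i). By the Cramer-Rao inequality, M^{-1}
# is a lower bound for the covariance of the estimates.
def fisher_information(S, var):
    W = np.diag(1.0 / np.asarray(var, dtype=float))
    return S.T @ W @ S

S = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                    # illustrative sensitivities
M = fisher_information(S, [1.0, 1.0, 1.0])    # unit error variances
cov_bound = np.linalg.inv(M)                  # lower-bound covariance
```

Because M depends on S, and S depends on the sampling times, the same computation drives optimal sampling design: choose the times that shrink cov_bound.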

Slide 26: Covariance (precision) matrix

The precision matrix (of order n) depends on the sampling protocol: it is a sum of weighted products of sensitivity functions over the available sampling times.

Graphic interpretation: the precision matrix is symmetric and, by construction, positive definite when the sensitivity matrix has full rank. It then defines an n-dimensional ellipsoid, and the volume of the ellipsoid measures the overall precision: the lower the volume, the more precise the estimates.

Slide 27: Checking the structural identifiability

The sensitivity matrix depends:
- on the experiment (not on the measurements!), and
- on the model parametrization (structural and parametric).

Ensure positive definiteness of the information matrix: the sensitivity matrix must be of full rank for any numerical value of the parameters, e.g., for several arbitrary initial values.

Ex: with observation in the central compartment, the central-compartment parameters are identifiable, while the peripheral-compartment parameters may not be.

Slide 28: Simulation in optimization

Setting: 2-cpt model, IV bolus administration, observation over a fixed horizon with a given number of sampling times, heteroscedastic error model.

Slide 29: Performances of algorithms

[Comparison figure only.]
