
1
Cox Model with Intermittent and Error-Prone Covariate Observation
Yury Gubman
PhD thesis in Statistics
Supervisors: Prof. David Zucker, Prof. Orly Manor

2
Introduction
Regression analysis of right-censored survival data arises commonly in many fields, especially in medical science. The most popular regression model for survival data is the Cox (1972) proportional hazards model, in which the hazard function for an individual with covariate vector X is modeled as

λ(t | X) = λ0(t) exp(βᵀX),

where λ0(t) is an unspecified baseline hazard and β is the vector of regression coefficients.
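As a concrete illustration (not from the thesis), the proportional hazards form can be evaluated directly; the constant baseline hazard of 0.1 and the covariate value 72 below are illustrative assumptions:

```python
import math

def cox_hazard(t, x, beta, baseline):
    """Proportional-hazards rate: lambda(t | X) = lambda0(t) * exp(beta . X)."""
    return baseline(t) * math.exp(sum(b * xi for b, xi in zip(beta, x)))

# Hypothetical constant baseline hazard of 0.1 events per unit time,
# a single covariate at level 72, and coefficient -0.04.
lam = cox_hazard(1.0, [72.0], [-0.04], lambda t: 0.1)
```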

3
Introduction
In many cases the covariate X is not measured exactly. Instead of X, we observe a surrogate measure W, which is subject to error. Measurement error in the covariates has three main effects: (a) it causes bias in parameter estimates of statistical models; (b) it leads to a loss of power, sometimes profound, for detecting covariate effects; (c) it masks the features of the data, making graphical model analysis difficult. In addition, although in theory the covariate is a continuous process in time, in practice measurements are taken only over a discrete grid of timepoints (every 6 months, once a year, etc.). This discrepancy may also lead to bias.
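Effect (a), the attenuation bias, can be seen in a small simulation (illustrative, not from the thesis): regressing an outcome on an error-prone surrogate W = X + ε shrinks the OLS slope by the factor var(X) / (var(X) + var(ε)):

```python
import random

random.seed(0)
n = 20000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [2.0 * xi + random.gauss(0.0, 0.5) for xi in x]   # true slope is 2
w = [xi + random.gauss(0.0, 1.0) for xi in x]         # surrogate, error variance 1

def ols_slope(u, v):
    """Closed-form OLS slope of v on u."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    sxy = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
    sxx = sum((ui - mu) ** 2 for ui in u)
    return sxy / sxx

# Attenuation factor here is var(X)/(var(X)+var(eps)) = 1/2,
# so the error-prone slope estimate is near 1 instead of 2.
slope_true = ols_slope(x, y)
slope_err = ols_slope(w, y)
```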

4
Introduction
The classical measurement error model for individual i and measurement j can be written as

W_ij = X_i(τ_j) + ε_ij,

where the measurement error term ε_ij is independent of X_i and has zero mean given X_i. Independence across i is also assumed. Let n be the number of individuals and J the number of observations per individual (we assume J is the same for all individuals), and suppose the measurements of the surrogate covariate are taken at timepoints τ_1 < τ_2 < … < τ_J.
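A minimal sketch of generating surrogate measurements under this model; the half-year grid and the error SD of 1 are illustrative assumptions for the example:

```python
import random

random.seed(1)
J = 12                                # measurements per individual
tau = [0.5 * j for j in range(J)]     # assumed half-year measurement grid
sigma_eps = 1.0                       # illustrative measurement error SD

def surrogate_path(x_path):
    """W_ij = X_i(tau_j) + eps_ij with independent mean-zero normal error."""
    return [x_path(t) + random.gauss(0.0, sigma_eps) for t in tau]

# Hypothetical linearly declining true trajectory.
w = surrogate_path(lambda t: 72.0 - 5.0 * t)
```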

5
Introduction
Tsiatis and Davidian (2004) suggested a so-called joint model, in which the models for the event-time distribution and the longitudinal data depend on a common set of latent random variables. Andersen and Liestol (2003) proposed a simple approach based on the regression calibration idea: the bias due to observing the longitudinal covariate process over a discrete grid is handled by introducing additional variables into the standard Cox model, while measurement error is treated using an external procedure. Most of the proposed approaches are limited to special cases and/or specific distributional assumptions.

6
Proposed Method
We propose an approach of intermediate complexity relative to the simple approach of Andersen and Liestol and the joint modeling approach. We assume an additive model for the measurement error, with error terms ε_ij independent across i and j and independent of all other random variables in the model. We do not assume a specific parametric distribution for ε_ij. We assume a working parametric model F for the conditional distribution of the covariate at time t_2 given its history up to t_1, for t_2 > t_1. Note that we do not assume the data are actually distributed according to F, but rather that F is a close enough approximation to yield reasonable estimators of the Cox coefficients.

7
Proposed Method
Assume that the moment generating function (MGF) of F exists and is well defined. To cover a variety of cases, flexible distributions may be used, such as the semi-nonparametric (SNP) distribution of Gallant and Nychka (1987).
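For the normal case the MGF is available in closed form, M(s) = exp(μs + σ²s²/2); a sketch (the parameter values in the usage line are illustrative):

```python
import math

def normal_mgf(s, mu, sigma2):
    """MGF of N(mu, sigma2): M(s) = exp(mu*s + sigma2*s^2/2)."""
    return math.exp(mu * s + 0.5 * sigma2 * s * s)

# E[exp(beta * X)] for X ~ N(70, 4) at beta = -0.04 (illustrative values).
m = normal_mgf(-0.04, 70.0, 4.0)
```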

8
Proposed Method
To start, assume that W_ij = X_i(τ_j) (no measurement error). Let T_i denote the event time and δ_i the event indicator for individual i. It follows that the hazard function may be represented as

λ_i(t) = λ0(t) E[exp(βᵀX_i(t)) | W̄_i(t), Y_i(t) = 1],

where Y_i(t) is the indicator of being at risk at time t and W̄_i(t) denotes the observed covariate history up to t.

9
Proposed Method
The above conditional expectation may be approximated by the moment generating function (MGF) of F. It follows that the approximated hazard is given by

λ_i(t) ≈ λ0(t) M_F(β; θ(t), W_i(s_i(t))),

where θ(t) is the parameter vector of F estimated at t, and s_i(t) is the nearest timepoint before t at which W_i is available. We need to evaluate θ(t) at every timepoint t; however, we observe W_i only at the grid timepoints.
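A sketch of assembling the approximated hazard from a working MGF; the normal working model, the drifting-mean parameter path θ(t), and the constant baseline hazard are all illustrative assumptions, not the thesis's specification:

```python
import math

def approx_hazard(t, beta, baseline, mgf, theta_at):
    """Approximated hazard: lambda_i(t) ~= lambda0(t) * M_F(beta; theta(t))."""
    return baseline(t) * mgf(beta, *theta_at(t))

# Assumed working model F = normal with a linearly drifting mean and fixed variance.
normal_mgf = lambda s, mu, s2: math.exp(mu * s + 0.5 * s2 * s * s)
theta = lambda t: (72.0 - 5.0 * t, 4.0)   # hypothetical (mean, variance) path

h = approx_hazard(2.0, -0.04, lambda t: 0.1, normal_mgf, theta)
```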

10
Proposed Method
The discrete-grid problem is treated by introducing a working model for each central moment m_k, relating g_k(m_k) to time. Here g_k is some function (chosen for numerical reasons), and Slope_hist, the slope of the historical data (before t_1), enters the working model as a covariate. The coefficients θ_k are estimated using all available data at the observed timepoints τ_q, τ_p (τ_q < τ_p, conditioning on being at risk at τ_q), applying the OLS technique.
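A minimal sketch of the OLS step, assuming an identity link g_1 and a working model that is linear in time (the simulated moment series and its noise level are illustrative):

```python
import random

random.seed(3)
# Hypothetical first-moment estimates of the at-risk covariate over a
# half-year grid: a true linear decline plus estimation noise.
times = [0.5 * j for j in range(12)]
m1 = [72.0 - 5.0 * t + random.gauss(0.0, 0.5) for t in times]

def ols_fit(x, y):
    """Least-squares fit of y = intercept + slope * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

intercept, slope = ols_fit(times, m1)
```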

11
Proposed Method
Using the estimated coefficients of the working model from the previous slide, we solve for the central moments at any time t. Given these estimated moments, the parameters of F can be backsolved, so θ(t) may be calculated for every t. The MGF is then defined at every point, and the hazard can be calculated for every t > s. The estimator of β is obtained using the Cox partial likelihood, incorporating the proposed hazard function.

12
Proposed Method
Note that the Cox partial likelihood in this case incorporates the MGF-based relative risk in place of exp(βᵀX). The variance of the resulting estimator is estimated using a weighted bootstrap approach.
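The weighted bootstrap idea can be sketched on a toy estimator: draw i.i.d. positive weights (here Exp(1)) per subject, re-estimate with those weights, and take the standard deviation across replicates. A weighted mean stands in for the weighted Cox partial-likelihood maximizer (the data, weight law, and replicate count are illustrative):

```python
import random
import statistics

random.seed(4)
data = [random.gauss(10.0, 2.0) for _ in range(200)]

def weighted_estimate(data, weights):
    """Re-estimate with per-subject weights; a weighted mean stands in here
    for re-maximizing a weighted Cox partial likelihood."""
    return sum(w * x for w, x in zip(weights, data)) / sum(weights)

B = 500
reps = []
for _ in range(B):
    weights = [random.expovariate(1.0) for _ in data]   # i.i.d. Exp(1) weights
    reps.append(weighted_estimate(data, weights))

se_hat = statistics.stdev(reps)   # bootstrap standard error of the estimator
```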

13
Simulation Study
The data simulation is based on the setting of Andersen and Liestol (2003), patterned after a clinical trial studying the effect of prednisone treatment versus placebo on survival with liver cirrhosis (Christensen, 1986). The true covariate data are simulated from the model

X_i(t) = trend(t) + A_i + U_i(t),

where trend(t) denotes a common trend, A_i represents initial variation between individuals, and U_i(t) is a stochastic process representing changes in the covariate over time. We assume that A_i is normally distributed and that the measurement error is normal with zero mean and constant variance.
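A sketch of simulating one subject's trajectory under this model, with Brownian motion for U_i(t); the between-subject SD, error SD, and diffusion scale are illustrative assumptions:

```python
import math
import random

random.seed(5)

def brownian_path(times, sigma=1.0):
    """Brownian motion evaluated on an increasing time grid."""
    path, level, prev = [], 0.0, 0.0
    for t in times:
        level += random.gauss(0.0, sigma * math.sqrt(t - prev))
        path.append(level)
        prev = t
    return path

def simulate_subject(times, sd_a=4.0, sd_eps=2.0):
    """X_i(t) = trend(t) + A_i + U_i(t); W adds N(0, sd_eps^2) error."""
    a = random.gauss(0.0, sd_a)                  # initial between-subject variation
    u = brownian_path(times)                     # within-subject stochastic process
    x = [72.0 - 5.0 * t + a + ui for t, ui in zip(times, u)]
    w = [xi + random.gauss(0.0, sd_eps) for xi in x]
    return x, w

times = [0.5 * j for j in range(12)]
x, w = simulate_subject(times)
```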

14
Simulation Study
Following Andersen and Liestol (2003), we take U_i(t) to be either a Brownian motion (BM) process or an Ornstein-Uhlenbeck (OU) process with correlation parameter 0.282. The trend term was taken to be a linear function with an initial level of 72 and a decrease of 5 units per year. In the paper, the Cox regression parameter is β = -0.04. The sample size is 300, and 12 observations are available per individual, one every half year (total trial length 6 years). Failure times were simulated from a Weibull hazard. Each result is based on 500 simulation runs.
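The OU process can be simulated by exact discretization; here the correlation parameter 0.282 is read as the mean-reversion rate, and the diffusion scale is an illustrative assumption:

```python
import math
import random

random.seed(6)

def ou_path(times, theta=0.282, sigma=1.0, u0=0.0):
    """Ornstein-Uhlenbeck path via exact discretization:
    U(t+d) = exp(-theta*d)*U(t) + N(0, sigma^2*(1 - exp(-2*theta*d))/(2*theta))."""
    path, u, prev = [], u0, times[0]
    for t in times:
        d = t - prev
        if d > 0:
            mean = math.exp(-theta * d) * u
            var = sigma * sigma * (1.0 - math.exp(-2.0 * theta * d)) / (2.0 * theta)
            u = random.gauss(mean, math.sqrt(var))
        path.append(u)
        prev = t
    return path

u = ou_path([0.5 * j for j in range(12)])
```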

15
Results
Estimation of β, W(t) measured without error.

16
Results
Estimation of β, W(t) measured with error.

17
Results
Variance estimation by weighted bootstrap; proposed method with SNP approximation.

18
Conclusions
We propose a new semiparametric estimation approach for the Cox regression model when covariates are measured intermittently and with error. The intermittent-measurement issue is handled by modeling the parameters of the distribution of the covariate among individuals still at risk as a function of time; the relative risk is then computed using the MGF. The accuracy of the proposed estimators depends critically on the form of the OLS working model for in-between times. Increasing the accuracy of the estimates by assuming a more flexible interpolation model is a topic for future research.

19
Conclusions
In a simulation study we found that in most cases the proposed method provides reasonable estimates of the Cox regression parameter and its standard deviation. Because the SNP model covers a range of distributional shapes, the method can be applied in a range of settings. The computational burden is moderate: less than one minute per run for the SNP-based procedure.
