
1 INCREASING THE TRANSPARENCY OF CEA MODELING ASSUMPTIONS: A SENSITIVITY ANALYSIS BASED ON STRENGTH OF EVIDENCE
RS Braithwaite, MS Roberts, AC Justice

2 Introduction
Tragicomic anecdote

3 Introduction
Policy makers and clinicians are reluctant to use CEA because its assumptions are difficult to understand (Using Cost-Effectiveness Analysis to Improve Health Care: Opportunities and Barriers, Neumann PJ 2005; CMS presentation, 26th National Meeting of SMDM, 2004).
CEA modelers may base parameter estimates on studies that have limited evidence.
Modelers may not consider all studies with comparable evidence and applicability.

4 Objective
To develop a method to clarify the tradeoff between strength of evidence and precision of CEA results.

5 Methods
Proof of concept based on hypothetical data and a simplified model of HIV natural history.
Question: What is the cost-effectiveness of Directly Observed Therapy (DOT) for HIV patients?

6 Methods
Basic idea: when data sources have insufficient strength of evidence, we should no longer use them to estimate model parameters.
Instead, we should assume that little is known and specify those parameters using wide probability distributions with the fewest embedded assumptions (uniform distribution).
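A minimal Python sketch of this idea, under stated assumptions: the function name, the normal evidence-based prior, and the plausible range are illustrative choices, not the presentation's actual implementation. If a parameter's supporting evidence meets the chosen criteria, it is sampled around the study estimate; otherwise it falls back to a wide uniform.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_parameter(estimate, std_err, evidence_ok, plausible_range, n=10_000):
    """Illustrative sketch (hypothetical names): use the evidence-based estimate
    only when the evidence meets criteria; otherwise assume little is known and
    draw from a wide uniform over a range acceptable to all CEA users."""
    if evidence_ok:
        return rng.normal(estimate, std_err, size=n)   # evidence-based prior
    low, high = plausible_range
    return rng.uniform(low, high, size=n)              # "little is known" fallback

# e.g., a weakly evidenced DOT effect: ignore the point estimate, use a wide uniform
samples = sample_parameter(0.30, 0.05, evidence_ok=False, plausible_range=(0.0, 0.5))
```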

7 Methods
Assess strength of evidence based on USPSTF guidelines, which specify three valuation domains:
Study design: extent to which the design differs from a controlled experiment. Level 1 = best (RCT); Level 3 = worst (expert opinion, anecdotal evidence).
Internal validity: extent to which results represent the truth in the study population. Good = best (little loss to follow-up, objective assessment); Poor = worst (large or diverging loss to follow-up, subjective assessment).
External validity: extent to which results represent the truth in the target population. High = best (similar patient characteristics, care settings); Low = worst (dissimilar patient characteristics, care settings).

8 Methods
Vary evidence criteria in the 3 domains from most to least inclusive, individually and in aggregate.
If evidence meets or exceeds the criteria, use it to estimate the parameter input distribution.
If evidence does not meet the criteria, do not use it; use a uniform distribution over a plausible range sufficiently wide to be acceptable to all CEA users.
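An illustrative sketch of how data sources might be screened against the three domains. The numeric encoding (1 = best, 3 = worst), the Source/eligible names, and the example gradings are assumptions for illustration, not the authors' actual grading of their 17 sources.

```python
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    design: int     # 1 = RCT (best) .. 3 = expert opinion/anecdote (worst)
    internal: int   # 1 = Good .. 3 = Poor
    external: int   # 1 = High .. 3 = Low

def eligible(src, max_design=3, max_internal=3, max_external=3):
    """A source is usable only if it meets or exceeds every active criterion;
    defaults of 3 correspond to the most inclusive (no criteria) case."""
    return (src.design <= max_design and
            src.internal <= max_internal and
            src.external <= max_external)

sources = [Source("Trial A", 1, 1, 2), Source("Cohort B", 2, 1, 1), Source("Opinion C", 3, 3, 3)]

# Most inclusive: everything passes; strictest: all three criteria applied at once.
all_evidence = [s for s in sources if eligible(s)]
strict = [s for s in sources if eligible(s, max_design=1, max_internal=1, max_external=1)]
```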

9 Methods
For natural history parameters that can only be observed rather than determined experimentally (e.g., overall mortality rate due to age-, sex-, and race-related causes), observational studies are eligible for Level 1 design.
When more than one source of evidence met the criteria, we used the source with the greatest statistical precision. Alternative: pool the sources, weighting by the inverse of the variance.
When substituting a uniform distribution, make sure that the direction of the aggregate effect is neutral; this maximizes the conservatism of the approach.
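A brief sketch of the inverse-variance pooling alternative mentioned above, assuming fixed-effect pooling; the numeric inputs are made up for illustration and are not drawn from the 17 sources.

```python
import numpy as np

def pool_inverse_variance(estimates, variances):
    """Fixed-effect pooling: weight each eligible estimate by 1/variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w)
    pooled_var = 1.0 / np.sum(w)
    return pooled, pooled_var

# Hypothetical example: the more precise estimate dominates the pooled value.
pooled_mean, pooled_var = pool_inverse_variance([0.25, 0.35], [0.01, 0.04])
# pooled_mean = 0.27, pooled_var = 0.008
```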

10 Methods
Model: extremely simple 10-parameter probabilistic simulation of DOT in HIV.
17 data sources considered.
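An illustrative probabilistic sensitivity analysis sketch of how such a simulation produces a cost-effectiveness ratio from sampled inputs. The parameters, priors, and payoff structure below are entirely hypothetical stand-ins; they do not reproduce the authors' 10-parameter model or its 17 data sources.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000  # probabilistic sensitivity analysis draws

# Hypothetical inputs: weakly evidenced ones get wide uniform priors,
# better evidenced ones get tighter, evidence-based priors.
rr_progression = rng.uniform(0.5, 1.0, N)     # relative risk of progression with DOT
extra_cost = rng.normal(30_000, 3_000, N)     # lifetime incremental cost of DOT ($)
qalys_per_unit_rr = 2.0                       # QALYs gained per unit reduction in relative risk

delta_qaly = qalys_per_unit_rr * (1.0 - rr_progression)
icer = extra_cost.mean() / delta_qaly.mean()  # incremental cost-effectiveness ratio

print(f"ICER ~ ${icer:,.0f} per QALY gained")
```

The point of the sketch: widening the priors on weakly evidenced inputs (the uniform above) directly widens the spread of the resulting cost-effectiveness estimates.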

11 Results
Base case (no evidence criteria): all 17 data sources eligible for parameter estimation
Study Design = Level 1: 13 out of 17 sources were eligible
Internal Validity = Good: 9 out of 17 sources were eligible
External Validity = High: 5 out of 17 sources were eligible
All three criteria: only 3 out of 17 sources were eligible

12 Results: All Evidence

13 Results: Design = 1

14 Results: Internal Validity = Good

15 Results: External Validity = High

16 Results: All Evidence

17 Results: Design = 1

18 Results: Internal Validity = Good

19 Results: External Validity = High

20 Results - Overall
No evidence criteria: $78,000/QALY
Study Design = Level 1: $227,000/QALY
Internal Validity = Good: $158,000/QALY
External Validity = High: >$6,000,000/QALY
All three criteria: >$6,000,000/QALY

21 Limitations
Incorporates a simple model of HIV that was constructed solely to illustrate proof of concept.
The method is likely to need further refinement before it could be used on more complex and realistic simulations.
The method only addresses parameter uncertainty, leaving other determinants of modeling uncertainty unexplored.

22 Conclusions
Strength of evidence may have a profound impact on the precision and estimates of CEAs.
When all evidence was permitted, results were similar to a previously published DOT CEA (Goldie 2003): $40,000 to $75,000/QALY, with little uncertainty.
With stricter evidence criteria, our results differed markedly: >$150,000/QALY, with great uncertainty.

23 Implications
The sensitivity-analysis-by-strength-of-evidence concept can be linked to any desired ranking method for strength of evidence, and therefore can be customized to facilitate its use by expert panels and organizations.
The advance of this work does not lie in its specification of a particular hierarchy of strength of evidence.
The advance lies in showing how any such hierarchy can be implemented within a CEA model.

24 Implications
Users who think "any data is better than no data" will likely base inferences on model results that incorporate all data sources, regardless of strength of evidence.
Users who think "my judgment supersedes all but the best data" will likely base inferences only on model results that reflect the highest grades of evidence.
Many models may fail to provide conclusive results when validity criteria are stringent. Nonetheless, in the long run this may help CEA become a more essential decision-making tool.


