




1 Challenges in Classifying Adverse Events in Cancer Clinical Trials Steven Joffe, MD, MPH Dave Harrington, PhD David Studdert, JD, PhD Saul Weingart, MD, PhD Damiana Maloof, RN

2 Disclosure Member of clinical trial adverse event review board for Genzyme Corp (not oncology-related)

3 Adverse Events in Clinical Trials
Adverse events (AEs) are critically important outcomes of clinical trials
– Human subjects protection
– Endpoints for judgments about benefits & risks of study interventions
Captured on Case Report Forms
Reported to oversight agencies

4 Components of AE Assessment
– Type
– Severity
– Relatedness to study agent(s)
– Expectedness

5 Components of AE Assessment
– Type
– Severity
– Relatedness to study agent(s)
– Expectedness
– Global judgment about reportability to IRB

6 Reporting Criteria (to Dana-Farber IRB)
– Grade 5 (fatal)
– Grade 4, unless specifically exempted
– Grade 2/3, if unexpected AND possibly, probably, or definitely related
Virtually identical to NCI's Adverse Event Expedited Reporting System (AdEERS) criteria
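
The reporting criteria above amount to a small decision rule. A minimal sketch follows; the function name, arguments, and the grade-4 exemption flag are illustrative assumptions, not part of the DFCI or AdEERS specification:

```python
def reportable(grade, expected, relatedness, grade4_exempted=False):
    """Sketch of the expedited-reporting rule described on the slide.

    relatedness: one of "unrelated", "unlikely", "possible",
    "probable", "definite".
    """
    related = relatedness in ("possible", "probable", "definite")
    if grade == 5:                       # fatal events: always report
        return True
    if grade == 4:                       # report unless specifically exempted
        return not grade4_exempted
    if grade in (2, 3):                  # report only if unexpected AND related
        return (not expected) and related
    return False

# Grade 3 osteonecrosis, definitely related but expected: not reportable
print(reportable(3, expected=True, relatedness="definite"))  # → False
```

Applied to the criterion set on slide 16, this rule reproduces the "Reportable" column (e.g., the grade 4 thrombosis is reportable despite being unlikely related, while the expected grade 3 osteonecrosis is not).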

7 AE Grading in Oncology
NCI's Common Terminology Criteria for Adverse Events (CTCAE) typically used
– Effort to standardize nomenclature
– Developed by consensus methods; no formal process to establish reliability of grading
http://ctep.cancer.gov/protocolDevelopment/electronic_applications/ctc.htm#ctc_v30

8 Aims
1. To assess the validity of physician reviewers' determinations about whether AEs in cancer trials meet IRB reporting criteria
2. To assess the interrater reliability of reviewers' determinations about whether AEs that occur in cancer trials meet IRB reporting criteria
3. To assess the validity and reliability of reviewers' judgments about the components of AEs

9 Study Methods

10 Panelists' Roles
Review primary data from criterion sets of AEs
Rate each AE:
– Classification (from CTCAE)
– Grade (from CTCAE)
– Relatedness
– Expectedness
– Reportable to IRB

11 Panelist Demographics
                                 Expert Panel (n=3)   Second Panel (n=10)
Years since fellowship training
  Mean                           20 yrs               6.3 yrs
  Range                          10 – 32 yrs          2 – 17 yrs
Academic rank
  Instructor / Asst Prof         1                    10
  Assoc Prof / Prof              2                    0

12 Panelists' Experience
                                 Expert Panel (n=3)   Second Panel (n=10)
Clinical trials served as overall Principal Investigator
  0 – 5                          0                    7
  ≥ 6                            3                    3
Clinical trials served as PI, site PI, or Co-Investigator
  0 – 5                          0                    2
  6 – 20                         0                    3
  > 20                           3                    5

13 Panelists' Experience
                                 Expert Panel (n=3)   Second Panel (n=10)
Patients personally enrolled in a clinical trial during past 3 years
  0 – 10                         0                    1
  11 – 30                        1                    4
  > 30                           2                    5
Adverse event reports personally filed with the IRB during past 3 years
  0 – 10                         1                    6
  11 – 30                        0                    1
  > 30                           2                    3

14 Statistical Analysis
Validity of judgments regarding reportability to IRB
– % agreement with gold standard
Interrater reliability of raters' judgments
– Kappa coefficients
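
The kappa statistic measures agreement beyond chance: (observed agreement − chance agreement) / (1 − chance agreement). The study pools judgments across many raters; the two-rater (Cohen's) version below is a minimal sketch of the statistic's logic, with toy data that is not from the study:

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters judging the same items."""
    n = len(rater1)
    # Observed proportion of items on which the raters agree
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    p_chance = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
    return (p_obs - p_chance) / (1 - p_chance)

# Toy example: two raters judging reportability (Y/N) of 10 AEs
r1 = ["Y", "Y", "N", "Y", "N", "N", "Y", "N", "Y", "Y"]
r2 = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N", "N", "Y"]
print(round(cohen_kappa(r1, r2), 2))  # → 0.58
```

Note that 8/10 raw agreement yields a kappa of only 0.58 once chance agreement (0.52 here) is removed, which is why kappa is the preferred reliability measure on the following slides.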

15 Results

16 Criterion Set of AEs
Type of AE                             Grade   Related    Expected   Reportable
High triglycerides                     4       Definite   Y          Y
Osteonecrosis                          3       Definite   Y          N
Sensory neuropathy                     1       Probable   Y          N
Cardiac ischemia                       4       Possible   Y          Y
Rash                                   2       Probable   Y          N
Thrombosis                             4       Unlikely   N          Y
High uric acid                         4       Probable   N          Y
Cardiac dysfunction                    2       Definite   Y          N
Thrombotic thrombocytopenic purpura    4       Possible   N          Y
Renal failure                          4       Definite   Y          Y

17 Validity of Judgments Regarding Reportability to IRB
Adverse Event             Not Reportable   Reportable   % Agree
1. High triglycerides     0                10           100
2. Osteonecrosis          6                4            60
3. Sensory neuropathy     10               0            100
4. Cardiac ischemia       0                10           100
5. Rash                   9                1            90
6. Thrombosis             0                10           100
7. High uric acid         0                10           100
8. Cardiac dysfunction    8                2            80
9. TTP                    0                10           100
10. Renal failure         0                10           100
TOTAL                                                   93%
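
The overall 93% figure is the mean of the ten per-event agreement percentages; a quick arithmetic check (values transcribed from the slide):

```python
# Per-event % agreement with the gold standard, events 1-10
per_event = [100, 60, 100, 100, 90, 100, 100, 80, 100, 100]
overall = sum(per_event) / len(per_event)
print(overall)  # → 93.0
```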

18 Interrater Reliability of Panelists' Judgments
Judgment        Kappa   P value
Reportability   0.75    <0.0001
Grade           0.52    <0.0001
Relatedness     0.22    <0.0001
Expectedness    0.88    <0.0001

19 Role of Experience: Rank (figure: kappa by academic rank)

20 Role of Experience: Service as PI (figure: kappa by PI experience)

21 Role of Experience: Number of AE Reports Filed (figure: kappa by number of AE reports filed)

22 Conclusions
Oncologists' judgments about whether AEs require reporting to the IRB show high agreement with the gold standard
Interrater reliability of oncologists' judgments about components of AEs varies
– High: expectedness of AE; need for reporting
– Moderate: grade of AE
– Low: relationship of AE to study agents

23 Limitations
Small sample sizes
– Criterion set of AEs
– Panel of physician reviewers
Generalizability of set of AEs
Reviewers may not reflect population of investigators who file AE reports
Judgments based on document review rather than on firsthand knowledge

24 Thoughts About Direction of Bias in Agreement Statistics
Factors biasing towards less agreement
– Reviewer experience
Factors biasing towards greater agreement
– Standardized set of documents for review
– Criterion set selected based on maximum agreement among expert panel reviewers

25 Implications
Judgments about AEs are complex
– Human subjects: efforts to enhance reliability, or to minimize reliance on judgments about causation, are needed
– Science: toxicity data from uncontrolled trials may be misleading
– RCR: education about the need for reporting is important but insufficient

26 Acknowledgments
Debra Morley
Anna Mattson-DiCecca
Physician panelists
ORI
NCI
Milton Fund




