
1 A Regulatory Perspective on Threats to the Integrity of Analgesic Clinical Trial Efficacy Data
Sharon Hertz, MD
Division Director, Division of Anesthesia, Analgesia, and Addiction Products
FDA/CDER

2 Disclaimer
The content of this talk does not necessarily reflect the views of the FDA, and is entirely based on my own observations and viewpoints. I have no potential conflicts of interest to report.

3 TIACTED: Threats to the Integrity of Analgesic Clinical Trial Efficacy Data
Errors in the design, the conduct, the data collection process, and the analysis of a randomized trial have the potential to affect not only the safety of the patients in the trial, but also, through the introduction of bias, the safety of future patients.*
* Colin Baigent, Frank E. Harrell, Marc Buyse, Jonathan R. Emberson, and Douglas G. Altman. Ensuring trial validity by data quality assurance and diversification of monitoring methods. Clinical Trials 2008; 5: 49–55.

4 TIACTED
Beyond the potential to affect safety, threats to clinical trial data integrity affect the ability to demonstrate efficacy and can substantially increase the time to get new products to market.

5 What Are TIACTED?
Inadequate study design
Sloppy study conduct
–Poor training/supervision of clinical site staff and patients
–Protocol violations by clinical site staff and patients
–Unverifiable data/audit trail

6 What Are TIACTED?
Intentional actions that negatively affect clinical trial data integrity
–Deceptive subjects
–Fraudulent data
–Intentional failure to adhere to protocol
–Improper handling of data
–Deviation from prespecified analyses

7 Example 1 – Investigator Fraud?
3 clinical efficacy trials with very similar design: 2 successful, 1 failed
Study 1 (non-US), pain intensity difference (PID) on VAS over 24 hours
–Study Drug, mean ± SD: -46 ± 22; Placebo: -13 ± 13
–Difference from placebo, LS mean (95% CI): -32 (-37, -28); p < 0.0001
Study 2 (non-US), PID on VAS over 24 hours
–Study Drug: -57 ± 16; Placebo: -20 ± 12
–Difference from placebo, LS mean (95% CI): -36 (-40, -32); p < 0.0001
Study 3 (US), PID on VAS over 72 hours
–Study Drug: -31 ± 21
–Difference from placebo, LS mean (95% CI): -0.7 (-5, 3); p = 0.76
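
For context on how the placebo-adjusted differences relate to the per-arm summaries, below is a minimal sketch of a Welch-type confidence interval computed from means, SDs, and sample sizes. The sample sizes are placeholders (they are not given in the talk), and the reported LS means come from a covariate-adjusted model that this rough approximation ignores.

```python
# Rough Welch-type confidence interval for a difference in means, computed from
# per-arm summary statistics. Sample sizes below are placeholder assumptions;
# the trial's LS means come from a covariate-adjusted model that this ignores.
import math
from scipy import stats

def welch_ci(m1, s1, n1, m2, s2, n2, alpha=0.05):
    """CI for (m1 - m2) from per-arm means, SDs, and sample sizes."""
    diff = m1 - m2
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)
    # Welch–Satterthwaite degrees of freedom
    df = (s1**2 / n1 + s2**2 / n2) ** 2 / (
        (s1**2 / n1) ** 2 / (n1 - 1) + (s2**2 / n2) ** 2 / (n2 - 1)
    )
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return diff, (diff - t_crit * se, diff + t_crit * se)

# Study 1 drug vs. placebo, with an illustrative n = 100 per arm (not from the source):
# print(welch_ci(-46, 22, 100, -13, 13, 100))
```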

8 Example 1
Successful studies:
–Large effect size, larger than expected
–Higher baseline pain intensity
–Less use of rescue and non-drug treatment
–Lower placebo response: smaller change in pain intensity, and 0% of placebo subjects reported onset of meaningful pain relief

9 Example 1
What could explain the difference?
–Demographics mostly similar
–Looked for a treatment-by-site effect: results not driven by one site
–Compared to trial results for other similar products, including US and other non-US trials: no other studies showed a similar placebo response or effect size
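
The treatment-by-site check mentioned on this slide is a standard way to ask whether a single site is driving the overall result. Below is a minimal sketch using statsmodels, assuming a hypothetical subject-level dataset with columns pid_change, treatment, and site; none of these names come from the talk.

```python
# Hypothetical treatment-by-site interaction check: if adding the interaction
# term improves the model, the treatment effect differs across sites and one
# site may be driving the overall result. Column names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def check_treatment_by_site(df: pd.DataFrame) -> pd.DataFrame:
    """Compare models with and without a treatment-by-site interaction."""
    base = smf.ols("pid_change ~ C(treatment) + C(site)", data=df).fit()
    interaction = smf.ols("pid_change ~ C(treatment) * C(site)", data=df).fit()
    # anova_lm compares the nested models; a small p-value for the added
    # interaction terms suggests the effect is not consistent across sites.
    return anova_lm(base, interaction)

# Usage (hypothetical file):
# df = pd.read_csv("subject_level_efficacy.csv")
# print(check_treatment_by_site(df))
```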

10 Example 1
Site inspections of the 3 sites common to both non-US studies, selected based on high enrollment numbers
–At 1 site:
–The study nurse transcribed pain intensity notes "to be legible" and destroyed the original documents
–21 subjects were enrolled in both studies, 14 of whom were injured and enrolled on the same day, for both studies
–17 of 55 subjects in Study 1 and 6 of 35 in Study 2 were part of a pair or triplet with the same surname and/or address, many with the same day of injury
–The site was excluded from the analysis

11 Example 1
Findings from site 1 led to an evaluation of enrollment patterns at the other sites
–All sites had some same-day enrollment of related subjects or subjects sharing an address, and multiple subjects enrolled in both studies
–The applicant explained that multiple members of the same family or household could sustain the same injury, on the same day, repeatedly, because people in that country are more active than in the US
–The identities of the subjects could not be verified because of local privacy laws
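
A hedged sketch of how the enrollment patterns described on the last two slides might be screened programmatically, assuming a hypothetical subject listing with site, surname, address, enrollment_date, study, and subject_id columns. All column and file names are illustrative assumptions, not from the talk.

```python
# Hypothetical enrollment-pattern screen: flag groups of subjects at a site who
# share a surname or address and an enrollment date, and subjects who appear in
# more than one study. Column names are illustrative assumptions.
import pandas as pd

def flag_related_same_day(df: pd.DataFrame) -> pd.DataFrame:
    """Return groups of 2+ subjects sharing a surname or address and an enrollment date."""
    by_surname = df.groupby(["site", "surname", "enrollment_date"]).filter(lambda g: len(g) > 1)
    by_address = df.groupby(["site", "address", "enrollment_date"]).filter(lambda g: len(g) > 1)
    return pd.concat([by_surname, by_address]).drop_duplicates()

def flag_cross_study_enrollment(df: pd.DataFrame) -> pd.DataFrame:
    """Return subjects enrolled in more than one study."""
    study_counts = df.groupby("subject_id")["study"].nunique()
    return df[df["subject_id"].isin(study_counts[study_counts > 1].index)]

# Usage (hypothetical file):
# subjects = pd.read_csv("subject_listing.csv", parse_dates=["enrollment_date"])
# print(flag_related_same_day(subjects))
# print(flag_cross_study_enrollment(subjects))
```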

12 Example 2 – Failure to Follow Protocol
2 clinical efficacy trials, with a single site conducting both studies
Inspection findings:
–Failure to record safety variables; the investigator felt the protocol required recording vital signs too frequently (although a research assistant was present to record dosing)
–No automated blood pressure machine available for baseline measures

13 Example 2
Additional data were requested
Possible safety problems; the data were insufficient to adequately characterize safety

14 Example 3 – Improper Handling of Data
1st review cycle – routine inspections:
–Protocol deviations that could impact the validity, reliability, and integrity of the data
–Applicant failed to report protocol violations in the final study report
–Accidental unblinding at several sites

15 Example 3
2nd review cycle – pivotal study repeated; routine inspections of 2 clinical sites and the applicant:
–Statisticians extracted data into SAS datasets containing the unblinded treatment assignment field prior to database lock
–The variable was subsequently blinded
–The datasets (not the actual data) were deleted
–The applicant claimed the statisticians either did not view the data or had no interaction with the sites or the critical outcome data
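
The unblinding risk here comes from analysis datasets carrying a treatment assignment field before database lock. Below is a minimal sketch of a pre-lock check, assuming hypothetical dataset files exported to CSV and illustrative column names; nothing in it is drawn from the applicant's actual system.

```python
# Hypothetical pre-lock check: flag analysis datasets whose columns include a
# treatment-assignment field before the database is locked.
# File locations and column names are illustrative assumptions.
import pandas as pd
from pathlib import Path

BLINDING_COLUMNS = {"trt_assign", "treatment", "randomization_arm"}  # assumed names

def find_unblinded_datasets(dataset_dir: str) -> list[str]:
    """Return dataset files whose columns include a treatment-assignment field."""
    flagged = []
    for path in Path(dataset_dir).glob("*.csv"):
        columns = {c.lower() for c in pd.read_csv(path, nrows=0).columns}
        if columns & BLINDING_COLUMNS:
            flagged.append(path.name)
    return flagged

if __name__ == "__main__":
    offenders = find_unblinded_datasets("analysis_datasets")
    if offenders:
        print("Pre-lock blinding check FAILED for:", ", ".join(offenders))
    else:
        print("No treatment-assignment fields found before database lock.")
```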

16 Example 3
–The applicant failed to notify FDA when the event occurred, even though unblinding had contributed to the initial complete response (CR)
–The applicant failed to maintain audit trails for the deletion of the datasets
–FDA was unable to confirm the attestations of the statisticians, who were no longer with the company
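
The missing audit trail is the kind of gap that is cheap to prevent. Below is a minimal sketch of recording who deleted which dataset, when, and why, before the file is removed; the log location, file names, and reason text are hypothetical assumptions, not part of the case described above.

```python
# Hypothetical append-only audit log for dataset deletions: record the file's
# hash, the user, the timestamp, and the reason, then remove the file.
import csv
import getpass
import hashlib
from datetime import datetime, timezone
from pathlib import Path

AUDIT_LOG = Path("deletion_audit_log.csv")  # assumed location

def delete_with_audit(dataset_path: str, reason: str) -> None:
    """Log the deletion of a dataset, then remove the file."""
    path = Path(dataset_path)
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    record = [
        datetime.now(timezone.utc).isoformat(),
        getpass.getuser(),
        str(path),
        digest,
        reason,
    ]
    with AUDIT_LOG.open("a", newline="") as log:
        csv.writer(log).writerow(record)
    path.unlink()

# Example (hypothetical file and reason):
# delete_with_audit("adsl_draft.sas7bdat", "Draft contained unblinded field")
```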

17 End Note
These three examples demonstrate the importance of identifying data integrity problems early; corrections may salvage a study
Better clinical trial monitoring may help identify problems earlier
It is important to notify FDA, which may be able to help
Best approach: avoid these problems

