
1 Adverse Event Reporting at FDA, Data Base Evaluation and Signal Generation Robert T. O’Neill, Ph.D. Director, Office of Biostatistics, CDER, FDA Presented at the DIMACS Working Group Disease and Adverse Event Reporting, Surveillance, and Analysis October 16, 17, 18, 2002; Piscataway, New Jersey

2 Outline of Talk
- The ADR reporting regulations
- The information collected on a report form
- The data base, its structure and size
- The uses of the data base over the years
- Current signal generation approaches - the data mining application
- Concluding remarks

3 Overview
- Adverse Event Reporting System (AERS)
- Report Sources
- Data Entry Process
- AERS Electronic Submissions (Esub)
- Production Program
- E-sub Entry Process
- MedDRA Coding

4 Adverse Event Reporting System (AERS) Database
- Database origin: 1969
- SRS until 11/1/97; changed to AERS
- 3.0 million reports in database
- All SRS data migrated into AERS
- Contains Drug and "Therapeutic" Biologic Reports
  - Exception: vaccines (VAERS, 1-800-822-7967)

5 Adverse Event Reporting System: Source of Reports
- Health Professionals, Consumers / Patients
- Voluntary: Direct to FDA and/or to Manufacturer
- Manufacturers: Regulations for Postmarketing Reporting

6 (image-only slide; content not captured)

7 Current Guidance on Postmarketing Safety Reporting (Summary)
- 1992 Reporting Guideline
- 1997 Reporting Guidance: Clarification of What to Report
- 1998 ANPR for e-sub
- 2001 Draft Reporting Guidance (3/12/2001)
- 2001 E-sub Reporting of Expedited and Periodic ICSRs (11/29/2001)

8 Adverse Event Reports to FDA, 1989 to 2001

9 Despite limitations, it is our primary window on the real world
- What happens in the "real" world is very different from the world of clinical trials:
  - Different populations
  - Comorbidities
  - Coprescribing
  - Off-label use
  - Rare events

10 AERS Functionality
- Data Entry
- MedDRA Coding
- Routing
- Safety Evaluation
- Inbox
- Searches
- Reports
- Interface with Third-Party Tools
  - AutoCode (MedDRA)
  - RetrievalWare (images)

11 AERS Esub Program History
- Over 4 years
- Pilot, then production
- PhRMA Electronic Regulatory Submission (ERS) Working Group
- PhRMA eADR Task Force
- E*Prompt Initiative
- Regular meetings between FDA and industry held to review status, address issues, and share lessons learned

12 Adverse Event Reporting System: Processing MEDWATCH Forms
- Goal: electronically receive Expedited and Periodic ISRs
- Docket 92S-0251
- As of 10/2000, able to receive electronic 15-day reports
- Paper reports
  - Scanned upon arrival
  - Data entered
- Electronic and paper reports
  - Coded in MedDRA

13 Electronic Submission of Postmarketing ADR Reports
- MedDRA coding, 3500A
  - Narrative searched with Autocoder
- MedDRA coding, E-sub
  - Narrative searched with Autocoder
  - Enabled: companies' own terms accepted

14 AERS Esub Program: Additional Information
- www.fda.gov/cder (CDER)
- www.fda.gov/cder/aers/regs.htm (AERS)
  - Reporting regulations, guidances, and updates
- www.fda.gov/cder/aerssub (PILOT)
- aersesub@cder.fda.gov (EMAIL)
- www.fda.gov/cder/present (CDER PRESENTATIONS)

15 AERS Esub Program: Additional Information (cont'd)
- www.fda.gov (FDA)
- www.fda.gov/oc/electronicsubmissions/interfaq.htm (GATEWAY)
  - Draft Trading Partner Agreement, Frequently Asked Questions (FAQs) for FDA's ESTRI gateway
- edigateway@oc.fda.gov (EMAIL)
- www.fda.gov/medwatch/report/mfg.htm (MEDWATCH)
  - Reporting regulations, guidances, and updates

16 AERS Esub Program: Additional Information (cont'd)
- www.ich.org (ICH home page)
- www.fda.gov/cder/m2/default.htm (M2)
  - ICH ICSR DTD 2.0
- www.meddramsso.com (MedDRA MSSO)
- http://www.ifpma.org/pdfifpma/M2step4.PDF
  - ICH ICSR DTD 2.1
- http://www.ifpma.org/pdfifpma/e2bm.pdf
  - New E2BM changes
- http://www.ifpma.org/pdfifpma/E2BErrata.pdf
  - Feb 5, 2001 E2BM editorial changes

17 AERS Users
- FDA
- Contractor
- Safety Evaluators
- Compliance
- FOIA

18 Uses of AERS
- Safety Signal Detection
- Creation of Case Profiles
  - who is getting the drug
  - who is running into trouble
- Hypothesis Generation for Further Study
- Signals of Name Confusion

19 Other references
- C. Anello and R. O'Neill (1998). Postmarketing Surveillance of New Drugs and Assessment of Risk, pp. 3450-3457, Vol. 4, Encyclopedia of Biostatistics, Eds. Armitage and Colton, John Wiley and Sons
  - Describes many of the approaches to spontaneous reporting over the last 30 years

20 Related work on signal generation and modeling
- Finney, 1971, WHO
- O'Neill, 1988
- Anello and O'Neill, 1997 - overview
- Tsong, 1995: adjustments using external drug use data; compared to other drugs
  - Compared to previous time periods
- Norwood and Sampson, 1988
- Praus, Schindel, Fescharek, and Schwarz, 1993
- Bate et al., 1998: Bayesian

21 References
- O'Neill and Szarfman, 1999. The American Statistician, Vol. 53, No. 3, pp. 190-195. Discussion of W. DuMouchel's article "Bayesian Data Mining in Large Frequency Tables, With an Application to the FDA Spontaneous Reporting System" (same issue)

22 Recent post-marketing signaling strategies: estimating associations needing follow-up
- Bayesian data mining
- Visual graphics
- Pattern recognition

23 The structure and content of FDA's database: some known features impacting model development
- SRS began in the late 1960's (over 1.6 million reports)
- Reports of suspected drug-adverse event associations submitted to FDA by health care providers (voluntary, regulations)
- Dynamic data base; new drugs and reports being added continuously (~250,000 per year)
- Early warning system for potential safety problems
- Content of each report:
  - Drugs (multiple)
  - Adverse events (multiple)
  - Demographics (gender, age, other covariates)

24 The structure and content of FDA's database: some known features impacting model development
- Quality and completeness of a report is variable, across reports and manufacturers
- Serious/non-serious - known/unknown
- Time sensitive - 15 days
- Coding of adverse events (COSTART) determines one dimension of the table - about 1300 terms
- Accuracy of coding / interpretation

25 The DuMouchel Model and its Assumptions
- Large two-dimensional table of size M (drugs) x N (ADR events) containing cross-classified frequency counts - sparse
- Baseline model assumes independence of rows and columns - yields expected counts
- Ratios of observed / expected counts are modeled as a mixture of two two-parameter gammas with a mixing proportion P
- Bayesian estimation strategy shrinks estimates in some cells
- Scores associated with the Bayes estimates are used to identify those cells which deviate excessively from expectation under the null model
- Confounding by gender and chronological time controlled by stratification

26 The Model and its Assumptions
- Model validation for signal generation
  - Goodness of fit
  - 'Higher than expected' counts informative of true drug-event concerns
- Evaluating sensitivity and specificity of signals
  - Known drug-event associations appearing in a label or identified by previous analysis of the data base; use of negative controls where no association is known to be present
  - Earlier identification in time of known drug-event associations

27 Finding "Interestingly Large" Cell Counts in a Massive Frequency Table
- Large two-way table with possibly millions of cells
- Rows and columns may have thousands of categories
- Most cells are empty, even though N.. is very large
- "Bayesian Data Mining in Large Frequency Tables", The American Statistician (1999) (with discussion)
  - Analyzed SRS database with 1398 drugs and 952 AE codes
- N_ij = count of reports containing drug i and event j
  - Only 386K out of 1331K cells have N_ij > 0
  - 174 drug-event combinations have N_ij > 1000
- Naive baseline frequencies E_ij = N_i. N_.j / N..
- Extension to stratification: sum independence frequencies defined separately over strata based on age, sex, etc.
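The baseline computation on this slide can be sketched in a few lines of illustrative (non-FDA) Python: the expected count for each drug-event cell is the product of its row and column totals over the grand total, computed separately within each stratum and summed. The report records below are invented for illustration.

```python
# Sketch, assuming each report is reduced to (stratum, drug, event).
# Stratified expected counts: E_ij = sum over strata s of N_i.s * N_.js / N..s
from collections import Counter

reports = [
    ("F", "drugA", "rash"), ("F", "drugA", "rash"), ("F", "drugB", "nausea"),
    ("M", "drugA", "nausea"), ("M", "drugB", "rash"), ("M", "drugB", "nausea"),
]

def expected_counts(reports):
    """Expected cell counts under row-column independence within each stratum."""
    expected = Counter()
    for s in {st for st, _, _ in reports}:
        sub = [(d, e) for st, d, e in reports if st == s]
        n = len(sub)                              # N..s
        drug_tot = Counter(d for d, _ in sub)     # N_i.s
        event_tot = Counter(e for _, e in sub)    # N_.js
        for d in drug_tot:
            for e in event_tot:
                expected[(d, e)] += drug_tot[d] * event_tot[e] / n
    return expected

observed = Counter((d, e) for _, d, e in reports)
E = expected_counts(reports)
for cell in sorted(observed):
    print(cell, observed[cell], round(E[cell], 2), round(observed[cell] / E[cell], 2))
```

The observed/expected ratios printed at the end are the raw disproportionality measures that the empirical Bayes model then shrinks.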

28 Associations of Items in Lists
- "Market basket" data from transaction databases
- Tabulating sets of items from a universe of K items
  - Supermarket scanner data - sets of items bought
  - Medical reports - drug exposures and symptoms
- Sparse representation - record items present
- P_ijk = Prob(X_i = 1, X_j = 1, X_k = 1), (i < j < k)
- Marginal counts and probabilities: N_i, N_ij, N_ijk, ...; P_i, P_ij, P_ijk
- Conditional probabilities: Prob(X_i | X_j, X_k) = P_ijk / P_jk, etc.
- P_i small, but Σ_i P_i (= expected # items per transaction) >> 1
- Search for "interestingly frequent" item sets
- Item sets consisting of one drug and one event reduce to the GPS modeling problem
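The marginal counts N_i, N_ij, N_ijk above can be tabulated directly from the sparse representation by enumerating the subsets of each report. A minimal sketch, with invented item names:

```python
# Sketch: count every item set of size <= 3 that occurs in the data.
# Each report is the set of items present (drugs and events together).
from collections import Counter
from itertools import combinations

reports = [
    {"drugA", "drugB", "rash"},
    {"drugA", "rash"},
    {"drugB", "nausea"},
    {"drugA", "drugB", "nausea"},
]

def itemset_counts(reports, max_size=3):
    counts = Counter()
    for items in reports:
        for k in range(1, max_size + 1):
            for combo in combinations(sorted(items), k):  # canonical order
                counts[combo] += 1
    return counts

counts = itemset_counts(reports)
n = len(reports)
# Empirical joint and conditional probabilities, e.g. for (drugA, rash):
p_joint = counts[("drugA", "rash")] / n
p_cond = counts[("drugA", "rash")] / counts[("drugA",)]   # P(rash | drugA)
```

As the slide notes, this counting step, not the Bayesian estimation, is usually the compute-intensive part; sorting items gives each set one canonical key.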

29 Definitions of Interesting Item Sets
- Data mining literature: find all associations
  - E.g., find all sets (X_i, X_j, X_k) having Prob(X_i | X_j, X_k) and Prob(X_i, X_j, X_k) above chosen thresholds
- Complete search based on proportions in the dataset, with no statistical modeling
- Note that a triple (X_i, X_j, X_k) can qualify even if X_i is independent of (X_j, X_k)!
- We use joint P's, not conditional P's, and a Bayesian model
  - E.g., find all (i, j, k) for which Prob(λ_ijk = P_ijk / μ_ijk exceeds a cutoff | Data) is above a threshold
- μ_ijk are baseline values
  - Based on independence or some other null hypothesis

30 Empirical Bayes Shrinkage Estimates
- Compute posterior geometric mean (EBGM) and 5th percentile (EB05) of the ratios
  - λ_ij = P_ij / μ_ij, λ_ijk = P_ijk / μ_ijk, λ_ijkl = P_ijkl / μ_ijkl, etc.
- Baseline probabilities μ based on within-strata independence
- Prior distributions of the λ's are mixtures of two conjugate gamma distributions
- Prior hyperparameters estimated by MLE from the observed negative binomial regression
- EB calculations are compute-intensive, but merely counting itemsets is more so
- Conditioning on N_ijk > n* eases the burden of both counting and EB estimation
- We choose a smaller n* than in the market basket literature
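For a single drug-event cell with observed count N and baseline expectation E, the shrinkage score can be sketched as follows. This is an illustration, not FDA code: the hyperparameter values are invented (in the real GPS model they are fit by maximum likelihood to the whole table), and the gamma-Poisson conjugacy gives the marginal likelihoods in closed form as negative binomials.

```python
# Sketch of the GPS empirical Bayes posterior geometric mean for one cell.
# N ~ Poisson(lambda * E), lambda ~ pi*Gamma(a1, b1) + (1 - pi)*Gamma(a2, b2).
import math

def _digamma(x):
    # Asymptotic series with upward recurrence; adequate accuracy for x > 0.
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1 / 12 - f * (1 / 120 - f / 252))

def _nb_logpmf(n, r, p):
    # log NegBin(n; size r, success prob p): the marginal of Poisson(lambda*E)
    # when lambda ~ Gamma(shape r, rate b) and p = b / (b + E).
    return (math.lgamma(n + r) - math.lgamma(r) - math.lgamma(n + 1)
            + r * math.log(p) + n * math.log(1 - p))

def ebgm(n, e, a1=0.2, b1=0.06, a2=1.4, b2=1.8, pi=0.1):
    """Posterior geometric mean of lambda (illustrative hyperparameters)."""
    l1 = _nb_logpmf(n, a1, b1 / (b1 + e))
    l2 = _nb_logpmf(n, a2, b2 / (b2 + e))
    w1, w2 = pi * math.exp(l1), (1 - pi) * math.exp(l2)
    q = w1 / (w1 + w2)                    # posterior mixture weight
    # E[log lambda | data] under each Gamma(a + n, b + e) posterior component
    g1 = _digamma(a1 + n) - math.log(b1 + e)
    g2 = _digamma(a2 + n) - math.log(b2 + e)
    return math.exp(q * g1 + (1 - q) * g2)

# A cell with 20 reports but only 1 expected: raw ratio 20, EBGM shrunk below it
print(round(ebgm(20, 1.0), 2))
```

The key behavior is the shrinkage: the score stays below the raw N/E ratio, and cells with small counts are pulled strongly toward the prior, which is why rare-event cells rarely produce spuriously extreme scores.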

31 The rationale for stratification on gender and chronological time intervals
- New drugs added to the data base over time
- Temporal trends in drug usage and exposure
- Temporal trends in reporting independent of drug: publicity, Weber effect
- Some drugs associated with gender-specific exposure
- Some adverse events associated with gender independent of drug usage
- Primary data-mining objective: are signals the same or different according to gender (confounding and effect modification)
- A concern: number of strata, sparseness, balance between stratification and sensitivity/specificity of signals

32 The control group and the issue of 'compared to what?'
- Signal strategies compare
  - a drug with itself from prior time periods
  - with other drugs and events
  - with external data sources of relative drug usage and exposure
- Total frequency count for a drug is used as a relative surrogate for an external denominator of exposure; easy to use, quick, and efficient
- Analogy to a case-control design where cases are a specific ADR term, controls are other terms, and the outcomes are presence or absence of exposure to a specific drug
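The case-control analogy can be made concrete with a 2x2 table: cases are reports mentioning the ADR term of interest, controls are reports mentioning other terms, and exposure is presence of the drug. The resulting reporting odds ratio is a standard disproportionality measure in this setting; the counts below are invented for illustration.

```python
# Sketch of the case-control analogy as a 2x2 disproportionality table.
def reporting_odds_ratio(a, b, c, d):
    """a: drug & event, b: drug & other events,
       c: other drugs & event, d: other drugs & other events."""
    return (a / b) / (c / d)   # equivalently a*d / (b*c)

# e.g. 20 reports pair the drug with the event and 480 pair it with other
# events, while 100 other-drug reports mention the event and 9400 do not:
ror = reporting_odds_ratio(20, 480, 100, 9400)
print(round(ror, 2))
```

Because every count comes from within the reporting database itself, no external exposure denominator is needed, which is exactly the "total frequency count as a relative surrogate" point on the slide.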

33 Other metrics useful in identifying unusually large cell deviations
- Relative rate
- P-value type metric - overly influenced by sample size
- Shrinkage estimates for rare events potentially problematic
- Incorporation of a prior distribution on some drugs and/or events for which previous information is available - e.g., liver events or pre-market signals

34 Interpreting the empirical Bayes scores and their rankings: the role of visual graphics (Ana Szarfman)
- Four examples of spatial maps that reduce the scores to patterns and user-friendly graphs and help to interpret many signals collectively
- All maps are produced with CrossGraphs and have drill-down capability to get to the data behind the plots

35 Example 1: A spatial map showing the "signal scores" for the most frequently reported events (rows) and drugs (columns) in the database, by the intensity of the empirical Bayes signal score (blue is a stronger signal than purple)

36 (image-only slide: spatial map for Example 1; content not captured)

37 Example 2: A spatial map showing 'fingerprints' of signal scores, allowing one to visually compare the complexity of patterns for different drugs and events and to identify positive or negative co-occurrences

38 (image-only slide: spatial map for Example 2; content not captured)

39 Example 3: Cumulative scores and numbers of reports according to the year when the signal was first detected, for selected drugs

40 (image-only slide: plot for Example 3; content not captured)

41 Example 4: Differences in paired male-female signal scores for a specific adverse event across drugs with events reported (red means females greater, green means males greater)

42 (image-only slide: plot for Example 4; content not captured)

43 Why consider data mining approaches
- Screening a lot of data, with multiple exposures and multiple outcomes
- It soon becomes difficult to identify patterns
- The need for a systematic approach
- There is some structure to the FDA data base, even though data quality may be questionable

44 Two applications
- Special population analysis
  - Pediatrics
- Two or more item associations
  - Drug interactions
  - Syndromes (combining ADR terms)

45 Pediatric stratifications (age 16 and younger)
- Neonates
- Infants
- Children
- Adolescents
- Gender
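A stratification like the one above amounts to a simple age-to-band mapping before the stratified expected counts are computed. The cutoffs below are assumptions in the spirit of commonly used ICH E11-style pediatric bands; the talk does not specify the exact boundaries.

```python
# Sketch: map an age in days to one of the pediatric strata listed above.
# Boundary values are illustrative assumptions, not the FDA's definitions.
def pediatric_stratum(age_days):
    if age_days < 28:
        return "neonate"      # birth to 27 days
    if age_days < 2 * 365:
        return "infant"       # 28 days to <2 years
    if age_days < 12 * 365:
        return "child"        # 2 to <12 years
    if age_days <= 16 * 365:
        return "adolescent"   # 12 to 16 years ("age 16 and younger")
    return "adult"

print(pediatric_stratum(10), pediatric_stratum(400), pediatric_stratum(3000))
```

Crossing each age band with gender, as the slide suggests, multiplies the number of strata, which feeds directly into the sparseness concern raised on the stratification-rationale slide.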

46 Item Association
- Outcomes
- Drug exposures - suspect and others
- Events
- Covariates
- Confounders
- Uncertainties of information in each field
  - dosage, formulation, timing, acute/chronic exposure
- Multiplicities of dimensions

47 Why apply to pediatrics?
- Vulnerable populations for which labeling is poor and directions for use are minimal - a set-up for safety concerns
- Little comparative clinical trial experience to evaluate effects of:
  - metabolic differences; different patterns of drug use; less known about dosing, use with food, formulations, and interactions
- Gender differences of interest

48 Challenges in the future
- More real-time data analysis
- More interactivity
- Linkage with other data bases
- Quality control strategies
- Apply to active rather than passive systems, where non-reporting is not an issue

