Slide 1: Measuring IRB Effectiveness
Norman Fost, MD, MPH
Departments of Pediatrics and Medical History & Bioethics, University of Wisconsin
SACHRP Meeting, Alexandria, VA, July 21, 2009
Slide 2: Bias Disclosure
- Chair of the Health Sciences IRB for 31 years
- For-cause visit by OPRR (commendation letter)
- Co-PI of a controversial large randomized clinical trial (lawsuit)
- Lifetime Achievement Award for Human Subjects Protection (OPRR)
Slide 3: Measuring IRB Effectiveness: Points to Consider
- Clarifying the purpose of IRBs
- Effectiveness of the system as a whole
- What not to measure
- The zero-risk fallacy
- Consistency
- Reconsidering the relevance of consent
Slide 4: Purpose of IRBs
- Protect human subjects from harm
- Facilitate ethically responsible research
- Facilitate investigator careers
- Protect the institution from harm
Slide 5: Purpose of IRBs
- Protect human subjects from research-related harm
  - Too narrow: this goal is maximized by eliminating research altogether (a "Committee for the Prevention of Research")
  - Subjects are also at risk of harm from disease, and are sometimes willing to accept risks in exchange for a possible benefit
  - Hence the change in the waived-consent regulations for emergency research: evidence that patients would want it that way
Slide 6: Other Purposes of IRBs
- Facilitate ethically responsible research
  - Jonas: research is optional
  - The public wants progress
- Facilitate investigator careers
  - The IRB is a scapegoat for unacceptable delay
- Protect the institution from harm
  - Avoid shutdowns and lawsuits (e.g., the Hopkins and Duke shutdowns)
- Effectiveness must include all of these goals
Slide 7: Effectiveness of the System as a Whole
- IRBs originated in scandals: egregiously unethical research (Nuremberg, Southam, Tuskegee, etc.; Beecher 1966)
- Today scandals are rare, and rarely related to IRB failure (Gelsinger, Hopkins, Rochester)
- OIG: "System in jeopardy"; "A Time for Reform"
Slide 8: "System in Jeopardy" (OIG)
OHRP Director Ellis stated that "when you set aside the language of danger and menace," the OIG report offers no evidence that patients have been harmed or are at risk. Noting that every clinical trial goes through many layers of ethical review, Ellis said he considered the likelihood of a "catastrophic failure" to be "slight."
Slide 9: Protections in the System
- Sponsors (private and public)
- NIH review groups
- Industry lawyers
- FDA
- Data monitoring committees
- Better-educated PIs (journals, conferences)
- Journals more attentive to applications
- Research ethics consultants
- CTSA infrastructure (scientific review)
Slide 10: What Not to Measure
- Documentation of compliance with regulations that have little or no relationship to the protection of human subjects:
  - Continuing review
  - Changes in protocol
  - Conditional approval
Slide 11: What Not to Measure
- Actual compliance with regulations that have little relationship to the protection of human subjects
  - The UW Health Sciences IRB documented non-compliance with requirements for continuing review
  - Supported by OPRR; rejected by AAHRPP
Slide 12: What Not to Measure
- The obsession with compliance has:
  - Shifted the IRB's primary role to protection of the institution
  - Dramatically increased costs
  - Distracted attention from more meaningful activity, such as consent monitoring
- 9/11 and the tray-table rule
- Effectiveness (as process) is ~100%
Slide 13: The Zero-Risk Fallacy
- Not all deaths are due to IRB failure; not all deaths are due to system failure (Gelsinger at Penn; Roche at Hopkins)
- Zero deaths are not possible, and zero tolerance is not desirable: the cost would be incompatible with desired research
- This is not the standard for other public goods (food, housing, transportation)
Slide 14: Consistency (Between IRBs)
"A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines." (R. W. Emerson)
Slide 15: What Does This Mean?
"Too much consistency, which you usually find in people who are not creative thinkers and choose to stick to what is known and constantly done, cuts off the flow of progress through fresh ideas and differing viewpoints. These little statesmen, philosophers and clergy cling to what they know and are familiar with, rather than expanding their vision to include new thoughts and methods."
Slide 16: Consistency (Between IRBs)
- "A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines." (R. W. Emerson)
- Examples:
  - Waiver of consent for emergency research
  - Off-loading continuing review and protocol changes
  - Minimal discussion of NCI protocols
  - Pride in questioning minimal-risk research ("M" protocols = "More" discussion)
- Variation is not inherently bad
Slide 17: Reconsidering Consent
- There is no de facto federal requirement for informed consent, only for disclosure
- Informed consent ("think with") implies an understanding choice
- That understanding is rarely assessed or documented, and is not required by the regulations
Slide 18: Reconsidering Consent
- If this is a serious problem, the regulations need to be rewritten or reinterpreted, e.g., a guidance document requiring assessment of comprehension
  - This would be a radical shift
- Otherwise, stop calling it consent; call it disclosure
  - Measured by independent assessment of completeness (all the elements) and understandability to a 6th-grade reader (Duh)
- Consider a requirement for actual consent for high-risk studies, i.e., consent monitoring
Slide 19: What Should Be Measured?
- Sentinel events ("failure analysis"), e.g., the Hopkins external review
- Commitment reports
  - If an IRB spends undue time on AEs, CR, and COP, consider it a red flag
- Central IRBs: efficiency of affecting more sites/subjects
Slide 20: Conclusions
- Define the extent of the problem: Is the system really "in jeopardy"? What is the incidence of clearly unethical research (the reason for IRBs)?
- Focus on the research, not the IRBs; the IRB process is a surrogate measure
Slide 21: Conclusions
- Evaluate the effectiveness of the entire system, not just IRBs
  - E.g., pharma problems: biased design, concealment of SAEs, ghost-writing of results; IRBs have no control over these
- Studying the Human Research Protection Program as a whole has a much higher yield for protecting human subjects than measuring IRB effectiveness (Sutton's Law: "Go where the money is")
- Measure the incidence of problematic research
Slide 22: Conclusions
- Stop measuring documentation of compliance with requirements that have low predictive value for protecting human subjects
- Stop punitive sanctions for non-compliance
- Treat inconsistency as an opportunity, not a conclusion
- Educate the public and Congress on the zero-risk fallacy