1
Impact of Electronic Tools on Review Management
Center for Scientific Review, National Institutes of Health, Department of Health and Human Services
Thomas Tatham, Ph.D., IT Liaison
2
Electronic Tools
- Application CDs
- Internet Assisted Review
- QVR Person Search
- Meeting and reviewer management tools developed by SRAs
- Technologies under evaluation
3
Electronic Tools: Application CDs
CSR provides CDs containing scanned (or electronically submitted) application images to all reviewers, except when a meeting contains only a few applications
4
Before Application CDs… Reviewers ship or carry large stacks of applications
5
Application CDs: Reviewers bring laptops
6
CD Welcome Screen (after the CD is inserted)
7
Application CD Main Menu (auto-starts when the CD is inserted)
8
List of Applications on CD
9
CD Advantages…
- CSR no longer duplicates applications for reviewers and program staff
- Reduced costs: mailing to reviewers, reviewers shipping to the meeting, processing room space
- CDs are searchable
- More information available to reviewers: all PAs relevant to the meeting, relevant review guidelines, prior summaries for all applications
10
CD Advantages…
- More portable for reviewers
- Most reviewers have responded very favorably
11
CD Complications…
- Some reviewers prefer paper; assigned applications still sent as hardcopy
- Not all reviewers have a laptop to bring to the meeting (CSR rental program)
- Conflict CDs are expensive, but generally rare
- Power distribution in the meeting room
- Need to discourage reading of reviews from the screen
- Greater likelihood of keeping confidential materials after the meeting?
- Do some reviewers read their e-mail during the meeting?
- Future: will CDs provide sufficient capacity for large numbers of electronically submitted applications?
12
Electronic Tools: Internet Assisted Review (IAR)
NIH is currently using an online peer review tool that is a component of the NIH eRA Commons and is integrated with IMPAC II, the computer-based information system of the extramural program. Today, almost all CSR peer review meetings use IAR.
13
IAR Capabilities
Reviewers may:
- Electronically submit critiques and preliminary scores prior to the meeting
- Read critiques submitted by others online
- Modify their critiques after the meeting
- View the scanned grant image online
- Maintain their personal information (single point of ownership)
Scientific Review Administrators (SRAs) may:
- Do anything reviewers can do in the system
- Manage meetings and reviewers' IAR access via the Control Center
- Designate applications as lower half (streamline)
- Download preliminary summary statements generated from submitted critiques
14
Reviewer's List of Meetings
15
Submit Phase: List of Applications
16
Submit Phase: Submit Screen
17
Read Phase: List of Applications
18
Edit Phase: List of Applications
19
IAR Benefits
- Facilitates informed discussion among study section members
- Enhances review meeting efficiency by focusing attention on those applications most in need of discussion
- Enhances review meeting effectiveness by allowing reviewers to check the facts used in other reviewers' arguments before the start of the review meeting
- Helps the SRA identify reviewers in need of further guidance: an opportunity to review the critiques of new reviewers and offer feedback, and to identify instances of non-adherence to review guidelines
- Provides an opportunity to inform program staff which applications are likely to be discussed
20
New IAR Features – Near Future
- Meeting folder where the SRA can post late-arriving materials received in PDF format, guidelines, travel instructions, etc. (the SRA must be sensitive to competitive conflict issues)
- Automatic summary statement draft generator will: insert critiques in order; insert animal, human subjects, budget, and other headings and boilerplate based on IMPAC II codes; dramatically reduce support staff requirements, although abstracts must still be cut and pasted from the scanned application
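The draft-generator logic described above amounts to simple document assembly: concatenate critiques in order, then append headings and boilerplate selected by the application's codes. A minimal sketch follows; the code values and heading text are illustrative assumptions, not the real IMPAC II schema.

```python
# Hypothetical headings keyed by application codes (illustrative only;
# the actual IMPAC II codes and boilerplate text differ).
BOILERPLATE = {
    "human_subjects": "PROTECTION OF HUMAN SUBJECTS: Acceptable.",
    "vertebrate_animals": "VERTEBRATE ANIMALS: Acceptable.",
    "budget": "BUDGET: Recommended as requested.",
}

def draft_summary_statement(critiques, codes):
    """Assemble a summary statement draft: numbered critiques in
    submission order, then boilerplate chosen by the application's codes."""
    parts = [f"CRITIQUE {i}:\n{text}" for i, text in enumerate(critiques, 1)]
    parts += [BOILERPLATE[c] for c in codes if c in BOILERPLATE]
    return "\n\n".join(parts)
```

As the slide notes, the abstract is not in this pipeline: it would still have to be pasted in by hand from the scanned application.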
21
IAR Potential Features (Longer Term – Still Conceptual)
- The system could become a single point of access for all materials associated with a review meeting and give the reviewer an application-by-application option for: viewing the application online and/or printing locally; requesting a CD; requesting hardcopy of an assigned application
- Production and shipping of CDs and hardcopy could be automated and centralized
- The model will work best when most applications, including appendices, are submitted electronically
22
Electronic Tools: QVR Person Search
Considerations when recruiting reviewers:
- Ethnic, geographic, and gender diversity
- Expertise
- History of funding from NIH or other competitive sources
- Publication history
- Conflict of interest
- NIH review experience
- NIH application experience
- Academic degrees and professional certifications
- Willingness to review
- Small business expertise (for SBIR applications)
- Clinical experience
- Fairness
23
QVR Overview
- QVR is one of several search and reporting tools for IMPAC II
- Data are refreshed overnight
- Available to all personnel with IMPAC II credentials
- Two main modes:
  - Main – search for applications (e.g., all NCI R21s reviewed by CSR in 2003 that were unscored)
  - Person Search (e.g., all MDs from Arkansas who served on an SBIR panel in the past 3 years)
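A person search like the example above ("all MDs from Arkansas who served on an SBIR panel in the past 3 years") is conceptually a conjunction of filters over person records. The sketch below illustrates that idea with made-up records and field names; QVR's actual data model is not shown in this presentation.

```python
# Hypothetical person records; the field names are illustrative,
# not the real IMPAC II / QVR schema.
people = [
    {"name": "Dr. A", "degree": "MD", "state": "AR",
     "service": [{"panel": "SBIR", "year": 2004}]},
    {"name": "Dr. B", "degree": "PhD", "state": "CA",
     "service": [{"panel": "SBIR", "year": 2003}]},
    {"name": "Dr. C", "degree": "MD", "state": "AR",
     "service": [{"panel": "R01 study section", "year": 1998}]},
]

def person_search(records, degree, state, panel, since_year):
    """Return names matching every criterion, the way a QVR person
    search narrows a candidate pool with conjoined filters."""
    return [p["name"] for p in records
            if p["degree"] == degree
            and p["state"] == state
            and any(s["panel"] == panel and s["year"] >= since_year
                    for s in p["service"])]
```

The onscreen-display, Excel, and CSV outputs described on the next slide would all be renderings of a result list like this.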
24
QVR Person Search
- Data span the past 12 years
- All people who have applied for a grant
- All people who have served as a reviewer (a reviewer may or may not have application history)
- All people who have served in some other capacity, such as a National Advisory Council or the Board of Academic Counselors
- Results available as an onscreen display with drilldown, an Excel spreadsheet download, or a comma-separated values (CSV) file download
25
QVR Person Search Criteria: Name/Location and Degrees/Expertise
26
QVR Search Criteria: Committee Service
27
QVR Search Criteria: Grant History
Never applied; applied but not funded; funded; active
28
QVR Search Results List Link drills down to person details
29
QVR Person Information Drilldown: Committee Service
- Link to roster
- Link to SRA e-mail
30
QVR Person Information Drilldown: Contact Information
31
QVR Person Information Drilldown: Grant History
- Links to documents
- Review status and outcome
32
QVR Person Information Drilldown: Expertise
33
QVR Person Search Advantages
- Excellent user interface: requires no programming expertise, yet very flexible and fast
- If a person is in IMPAC II, provides data on: geographic distribution, expertise, history of funding from NIH, publication history, NIH review experience, NIH application experience, academic degrees
34
QVR Person Search Limitations
- Ethnic and gender data are restricted and cannot appropriately be displayed in QVR, because the tool is available to all IMPAC II users
- Does not provide access to people who have had no contact with NIH as an applicant or through committee service, which impedes recruitment of potential reviewers from other scientific communities
- Publication data are limited to PubMed citations, so publications in some disciplines (e.g., materials science, behavioral science) are not well captured; a PubMed search also does not work well for people with very common names
- Limited by IMPAC II data quality: contact information may be out of date, and expertise terms are entered by SRAs or ESAs rather than a taxonomic expert and are not reliably maintained
- Does not capture less tangible aspects of review, such as fairness, professional reputation, and willingness to serve
35
Person Searching: The Future
- Augmenting the IMPAC II database with automated analysis of scientists' expertise, based on semantic analysis ("fingerprinting") of grant applications, publications, and data from non-NIH sources
- Identifying potential reviewers by matching an application's semantic fingerprint to reviewers with similar fingerprints
- Using a person's semantic profile to find similar individuals
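The fingerprint-matching idea above can be illustrated as a similarity computation over term vectors: build a vector for the application, build one per reviewer from their publications and grants, and rank by cosine similarity. This is a minimal sketch with made-up data and a crude bag-of-words fingerprint, not the actual method under development.

```python
from collections import Counter
import math

def fingerprint(text):
    """Crude semantic fingerprint: a term-frequency vector over longer
    words. (A real system would use weighted, curated concept terms.)"""
    return Counter(w.lower() for w in text.split() if len(w) > 3)

def similarity(fp_a, fp_b):
    """Cosine similarity between two fingerprints (0.0 to 1.0)."""
    dot = sum(fp_a[t] * fp_b[t] for t in fp_a if t in fp_b)
    norm = (math.sqrt(sum(v * v for v in fp_a.values()))
            * math.sqrt(sum(v * v for v in fp_b.values())))
    return dot / norm if norm else 0.0

def rank_reviewers(application_text, reviewer_texts):
    """Rank candidate reviewers by fingerprint similarity to an
    application. `reviewer_texts` maps a name to that person's
    publication/grant text."""
    app_fp = fingerprint(application_text)
    return sorted(reviewer_texts,
                  key=lambda name: similarity(app_fp,
                                              fingerprint(reviewer_texts[name])),
                  reverse=True)
```

The third bullet (finding similar individuals from a person's profile) is the same computation with a reviewer's fingerprint, rather than an application's, as the query.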
36
Meeting and Reviewer Management Systems Developed by SRAs
- Complement enterprise systems
- Informally developed, distributed, and supported; often start as a lone effort and become informal group efforts
- Tend to address: information regarding an SRA's reviewers (expertise, address, CV, etc.); balancing reviewer workload; note-taking at the meeting
- Often implemented in Excel, Access, or FileMaker Pro
- Often based on downloading data from IMPAC II
37
Meeting and Reviewer Management Systems: Access-Based Example – Application Screen
- Area for notes taken at the meeting
- Human subjects codes
- Pre-meeting notes
- IMPAC II data in white
38
Meeting and Reviewer Management Systems: Access-Based Example – Reviewer Data Screen
39
Meeting and Reviewer Management Systems: Access-Based Example – Reviewer Workload Reports
- Workloads for SBIR meetings
- Workloads for regular meetings
40
Electronic Tools: Under Evaluation
- Dual, large displays to accommodate the move away from paper
- CSR speech recognition pilot
- Wireless broadband Internet access