Developing a Validated Tool For Evaluating Whole Slide Images

Authors
University of Pittsburgh Department of Biomedical Informatics
Dana Grzybicki, MD, PhD – Evaluation Team, Principal Investigator
Russell Silowash, BS – Evaluation Team, Research Analyst
Robb Wilson, MA – Evaluation Team, Project Manager
Leslie Anthony, MA – UPMC IMITS Telepathology Project, Project Manager

Background
Through appropriations in the defense-spending bills for 2002 and 2004, the University of Pittsburgh Medical Center (UPMC) and the United States Air Force Medical Service (AFMS) created a partnership called the Integrated Medical Information Technology System (IMITS) Program.
Telepathology is a branch of the IMITS program that implements and validates digital pathology practices.

Introduction
Purpose of evaluation research: independent examination of questions related to technical validity, feasibility, and effectiveness.
To our knowledge, there are no generally available validated tools for evaluating whole slide image (WSI) cases.
The UPMC Digital Pathology Imaging Group is working on the validation of a unique evaluation tool.

Participants
The validation of this tool was part of our telepathology evaluation project, which involved 5 UPMC pathologists:
– 2 pathology fellows
– 3 staff pathologists with training in genitourinary (GU) pathology

Case Selection
30 difficult prostate biopsy foci:

Diagnosis                Number of Cases
Adenocarcinoma           12
Atypical                 6
Atypical PIN (ATYPIN)    1
High Grade PIN           3
Benign                   8

Hypotheses
Content validity – the assessment is asking the proper questions for the study at hand.
Internal validity – there will be a positive correlation between the number of slides/images in a case and the time needed to complete the case.

Hypotheses (continued)
External validity – whole slide image quality will be positively correlated with glass slide quality.
Construct validity – there will be a negative correlation between the diagnostic confidence of a participant and the case complexity rating.

Content Validity Results
Content validity was established through feedback from pathologists who are part of the Digital Pathology Imaging Group (DPIG).

Internal Validity Results
The predicted correlation was statistically significant for only one participant (r² = 0.327, p < 0.01).
Our inability to demonstrate internal validity for most of the participants was most likely due to time categories that were too broad.
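
For reference, a per-participant correlation of this kind could be computed roughly as sketched below. This is a minimal sketch using hypothetical data and scipy.stats.pearsonr, assuming the time-to-complete bins are coded as ordinal integers; it is not the project's actual analysis code.

    # Minimal sketch of the internal-validity check: per-participant correlation
    # between the number of slides/images in a case and the coded time-to-complete
    # category. The values below are hypothetical placeholders, not study data.
    from scipy.stats import pearsonr

    slides_per_case = [1, 2, 2, 3, 4, 4, 5, 6, 6, 8]   # slides/images in each case
    time_category   = [1, 1, 2, 2, 2, 3, 3, 4, 4, 5]   # ordinal code of the time bin

    r, p_value = pearsonr(slides_per_case, time_category)
    print(f"r^2 = {r ** 2:.3f}, p = {p_value:.4f}")    # the slide reports r^2 and p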

External Validity Results
A positive correlation exists between WSI quality and glass slide quality.
There were statistically significant positive correlations for 3 of the 5 subjects.
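
Because the quality scores are ordinal ratings, a rank correlation is one reasonable way to test this relationship. The sketch below uses scipy.stats.spearmanr on hypothetical 1–5 ratings; the slide does not state which correlation statistic was actually used, so this is illustrative only.

    # Minimal sketch of the external-validity check: per-subject rank correlation
    # between WSI quality ratings and glass slide quality ratings.
    # The 1-5 ratings below are hypothetical, not study data.
    from scipy.stats import spearmanr

    wsi_quality   = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]   # WSI quality rating per case
    glass_quality = [4, 5, 3, 5, 2, 5, 3, 3, 4, 4]   # glass slide quality rating per case

    rho, p_value = spearmanr(wsi_quality, glass_quality)
    print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")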

Construct Validity Results
A negative correlation exists in the WSI phase of the study between confidence in diagnosis and case complexity.
3 of 5 pathologists had statistically significant positive correlations in the WSI phase.

Construct Validity Results (continued)
A negative correlation exists in the glass phase between case complexity and diagnostic confidence.
3 of 5 pathologists had statistically significant positive correlations in the glass phase.

Summary
We were able to establish content, external, and construct validity; however, internal validity has not yet been definitively established.
– Low internal validity could be due to the time categories being too broad.

Next Steps
Change the categories for the time variable.

Original categories – Time to Complete (Minus Interruptions):
Less Than 15 Minutes
15 – 30 Minutes
30 – 45 Minutes
45 – 60 Minutes
Over 60 Minutes

Revised categories – Time to Complete (Minus Interruptions):
Less Than 30 Seconds
30 Seconds – 1 Minute
1 – 2 Minutes
2 – 5 Minutes
5 – 7 Minutes
7 – 10 Minutes
10 – 12 Minutes
12 – 15 Minutes
Over 15 Minutes

Implement an automatic timing solution (see the sketch below).
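
As an illustration of these two steps, a minimal sketch follows: it maps an automatically measured elapsed time onto the revised bins and shows a bare-bones per-case timer. The CaseTimer helper and the exact bin boundaries in code are assumptions for illustration, not the project's implementation.

    # Minimal sketch of the proposed next steps: bin an automatically measured
    # elapsed review time (minus interruptions) into the revised categories, and
    # capture that time with a simple start/stop timer. Illustrative only.
    import time

    REVISED_BINS = [              # (upper bound in seconds, category label)
        (30,      "Less Than 30 Seconds"),
        (60,      "30 Seconds - 1 Minute"),
        (2 * 60,  "1 - 2 Minutes"),
        (5 * 60,  "2 - 5 Minutes"),
        (7 * 60,  "5 - 7 Minutes"),
        (10 * 60, "7 - 10 Minutes"),
        (12 * 60, "10 - 12 Minutes"),
        (15 * 60, "12 - 15 Minutes"),
    ]

    def time_category(elapsed_seconds: float) -> str:
        """Map an elapsed time in seconds to a revised category label."""
        for upper_bound, label in REVISED_BINS:
            if elapsed_seconds < upper_bound:
                return label
        return "Over 15 Minutes"

    class CaseTimer:
        """Bare-bones timer; start() when a case is opened, stop() when finished."""
        def __init__(self):
            self._started_at = None
            self.elapsed = 0.0

        def start(self):
            self._started_at = time.monotonic()

        def stop(self):
            self.elapsed += time.monotonic() - self._started_at
            self._started_at = None

    # Example: a case reviewed in 4 minutes 10 seconds falls in "2 - 5 Minutes".
    print(time_category(250))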

Next Steps (continued)
Continue the validation of this tool by obtaining additional data and testing internal validity using the modified time variables.

Funding
This work was supported by funding from the U.S. Air Force, administered by the U.S. Army Medical Research Acquisition Activity, Fort Detrick, Maryland (Award No. W81XWH and Contract No. DAMD ). The content of the information does not imply U.S. Air Force or Government endorsement of factual accuracy or opinion.