From Learning Objectives to Outcomes Assessment During Simulation Paul E. Phrampus Director Winter Institute for Simulation, Education and Research (WISER) University of Pittsburgh
Why What When Where How Assessment
The Future of Simulation………
What is a simulation for? 1. Education 2. Assessment 3. Fun 4. 1 and 2
TEACHING vs. TESTING The Simulation Advantage: Psychomotor Skills, Decision Making, Skills and Decisions, Individuals, Teams
What is Simulation….. Really…..
The Job of Healthcare Educators Psychomotor Skills Communications Skills Professionalism Skills Decision Making Base Knowledge Teamwork Skills
Why is this so hard?
Learning Objective You must have a clear reason why you are running any simulation.
What is the most important aspect to debrief?
Where does a simulation fit? It depends on what you're doing!
EACH Sim is part of a bigger plan!
What is this?
Does it belong here?
Course (Global) objectives vs. sim objectives
Global Objectives Course Objectives – 1. Demonstrate a proper patient assessment – 2. Demonstrate following the emergency protocol – 3. Demonstrate Airway Management Skills
Assessment and Evaluation Assessment and evaluation are often used interchangeably. Assessment – learner outcomes. Evaluation – course or program outcomes
Rate the Performance Pass or Fail Score Performance Procedural Check
Assessment Why What When Where How Rosen, MA et al. Measuring Team Performance in Simulation-Based Training: Adopting Best Practices for Healthcare. Sim Healthcare 3:2008;33–41.
ACTION PERFORMANCE COMPETENCE KNOWLEDGE Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990 Sep;65(9):S63-7.
Clinical Competence & Performance To attain competent performance – basic Knowledge, Skills & Attitudes required Competence – is the application of specific KSAs. Performance – is the translation of competence into action
Performance Assessment Basic to performance – Do they know it and know how? Competence – Can they do it? Performance – Do they do it?
DOES SHOWS HOW KNOWS HOW KNOWS Knows When Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990 Sep;65(9):S63-7.
DOES SHOWS HOW KNOWS HOW KNOWS Knows When – Knows: Knowledge Tests (MCQs, Essay, Oral exam) – Knows How: Clinical-Based Tests (Pt Mgt Qs, Essay, Oral) – Shows How: Performance Tests (OSCE, Task Demo, SIM/SP) – Does: Practice Review (Audits, Video, UC-SP, 360°) Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990 Sep;65(9):S63-7.
Assessment Formative Assessment – Lower stakes assessment – One of several assessments over time of course or program – May be evaluative, diagnostic, or prescriptive – Often results in remediation or progression to next level Summative Assessment – Higher stakes assessment – Generally final course or program assessment – Primary purpose is performance evaluation – Often results in a Go-No Go outcome
Variables in Assessment: STUDENT, EVALUATOR, PATIENT, ASSESSMENT
Assessment Validity Validity – degree to which an assessment tool measures what it is intended to measure – Construct validity – accuracy of the tool in measuring the intended construct Concurrent validity Content validity Predictive validity
Assessment Reliability Test Reliability – consistency of a test to yield the same measure over time (does not assure validity) – Test-Retest Reliability: consistency among different administrations – Parallel Forms Reliability: use a different pre- and posttest to assure that memory effects do not occur – Inter-Rater Reliability: two or more observers rate the same subjects with strong positive correlation of their ratings
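Inter-rater reliability is often summarized with Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal from-scratch sketch, using hypothetical pass/fail checklist ratings from two observers (the rating lists and function names are illustrative, not from the course):

```python
# Cohen's kappa computed from scratch (no external libraries).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters scored the same.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical ratings: 1 = item performed correctly, 0 = not performed.
rater_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_2 = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.52
```

Here the raters agree on 8 of 10 items (80%), but because both marked mostly passes, chance agreement is high (58%), so kappa lands at a moderate 0.52, which is why kappa is preferred over raw agreement for slide-deck claims like "good inter-rater agreement."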
Variables in Assessment: STUDENT, EVALUATOR, PATIENT, ASSESSMENT
Assessment Improvement Improving Reliability and Validity – Base assessment on outcome/objectives > event triggers > observable behavior > behavioral rating > assess against competencies – Define performance – Use of Rubric or rating metric – Use (video) training examples of performance – Employ a quality assurance/improvement system
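The chain on this slide (outcome/objectives > event triggers > observable behavior > behavioral rating) can be sketched as a small data structure. Everything below is a hypothetical illustration: the event triggers and behaviors are invented, and only the objective text comes from the course-objectives slide.

```python
# Event-based assessment sketch: each course objective is assessed via
# scenario events that should trigger an observable, rateable behavior.
from dataclasses import dataclass, field

@dataclass
class TargetedEvent:
    trigger: str            # scenario event that should elicit the behavior
    expected_behavior: str  # observable behavior the rater looks for
    observed: bool = False  # rater marks this during the scenario

@dataclass
class ObjectiveAssessment:
    objective: str
    events: list = field(default_factory=list)

    def score(self):
        """Fraction of expected behaviors actually observed."""
        if not self.events:
            return 0.0
        return sum(e.observed for e in self.events) / len(self.events)

airway = ObjectiveAssessment(
    objective="Demonstrate Airway Management Skills",
    events=[
        TargetedEvent("SpO2 falls below 90%", "Applies bag-valve-mask", True),
        TargetedEvent("Patient loses consciousness", "Opens airway < 5 sec", True),
        TargetedEvent("Difficult intubation", "Calls for help", False),
    ],
)
print(round(airway.score(), 2))  # 0.67
```

Tying each rating to a concrete trigger-behavior pair is what makes the assessment reproducible across raters, which is the point of the reliability-improvement list above.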
Assessment Based on Objectives and Events Rosen MA, et al. A Measurement Tool for Simulation-Based Training in Emergency Medicine: The Simulation Module for Assessment of Resident Targeted Event Responses (SMARTER) Approach. Sim Healthcare 3:2008;
Event – Behavior – Scale – Report Rosen MA, et al. Measuring Team Performance in Simulation-Based Training: Adopting Best Practices for Healthcare. Sim Healthcare 3:2008;33-41.
Assessment Metrics Procedural or Check List assessment Global Rating assessment
Assessment Metrics Procedural or Check List assessment, example (BCLS, Y/N per item): Open Airway; Check Breathing. Timed version: Open Airway (< 5 sec of LOC); Check Breathing (< 5 sec of Airway). Rating-scale version: each item scored +1 / 0 / *Assist
Assessment Metrics Global Rating assessment, examples (Code Blue): Pass/Fail rating of CPR and ACLS; point totals (CPR points, ACLS points); High/Medium/Low rating of CPR and ACLS. Rating score: +1 / 0 / Pts.
Assessment Metrics Procedural or Check List assessment Global Rating assessment Pros and Cons to each Multiple modes & points vs. Single assessment QA/QI Metric – collective vs single assessment
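One way to see the trade-off above: a checklist yields multiple data points that can be aggregated for QA/QI, while a global rating is a single judgment. A hypothetical sketch combining both styles (the item names, cutoff, and rating scale are invented for illustration, not taken from the course):

```python
# Procedural checklist (per-item scores summed) vs. a single global rating.

def checklist_score(items):
    """Sum per-item scores: +1 performed, 0 missed (checklist metric)."""
    return sum(items.values())

def passes(items, global_rating, cutoff=0.8, min_global=3):
    """Combine both metrics: checklist fraction AND a 1-5 global rating."""
    fraction = checklist_score(items) / len(items)
    return fraction >= cutoff and global_rating >= min_global

scenario = {
    "open_airway": 1,
    "check_breathing": 1,
    "start_compressions": 1,
    "call_for_help": 0,
    "attach_monitor": 1,
}
print(checklist_score(scenario))          # 4 of 5 items
print(passes(scenario, global_rating=4))  # True
```

The per-item dictionary supports the "collective" QA/QI use (which items are missed most often across trainees?), while the global rating captures overall performance in one number.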
Brett-Fleegler, MB et al. A simulator-based tool that assesses pediatric resident resuscitation competency. Pediatrics 121(3):2008;
Gaba, DM et al. Assessment of Clinical Performance during Simulated Crises Using Both Technical and Behavioral Ratings. Anesthesiology, 89(1)July 1998:8-18.
Guise, JM et al. Validation of a Tool to Measure and Promote Clinical Teamwork. Sim Healthcare 3:2008;217–223.
Interrater Reliability of Data Collected with SimMan During Difficult Airway Simulations Andrus, Phrampus, Wang [Study design figure: 9 evaluators rated 5 scenario videos of novices]
Interrater Reliability of Data Collected with SimMan During Difficult Airway Simulations Andrus, Phrampus, Wang
Interrater Reliability of Data Collected with SimMan During Difficult Airway Simulations CONCLUSIONS: – Simulation instructors exhibited good overall inter-rater agreement but showed poor agreement for some key airway tasks. – Variability in instructor ratings may influence real-time evaluation of simulation performance. – Agreement likely could be improved with careful instructor orientation/training. Andrus, Phrampus, Wang
Evaluation Individual Trainee Evaluation – Per episode performance – Course of the Day
Evaluation Group Trainee Evaluation – Benchmark
Evaluation Instructor Evaluation – Per episode performance – Measure Against Benchmarks
Testing Points over TIME
Behind the Scenes
Summary Know why you are doing the sim Each sim is part of a bigger picture Design tools for consistency Make it easy for your faculty!