1 Impact of Simulation on Nursing Education, Clinical Learning and Outcomes
Wendy Thomson, EdD, MSN, BSBA, RN, CNE, CHSE
University of South Florida
Director, Simulation Education
Director, Master's in Nursing Education Concentration

2 Disclosures
Member of the INACSL Standards Committee, 2015-2017
No other disclosures

3 Objectives By the end of this session, you should be able to:
Identify the key findings of the NCSBN National Simulation Study
Interpret the findings of the NCSBN Simulation Study for your nursing program and nursing education
Explain how the INACSL Standards of Best Practice: Simulation are to be used within simulation programs
Discuss the potential implications of the NCSBN findings

4 Simulation “…a technique, not a technology, to replace or amplify real experiences with guided experiences, often immersive in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion” (Gaba, 2004, p. i2)

5 Why Simulation? Enormous amount of content and skills to be taught
Programs getting shorter
Decrease in clinical time
Limited clinical settings
Limited ability to engage in the role of the nurse
Increase in patient complexity
Disparity between “entry into practice” and “reality”

6 What do we know about simulation?
Simulation-based education shouldn't be an extraordinary activity added onto an already overloaded curriculum. Simulation must start at the beginning and be built into the normal training programs of various healthcare providers. Until now, we have lacked proof that this is a good thing to do. Finally, the NCSBN National Simulation Study has results we can use to justify what we have done and what we already know.

7 NCSBN Simulation Study - 2013
Up to 50% simulation can be effectively substituted for traditional, business-as-usual clinical experiences
Up to 50% simulation can be effectively used in various programs, in different geographic locations, with good outcomes
Simulation usage doesn't affect NCLEX pass rates

8 Caveats/Recommendations
Simulation can be substituted for clinical experience only when:
High-quality simulation scenarios are used
Faculty are dedicated and trained
There are adequate numbers of simulation faculty to support learners
Debriefing methods are grounded in best educational practice
INACSL Standards of Best Practice are followed
Equipment and supplies create a realistic environment

9 Other findings
The 50% group was stress-inoculated, so those participants had less stress in practice.
There was a statistically significant difference on the end-of-course ATI™ examinations between the groups, with the 50% group scoring better than the other two groups.
There were more failures in the control group and the fewest in the 50% group; however, the difference was not statistically significant.

10 High Quality Simulation Scenarios
Scenarios followed the NLN/Jeffries Simulation Framework
Ensures scenarios all have the same elements
Consistency
Published scenarios vs. home-grown?
The framework draws on Chickering and Gamson's principles: time on task, a consistent approach, and prompt feedback.

11 High Quality Simulation Scenarios
What does this mean for teaching and faculty?
Use valid and reliable scenarios within your organization
Vet scenarios through a task force or committee
All scenarios should have the same components
Students know what to expect because they have encountered the format before
Consistency
Faculty need training; don't assume they already know how

12 Faculty are dedicated and trained
Dedicated faculty create consistent experiences
Outcomes can be measured
Mandatory faculty training in facilitation and debriefing
Select a debriefing method for your institution
Ask the faculty what method they use!
Ask how they know they are using it correctly!
Have a subject matter expert (SME) teach the debriefing method

13 Faculty are dedicated and trained
Practice debriefing with realistic simulated learning experiences
Evaluate debriefing via peer and expert evaluation
DASH rating tool score of 7 or higher
Continuous monitoring of faculty debriefing skill competency, twice per semester

14 Faculty are dedicated and trained
Prevents the “diminishing returns phenomenon”: if we do the same thing too much, we start to feel burned out and may even start performing poorly.
Avoid grandfathering faculty; even the most experienced faculty member, nurse, or provider gets complacent.
Ensure they follow your institution's way of debriefing.

15 Debriefing methods are grounded in best educational practice
Select a debriefing method; everyone should have a say, but use more than plus/delta
Train the dedicated faculty on the debriefing method
Don't assume a CEU offering or one to three conferences makes them competent!
Don't grandfather anyone in on the grounds that they have done this for a long time
Baseline all of your simulation faculty

16 Debriefing methods are grounded in best educational practice
Validate competency in debriefing skills
DASH
Objective Structured Assessment of Debriefing (OSAD) (Paige, Arora, Fernandez, & Seymour, 2015)
Without continuous quality improvement and periodic retraining, we can't be sure faculty maintain their level of skill
Inter-rater reliability
Law of diminishing returns (also known as the diminishing returns phenomenon)
Validate each facilitator every semester, or at least every academic year

17 Poor Debriefing (Silberman, 2007, p. 70)
Clichéd conversations with no questioning or learning
Meandering discussion going wherever the most dominant people happen to take it
Paralysis by analysis, with learning stagnating at the investigation stage
Post-mortems producing a distorted negative bias that drains energy
Jumping to false conclusions by missing out significant stages
Future planning that is not well grounded in what was learned from the experience
Chaos and conflict, with people out of sequence with one another (while one person is talking about the future, another is still “in the exercise,” another is speaking her mind, another is excited about a personal insight, and so on!)

18 INACSL Standards of Best Practice: Simulation are followed
Standard I – Terminology – Provide Consistency
Standard II – Professional Integrity of Participants
Standard III – Participant Objectives – Clear and Measurable
Standard IV – Facilitation – Multiple Methods
Standard V – Facilitator – Proficiency
Standard VI – Debriefing Process – Improve Practice Through Reflection
Standard VII – Participant Assessment and Evaluation
Standard ?? – Simulation Design
Standard ?? – IPE

19 Highlights of the Standards
Provide honest and clear feedback in an effective, respectful manner
Learners should receive/provide timely, constructive feedback
Protect the scenario content
Measurable objectives that address the domains of learning and are achievable based on the knowledge and level of the learner
Facilitator has formal education/training in simulation-based learning
Debriefing is facilitated by someone competent in a structured debriefing framework who observed the scenario
Use formative assessment and summative evaluation as appropriate

20 INACSL Standards of Best Practice: Simulation
First published in 2011; revised in 2013
Two new standards being released in 2015
Update coming soon
Very much aligned with the findings of the NCSBN study

21 Equipment and supplies to create a realistic environment
There is enough evidence to support high-fidelity simulation
It isn't about the manikin but about the sights, sounds, and equipment that make the place real to a participant
Elicits an emotional response to what is happening
Taps the senses in a way that pretending cannot

22 Implications
Dedicated, formally trained faculty:
Policy implications for BONs
Financial implications for institutions
Use of theory-based debriefing methods
What do schools choose from?
Cost of training

23 Implications
Adequate numbers of simulation faculty to support the learners
Workload
Cost
Equipment and supplies to create a realistic environment
Space
What about advanced practice education? Can those programs substitute any clinical hours with simulation based on the NCSBN findings?

24 Do we change everything based on one study?
Is it simulation that forced the best learning and teaching practices?
Is it the infrastructure that forced the best learning and teaching practices?
Why was there no difference? (Anecdotal)
Why were the participants equally clinically competent?
How controlled were the sites? Each program had its own curriculum and chose where and when to integrate simulations.

25 What are the Next Steps?

26 Questions?

27 References
Gaba, D. (2004). The future vision of simulation in health care. Quality and Safety in Health Care, 13(Suppl. 1), i2-i10.
Hayden, J. K., Smiley, R. A., Alexander, M., Kardong-Edgren, S., & Jeffries, P. R. (2014). Supplement: The NCSBN National Simulation Study: A longitudinal, randomized, controlled study replacing clinical hours with simulation in prelicensure nursing education. Journal of Nursing Regulation, 5(2), C1-S64.
International Nursing Association for Clinical Simulation and Learning. (2013, June). Standards of Best Practice: Simulation. Clinical Simulation in Nursing, 9(6S), S3-S11.
Silberman, M. (2007). The handbook of experiential learning. San Francisco, CA: John Wiley & Sons.

