Opening the "black box" of PDSA cycles: Achieving a scientific and pragmatic approach to improving patient care Chris McNicholas, Professor Derek Bell,

Similar presentations


Presentation on theme: "Opening the "black box" of PDSA cycles: Achieving a scientific and pragmatic approach to improving patient care Chris McNicholas, Professor Derek Bell,"— Presentation transcript:

1 Opening the "black box" of PDSA cycles: Achieving a scientific and pragmatic approach to improving patient care
Chris McNicholas, Professor Derek Bell, Dr Julie Reed
National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) Northwest London, Imperial College London; The Health Foundation, United Kingdom
Funded by CLAHRC NWL and The Health Foundation
Notes: CLAHRC NWL is…. Practical experience of improvement methods…

2 Types of evidence
Theoretical evidence – Key questions: How and why does it work? What is the underlying 'programme theory'? Sources of evidence: descriptions of the methodology's intended mechanism of action, setting out the programme logic or intended causal sequence and drawing on appropriate social science theory.
Empirical evidence – Key questions: When, for whom and how well does it work? What effects does it have? What does it cost? Sources of evidence: qualitative and quantitative evaluations of the methodology's implementation, using rigorous and robust comparative methods to quantify effects, and undertaken independently.
Experiential evidence – Key questions: What is it like to use? What has been learned about its application in a wide variety of settings or contexts? Sources of evidence: descriptive accounts of the methodology in use, synthesis of practitioner experience and feedback, collation of learning and interchange among networks of users.
A more sceptical and scientifically rigorous approach to the development, evaluation and dissemination of QI methodologies is needed, in which a combination of theoretical, empirical and experiential evidence is used to guide and plan their uptake.
Notes: Similarity of improvement methods. Need for theoretical, empirical and experiential evidence. Improvement science research, not the outcomes of a single project. Need a broader, overarching look at improvement methods.

3 Outline
Theory of PDSA (systematic review)
Reality of PDSA (international observational study)
Revisiting theory
Notes: Start with theory; present our preliminary results; revisit thinking about theory. PDSA – empirical – experiential.

4 PDSA cycles – Why the interest?
The Plan-Do-Study-Act (PDSA) cycle is a common tool for testing changes in a pragmatic and scientific fashion. The effectiveness and application of the method, however, vary, with a lack of adherence to key principles (1). This research examines facilitators of, and challenges to, achieving good-quality use in improvement initiatives, and the influence of organisational context (2).
Notes: Why PDSA – crux of change, scientific method, popular, varied effectiveness, little overarching view of its application. What the systematic review was – theory – say 5 things. Findings – brief/variation. Need to unpack the black box.

5 Research Questions
What are the perceived functions and benefits of the PDSA method?
Are these functions and benefits applied in practice?
How do social and organisational contextual factors influence the improvement work and the use of PDSA in practice?
Notes: Read questions.

6 An International Observational Study
4 international sites: specific improvement initiatives, organisational improvement support, broader organisational context.
Methods: interviews (n=65), observations (70 hours), focus groups (n=6), document analysis (PDSA cycles).
Technical, social and contextual research lenses.
A qualitative observational study comparing improvement initiatives in healthcare organisations based in Australia, the UK and the USA. Data collection included non-participant observation (n=70 hours), document analysis, semi-structured interviews (n=65) and focus groups (n=6). Participants included improvement initiative members and organisation leaders. Thematic analysis included both deductive coding to build on established theoretical frameworks (1)(2) and inductive coding to support the identification of new themes and concepts. MUSIQ: Model for Understanding Success in Quality (Kaplan et al, 2012).
Notes: International qualitative observational study; spending a prolonged period of time with frontline improvement teams. Theory – technical (systematic review), social (boundary objects/communities of practice), context (MUSIQ) – super quick – only for those in the know.

7 Observing the reality of PDSA
Features compared across Org 1–4: documentation of cycles; iterative cycles; starting testing on a small scale; use of regular data over time; prediction-based tests of change.
Selected themes from preliminary analysis:
Using quantitative data to inform progression of cycles
Managing the complexity of emergent learning – scaling up and iterative cycles
Social factors influencing PDSA use
All organisations recognised the value of PDSA cycles to structure tests of change and to empower staff to lead change in complex environments.
Notes: Varied compliance with technical theory. Documentation – teams question its value, which results in low completion. Our observations reveal the intricacy of PDSA in practice. The 3 themes we have selected to present today. Phrase as related to audience – "I'm sure you have experienced the challenges of…" type thing.
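The five features listed above can be captured as a simple per-cycle record. The sketch below is a hypothetical illustration in Python (not an instrument used in the study): it scores one documented cycle against the features so that compliance with the method's technical theory could be summarised per cycle.

```python
# Illustrative sketch only: a per-cycle record of the five PDSA features
# observed in the study's comparison table. All names are invented here.
from dataclasses import dataclass

@dataclass
class PDSACycle:
    documented: bool          # documentation of the cycle
    iterative: bool           # part of an iterative series of cycles
    small_scale_start: bool   # testing started on a small scale
    regular_data: bool        # uses regular data over time
    prediction_based: bool    # test of change driven by an explicit prediction

    def compliance(self) -> float:
        """Fraction of the five features this cycle satisfies."""
        features = (self.documented, self.iterative, self.small_scale_start,
                    self.regular_data, self.prediction_based)
        return sum(features) / len(features)

# Example: a cycle that was documented, iterative and small-scale,
# but lacked regular data and a prediction, satisfies 3 of 5 features.
cycle = PDSACycle(True, True, True, False, False)
```

A record like this makes "varied compliance with technical theory" concrete: each documented cycle yields a score, and low-scoring features (such as documentation itself) can be counted across an organisation's portfolio.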

8 Using quantitative data to inform progression of cycles
"The nitty gritty of having data metrics in a database… we had a concept, we revisited it and we said we need this; by the way, that data isn't currently being captured… that had to be designed, that had to be added, the data started being collected… and maybe three, four, five months goes by when all that is happening and now our data just started last week… And then, of course, the physicians will get frustrated, because it's, like – 'I thought we defined this months ago'."
Delays in turning a data concept into a collected metric prevented effective learning and iteration, as teams could not effectively pause and conduct a "study" stage.
Notes: Data concept to data metric takes time. Disappointed physician – read quote.
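Once regular data over time does exist, a common way QI teams judge whether it should inform progression is a run-chart rule. The sketch below is a generic illustration of one widely taught rule (not the study's analysis, and the data are invented): a "shift" of six or more consecutive points on one side of the baseline median suggests a non-random signal worth acting on in the next cycle.

```python
# Generic run-chart "shift" rule (illustrative sketch, hypothetical data):
# six or more consecutive points on one side of the baseline median
# suggest a non-random change in the process.
import statistics

def shift_detected(values, baseline, run_length=6):
    """Return True if `values` contain a run of `run_length` consecutive
    points all above, or all below, the median of `baseline`."""
    median = statistics.median(baseline)
    run, prev_side = 0, None
    for v in values:
        if v == median:
            continue  # by convention, points on the median neither extend nor break a run
        side = v > median
        run = run + 1 if side == prev_side else 1
        prev_side = side
        if run >= run_length:
            return True
    return False

# Hypothetical example: baseline before the change, then post-change points.
baseline = [12, 10, 13, 11, 12]      # median = 12
post = [13, 14, 13, 15, 14, 13]      # six consecutive points above the median
```

With `post` as above, `shift_detected(post, baseline)` is true, which would support progressing or adopting the change; an alternating series would not.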

9 Scale Up and the Disappearing PDSA
[Figure: scale of testing plotted against time. "What we are taught": a smooth ramp – 1 patient, 3 patients, 5 patients, all patients for one week, all patients for another week – supported by interventions such as daily verbal reminders, reminders in notes and formal education, with data collected throughout.]
Notes: Not linear – complex/messy. Cycles become blurred as scale increases. Understand why this is.

10 Unpacking a "single" large scale PDSA
[Figure: scale of testing against time, with a sphere of contextual influence ranging from Ward A and the microsystem (completion of post-take notes; Process 1 following plan; Process 2 not following plan) up to the macrosystem (doctor availability, coding of patient notes, job plans, data availability).]
PDSA cycles are typically taught as a smooth chain of cycles testing change. Changes are adopted, abandoned or adapted, and these decisions inform future cycles. The scale of testing grows as confidence of success, whatever that may be, grows.
Notes: One example – described to us as a single large-scale PDSA, but in reality multiple PDSAs taking place at different levels of influence. Think how much detail to go into – 1 or 2 examples? Positive that PDSA can help navigate this complexity. Data is part of this challenge.

11 "It's all about social skills – the technical are important but you won't be successful without social skills."
With prior QI experience – using the Plan to negotiate different perspectives: "That's actually where I think the most value comes in… you have to have a conversation with people to realise most of us don't hold with all of it, right? …that's a two-hour conversation sometimes… just getting to that point is what takes a long time, but also where… the most valuable conversation can happen."
With no prior QI experience – engagement tactics: "If I got my laptop out in the meeting and went through a PDSA, people wouldn't come back. It's a fine line between being useful and pushing people away."
Challenges observed across the stages of a cycle:
Plan – developing the plan; negotiating the ideal and real worlds; understanding context through QI and frontline staff collaboration; communicating the plan to those doing the test; unclear inclusion criteria.
Do – ensuring someone is there to 'observe' the test to maximise learning (buddy system – toilet break?); a director promoting tests; impartial observers; reliability of data being reported.
Study – learning the numbers game (learning vs performance); continual enquiry; reflection on results against predictions; whether data relate to learning and/or interventions; predictions routinely made.
Act – deciding implications for next steps; limited sphere of influence/opposing organisational initiatives; senior-level buy-in; agreeing an action plan for next steps; how to prioritise within limited scope – root cause vs symptom; plans for ramping.
Notes: Example of the social skills needed to engage – different tactics employed. When no prior experience – don't mention jargon, use words like "testing", ask people for predictions. When experienced – use as a tool to negotiate different perspectives. From an organisational perspective, PDSA was the method of choice as it was seen to empower users to take responsibility and to provide a freedom to act.

12 Enhancing the theory of PDSA
Empirical and experiential findings: PDSA as a complex socio-technical tool; PDSA as a boundary object between different groups; using quantitative data in a socially and contextually dependent world.
How can we structure the management of PDSA cycles? How do we prepare people for the reality of using PDSA? What are the generic implications for change management, learning organisations and knowledge mobilisation?
These practical examples are intended to support better use of PDSA cycles and to further ground the method in a scientific approach to improving patient care. This work outlines an important enquiry into the fidelity of use of improvement methods by comparing their application to the key principles of the method and exploring how fidelity of use is influenced by facilitators and barriers within the local context. This study can act as a research template to unpack the "black box" of PDSA cycles and other QI methods, to inform the education and conduct of QI. By furthering this field of research we can better understand how the use of improvement methods influences improvement success.

13 Structuring Complexity: Learning and Improvement
[Figure: the scale-of-testing-against-time ramp divided into three phases – "Testing" (1 patient, 3 patients, 5 patients), "Implementing" (all patients for one week, all patients for another week) and "Maintaining" – annotated with considerations at each stage: usability, applicability, scalability, current process, ability to measure, sustainability (self-sufficient). Supporting interventions again include daily verbal reminders, reminders in notes and formal education.]
Notes: What we are taught. Not linear – complex/messy. Blurred as scale increases. Understand why this is.
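The ramp from "Testing" through "Implementing" can also be expressed as a simple progression rule, as the method is typically taught: scale grows only when the previous cycle's results matched its prediction; otherwise the change is adapted and retested at the same scale. The sketch below is a hypothetical illustration (the scale steps mirror the figure; `next_scale` is an invented name, not part of any PDSA tool).

```python
# Illustrative sketch of the taught PDSA ramp (hypothetical, simplified):
# progress one step up the scale of testing only when the last cycle's
# results matched its prediction; otherwise repeat at the current scale.

def next_scale(current, prediction_met,
               ramp=(1, 3, 5,
                     "all patients for one week",
                     "all patients for another week")):
    """Return the scale of testing for the next PDSA cycle."""
    i = ramp.index(current)
    if prediction_met and i + 1 < len(ramp):
        return ramp[i + 1]
    return current  # adapt the change and retest at the same scale

# Example: a successful 1-patient test progresses to 3 patients;
# a failed 3-patient test repeats at 3 patients.
```

The study's observation is precisely that practice does not follow this tidy rule: cycles blur and multiply as scale increases, which is what makes the linear ramp a useful teaching model but an incomplete description of reality.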

14 References
Walshe, K. (2009). Pseudoinnovation: the development and spread of healthcare quality improvement methodologies. International Journal for Quality in Health Care, 21(3).
Taylor, M. J. et al. (2013). Systematic review of the application of the plan-do-study-act method. BMJ Quality & Safety.
Kaplan, H. C. et al. (2012). The Model for Understanding Success in Quality (MUSIQ). BMJ Quality & Safety, 21(1), 13-2.
Ogrinc, G., & Shojania, K. G. (2013). Building knowledge, asking questions. BMJ Quality & Safety.
Funders: National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) Northwest London; The Health Foundation.

