Application of research to quality improvement. Helen Crisp, The Health Foundation, London, UK.

Being ‘scientific’ about health care improvement 2 Improvement needs to be as ‘evidence-based’ as any other aspect of health care provision. Effective, theory-based interventions using tested methods. Change demonstrated by robust measures: process, clinical outcomes, patient experience.

Why is research important for improvement? 3 To understand what we’re trying to do. To measure what we are doing and whether it’s working. To report our work so others can learn.

Understanding what we’re trying to do 4 To boost the chances of getting desired results, improvement interventions need to: be focused on a well-defined issue; use research evidence on approaches that have been tried; build on previous work that showed good results; and learn from interventions demonstrated as ineffective.

E.g. You want to improve hand washing rates 5 What approaches have been tried elsewhere? What has shown little effect? What works? Where did it work? Is it likely to work for us?

Finding relevant research 6 It can be overwhelming! Refine your search terms. Look for systematic reviews and meta-analyses. Work with your local research leads. Sign up for regular research updates.
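As an illustration of refining a search (not part of the original slides), the sketch below queries PubMed through the NCBI E-utilities esearch endpoint and restricts results to systematic reviews and meta-analyses; the topic, field tags and result limit are assumptions to adapt to your own question.

```python
# Minimal sketch: querying PubMed via the NCBI E-utilities "esearch" endpoint.
# The search term, filters, and result limit are illustrative assumptions.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed(term: str, max_results: int = 20) -> list[str]:
    """Return a list of PubMed IDs matching the search term."""
    params = {
        "db": "pubmed",
        "term": term,
        "retmax": max_results,
        "retmode": "json",
    }
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    # Refined search: restrict a hand hygiene query to evidence syntheses.
    term = ('("hand hygiene"[Title/Abstract]) AND '
            '("systematic review"[Publication Type] OR "meta-analysis"[Publication Type])')
    for pmid in search_pubmed(term):
        print(f"https://pubmed.ncbi.nlm.nih.gov/{pmid}/")
```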

Health Systems Evidence, McMaster University, Canada 7

Other sources 8 Good practice repositories, for example: NICE Quality and Productivity proven case studies; BMJ Quality Reports; other national patient safety and quality improvement agencies.

Poll 12 To date, has your improvement work been based on research evidence? YES, TOTALLY / YES, TO SOME EXTENT / NO, WE GENERATE OUR OWN IDEAS / NOT SURE

Benefits of research-based improvement 13 Not starting from scratch. Benchmark your results against reported findings. Counters the inevitable question from clinicians when introducing a change: “What’s the evidence for this?”

14 THE EVIDENCE IS OUT THERE

Using research methods to do improvement work 15 To improve the effectiveness of implementation, improvement programmes need to be based on evaluated methods, grounded in explicit theories about how and why the intervention is expected to work.

16 Introducing ‘Theory of change’ Image credit: Sidney Harris “I think you should be more explicit here in step two”

What is a ‘Theory of change’? 17 A comprehensive description and illustration of how and why a desired change is expected to happen in a particular context. The theory of change sets out explicit statements on the components of an improvement programme (its activities or interventions) and how these are expected to lead to achieving the desired goals. Starting from the desired long-term goals, the theory of change works back from these to identify all the conditions (outcomes) that must be in place, and how these relate to one another causally, to reach the goals.
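To make that structure concrete, here is a small illustrative sketch (not from the slides) of one way to record a theory of change as plain data, working back from a hypothetical goal to outcomes, activities, and the causal assumptions that link them; all of the example content is invented.

```python
# Illustrative sketch only: recording a theory of change as structured data,
# working backwards from the goal. All example text is hypothetical.
from dataclasses import dataclass, field

@dataclass
class TheoryOfChange:
    goal: str                                             # the desired long-term change
    outcomes: list[str] = field(default_factory=list)     # conditions that must be in place
    activities: list[str] = field(default_factory=list)   # interventions expected to produce them
    assumptions: list[str] = field(default_factory=list)  # causal links to test in practice

hand_hygiene_toc = TheoryOfChange(
    goal="Sustained reduction in healthcare-associated infections",
    outcomes=[
        "Hand hygiene compliance above an agreed threshold on all wards",
        "Staff routinely prompt, and accept prompts, about hand washing",
    ],
    activities=[
        "Staff education sessions",
        "Clinical champions on each ward",
        "Empowering patients to ask staff whether they have washed their hands",
    ],
    assumptions=[
        "Poor compliance is driven mainly by awareness and culture, not missing facilities",
        "Patients feel able to prompt staff without fear of a negative reaction",
    ],
)
```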

[Theory of change diagram: managing chronic conditions (pain, fatigue, shortness of breath). Activities (advertising campaign, pain clinics, physical activity workshops, dietary and nutritional advice, psychotherapy, smoking cessation workshops) lead through outputs and outcomes (self-referrals, equitable access to resources and interventions, fewer visits to surgery, reduction in medications over time, recognising and reporting of adverse effects, lower BMI, individuals feeling more empowered, accuracy in self-administering, reduction in pain, fewer negative emotions) to the goal: behaviour change, with chronic conditions more effectively managed. Enabling factors: interventions targeted at a specific population; local stakeholder buy-in to champion; continued funding to sustain; effective monitoring and evaluation.]

Developing a theory of change 19 Keep it simple: use language that’s easy to understand; keep your diagram on one side of paper/screen view. Keep it relevant: focus on the key elements of the intervention; explore the assumptions linking action and expected outcomes. Keep it updated: return to the theory of change at regular intervals. What has changed during the implementation? Does the theoretical link between actions and outcomes hold up in practice?

Measurement is key to quality improvement research 20

Measuring improvement 21 Unless you measure, you do not know if there has been an improvement. What to measure and how to measure it? Measure key elements of the intervention; use routinely collected data where possible; be precise about data definitions; measure over time; use robust techniques such as statistical process control charts.
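For instance, a series of monthly compliance figures can be checked against statistical process control limits. The sketch below (illustrative only, with made-up data) computes the centre line and 3-sigma limits of a p-chart for hand hygiene compliance.

```python
# Illustrative p-chart calculation for a proportion measured over time (made-up data).
# Centre line = overall proportion; control limits = p_bar +/- 3 * sqrt(p_bar*(1 - p_bar)/n).
import math

# Hypothetical monthly audits: (compliant observations, total observations)
monthly_audits = [(62, 100), (58, 100), (71, 100), (66, 100), (79, 100), (84, 100)]

compliant = sum(c for c, _ in monthly_audits)
observed = sum(n for _, n in monthly_audits)
p_bar = compliant / observed  # centre line

for month, (c, n) in enumerate(monthly_audits, start=1):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lower = max(0.0, p_bar - 3 * sigma)
    upper = min(1.0, p_bar + 3 * sigma)
    p = c / n
    flag = "SPECIAL CAUSE" if p < lower or p > upper else "common cause variation"
    print(f"Month {month}: p={p:.2f}  limits=({lower:.2f}, {upper:.2f})  {flag}")
```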

Typical measures 22 Process measures: error rates, e.g. prescription errors; compliance rates, e.g. completing a checklist. Clinical outcome measures: complication rates, e.g. of a surgical procedure; infection rates, e.g. healthcare-associated infections. Resource use: length of hospital stay; number of medications prescribed. Patient experience: involved in decisions about care; treated with dignity and respect.

Evaluating improvement interventions 23 Key evaluation question: Did the improvement intervention fulfil its intended objectives? Sub-questions: How was this achieved? What resources did it take? What unintended results were there?

Evaluation approaches 24 Summative: summarises the intervention effect at the end. Formative: findings are shared and help to shape the intervention. Rapid cycle: frequent review of the effectiveness of the intervention. Developmental: the intervention is still developing; all aspects are reviewed and changes made in response.

External evaluation 25 Often part of large-scale, national or multi-site change programmes. Independently commissioned from specialist teams. A great learning opportunity (NOT a threat!). Share experience: the good and the challenging.

More at: what-consider 26

Poll 27 Do you think it is important that quality improvement work is written up for publication? YES / NO / NOT SURE

Helping to build the evidence base 28 This section is based on presentations by Dr Kaveh Shojania, Editor-in-Chief, BMJ Quality and Safety. Better reporting of improvement work will: help spread successful improvement interventions; prevent wasted effort on repeating interventions that don’t work.

Sharing and learning more 29 Reports on improvement work need not only results but also: how the initiative was designed; the setting where it was implemented; detail on the core components; the measures and data used to assess the change; the challenges along the way and how they were overcome; and what the team would do differently in the future.

Reporting bias 30 When reporting a successful intervention, many improvement reports amount to “See, we did X!” rather than “Here’s what we had to do to achieve X”. There is a higher tendency to write up reports and submit papers and abstracts when the improvement is ‘successful’. We can learn a lot from what didn’t work.

Reporting to facilitate spread 31 Improvement reports need to provide enough detail: to convey credibly that something worked; and to give insight into the action needed to replicate the results in another clinical setting.

Credibility and replication 32 Too often, improvement reports lack important details about key components of the intervention and the institutional context, so readers cannot know if it’s worth trying in their setting. Often no information is given on barriers or problems in implementation; no improvement effort works immediately, so this absence decreases credibility.

A typical QI report 33 Introduction: Hospital infections affect thousands each year. Hospital staff do not wash their hands consistently. We implemented a multi-faceted strategy: staff education; clinical champions; empowering patients to ask staff if they have washed their hands. Methods: briefly stated design, data collection strategy and main outcomes, plus some mention of PDSA. Results: we improved hand hygiene by 50%. Discussion: patient empowerment can be effective.

What is lacking here? 34 Introduction: Hospital infections affect thousands each year. Hospital staff do not wash their hands consistently. We implemented a multifaceted strategy: staff education; clinical champions; empowering patients to ask staff if they have washed their hands. No connection between the introduction material and the specific features of the intervention.

A better approach 35 Introduction: Commonly identified barriers to hand hygiene compliance include A, B, and C. Staff education, clinical champions, and empowering patients address A, B, and C by doing X, Y, and Z. This introduction makes clear what factors explain poor hand hygiene, and it makes explicit why the intervention includes these ingredients. This “theory for the intervention” will pay off in writing the report and interpreting the results.

Is it clear what you did? 36 Methods: briefly stated design, data collection strategy and main outcomes, plus some mention of PDSA. Results: we improved hand hygiene by 50%. Discussion: patient empowerment can be effective. ‘PDSA’ needs context to make sense! Simply saying ‘We carried out three PDSA cycles’ is not informative. What did the ‘study’ of what you had ‘done’ reveal, and how did you ‘act’ as a result?

A better approach 37 Method: After the first round of staff education we reviewed the delivery mechanism and feedback from participants, using PDSA methodology. It was reported that the timing of training sessions was an issue in getting staff to attend, so the next sessions were planned with ward managers. Participants wanted more visual material to illustrate key points; these were designed with staff and used in subsequent sessions. This provides more detail, which makes the report credible. Others are likely to have the same issues and could avoid making the same mistakes.

Points to consider: 38 When do you start ‘writing up’? How to capture the key components of the improvement initiative? When barriers arise - how do you record these - and the action to overcome them?

Some tips for writing up 39 Writing always takes longer than you think, so don’t leave it to the last few weeks. Robust data collection from the outset is vital. Keep an ‘improvement diary’ to help capture information as you go along, particularly the adjustments.
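One lightweight way to keep such a diary (a sketch, not something prescribed in the slides) is to append each PDSA cycle and adjustment to a dated CSV file as you go; the file name and column choices here are assumptions.

```python
# Minimal sketch of an 'improvement diary': append dated PDSA entries to a CSV file.
# File name and column choices are illustrative assumptions.
import csv
from datetime import date
from pathlib import Path

DIARY = Path("improvement_diary.csv")
FIELDS = ["date", "cycle", "plan", "do", "study", "act"]

def log_cycle(cycle: int, plan: str, do: str, study: str, act: str) -> None:
    """Append one PDSA cycle record, creating the file with a header if needed."""
    new_file = not DIARY.exists()
    with DIARY.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "cycle": cycle,
            "plan": plan,
            "do": do,
            "study": study,
            "act": act,
        })

log_cycle(
    cycle=1,
    plan="Run first staff education session on hand hygiene",
    do="Session delivered to 18 ward staff",
    study="Timing clashed with handover; attendance lower than hoped",
    act="Plan next sessions jointly with ward managers",
)
```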

Involve others 40 Include different perspectives in your write-up, not just the improvement lead: other staff involved; staff not involved but affected by the change; service users.

Consider the audience and where to publish 41 The core information remains the same, but the emphasis differs for: a report to the funder; academic publication in a peer-reviewed journal; publication in a professional practice magazine. Beyond text: photos, videos and animations bring the work to life.

A tool to help 42 SQUIRE guidelines (Standards for Quality Improvement Reporting Excellence): a checklist of points to consider when writing up improvement work. BUT don’t leave it until you’ve completed the work; use the guidelines to consider what data to capture as you go along.

Download SQUIRE guidelines from: 45

Spreading the word 46 Professionals listen to their peers. Think of a range of approaches and use every opportunity: professional seminars; conferences; bulletins and newsletters; blogs; Twitter.

Conclusion 47 Find and use existing evidence. Actively use robust research methods. Contribute to building the evidence.
