Performance indicators: good, bad, and ugly
The report of the Royal Statistical Society working party on Performance Monitoring in the Public Services

"Performance monitoring done well is broadly productive for those concerned. Done badly, it can be very costly and not merely ineffective but harmful and indeed destructive."

Value of performance monitoring
Public sector performance monitoring plays three roles:
- Research
- Management
- Democratic

Methodology … adopt a rigorous approach

Data sources:
- Sample surveys should be designed, conducted and analysed in accordance with statistical theory and best practice
- Administrative data should be fully auditable

Concepts, questions, etc.:
- Should be comparable and harmonised where possible
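The report does not prescribe any code, but as a minimal sketch of what analysing a sample survey "in accordance with statistical theory" involves, the fragment below computes a design-weighted estimate from a stratified sample rather than a naive average of the responses. The stratum sizes and responses are invented for illustration.

```python
# Design-weighted estimate from a stratified sample (all figures invented).
population_sizes = {"urban": 80_000, "rural": 20_000}   # stratum sizes (assumed)
sample = {
    "urban": [1, 0, 1, 1, 0, 1, 0, 1],   # 1 = respondent satisfied (invented data)
    "rural": [0, 0, 1, 0, 1],
}

total = sum(population_sizes.values())
estimate = 0.0
for stratum, responses in sample.items():
    stratum_mean = sum(responses) / len(responses)
    # Weight each stratum by its share of the population, not of the sample
    estimate += (population_sizes[stratum] / total) * stratum_mean

naive = sum(sum(v) for v in sample.values()) / sum(len(v) for v in sample.values())
print(f"Design-weighted estimate: {estimate:.3f}")
print(f"Unweighted (naive) estimate: {naive:.3f}")
```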

Methodology … adopt a rigorous approach

Indicators and targets should be:
- Precise: accurate enough to show reliably when change has occurred
- Clear: defining all key concepts used, and documenting fully any changes to definitions or methods
- Unambiguous
- Consistent over time

Targets … seek practitioner input

- Motivational but irrational targets may demoralise
- Ambitious but achievable targets require:
  - a good understanding of the practicalities of delivery on the ground, based on consultation with practitioners
  - a good understanding of the data

Targets … avoid extreme value targets

- 0% or 100% targets can lead to perverse outcomes and demoralisation, and can cause disproportionate resources to be used
- An example from the report:
  - "No patient shall wait in A&E for more than 4 hours"
  - The target becomes irrelevant as soon as one patient does wait more than 4 hours
  - A&E staff may have very sound reasons for making a small number of people wait longer
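A hypothetical numerical sketch (the waiting times below are invented, not taken from the report) of why an all-or-nothing target stops discriminating: once a single patient breaches the 4-hour limit, a department with one long wait and a department with two hundred long waits fail the target equally, while a percentile indicator still tells them apart.

```python
import random

random.seed(1)

def summarise(waits, label):
    """Report the all-or-nothing target alongside a percentile indicator."""
    breaches = sum(w > 4 for w in waits)
    target_met = breaches == 0
    p95 = sorted(waits)[int(0.95 * len(waits))]
    print(f"{label}: target met = {target_met}, "
          f"breaches = {breaches}, 95th percentile wait = {p95:.1f}h")

# Department A: one patient waits just over 4 hours (invented data)
dept_a = [random.uniform(0.5, 3.9) for _ in range(999)] + [4.2]
# Department B: 200 patients wait between 4 and 10 hours (invented data)
dept_b = ([random.uniform(0.5, 3.9) for _ in range(800)]
          + [random.uniform(4.1, 10.0) for _ in range(200)])

summarise(dept_a, "Department A")
summarise(dept_b, "Department B")
# Both fail the 0%-breach target identically; only the percentile
# (or the full distribution) shows how different they really are.
```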

Targets … monitor for perverse outcomes

- Poorly conceived targets can lead practitioners to play the system rather than improve performance
- An example from the report:
  - One indicator for prisons is the number of "serious" assaults on prisoners
  - "Serious" means a proven prisoner-on-prisoner assault
  - The indicator would therefore improve if prisons reduced their investigations into alleged assaults
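A toy calculation (all figures invented) makes the gaming risk concrete: because the indicator counts only proven assaults, it "improves" when fewer allegations are investigated, even if the underlying level of violence is unchanged.

```python
# Invented figures: the true number of assaults stays constant, but the
# indicator falls purely because fewer allegations are investigated.
true_assaults = 120        # actual assaults per year (assumed constant)
proof_rate = 0.5           # share of investigated allegations that end up proven (assumed)

for investigation_rate in (0.9, 0.6, 0.3):   # share of allegations investigated
    proven = true_assaults * investigation_rate * proof_rate
    print(f"Investigate {investigation_rate:.0%} of allegations -> "
          f"indicator reports {proven:.0f} 'serious' assaults")
```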

Do not ignore … uncertainty or variability

- Single numbers provide simple answers to complex questions
- Natural variability, outliers, recording errors and sampling error (expressed as confidence intervals around sample estimates) all need to be considered
- Uncertainty and variability need to be clearly presented

An example taken from the work of David Spiegelhalter follows.
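As a minimal sketch of presenting uncertainty rather than a bare number (the counts below are invented), this fragment attaches a 95% confidence interval to an indicator estimated from a sample, using the normal approximation to the binomial.

```python
import math

n = 400            # sampled cases (invented)
successes = 308    # e.g. cases handled within target (invented)

p = successes / n
se = math.sqrt(p * (1 - p) / n)                 # standard error of the proportion
lower, upper = p - 1.96 * se, p + 1.96 * se     # normal-approximation 95% interval
print(f"Indicator: {p:.1%} (95% CI {lower:.1%} to {upper:.1%})")
# Reporting "77%" alone hides roughly +/- 4 percentage points of sampling error.
```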

We cannot be 95% confident that any hospital lies in the top quarter or the bottom quarter.

[Figure: ranks for 51 hospitals with 95% intervals, mortality after fractured hip. POST/RSS meeting on Performance Monitoring in the Public Services, March 2004]
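The figure itself cannot be reproduced here, but a small simulation (invented rates, not Spiegelhalter's data) shows the same phenomenon: when 51 hospitals have fairly similar true mortality rates, each hospital's observed league-table rank varies widely from sample to sample, so 95% intervals for ranks are very wide.

```python
import random

random.seed(0)
n_hospitals, n_patients, n_sims = 51, 200, 500
# True mortality rates spread evenly between 8% and 12% (assumed)
true_rates = [0.08 + 0.04 * i / (n_hospitals - 1) for i in range(n_hospitals)]

rank_draws = [[] for _ in range(n_hospitals)]
for _ in range(n_sims):
    # Simulate one year's observed mortality for every hospital
    observed = [sum(random.random() < r for _ in range(n_patients)) / n_patients
                for r in true_rates]
    order = sorted(range(n_hospitals), key=lambda h: observed[h])
    for rank, h in enumerate(order, start=1):
        rank_draws[h].append(rank)

for h in (0, 25, 50):   # the "best", a middling and the "worst" hospital
    draws = sorted(rank_draws[h])
    lo, hi = draws[int(0.025 * n_sims)], draws[int(0.975 * n_sims)]
    print(f"Hospital {h + 1}: true rank {h + 1}, "
          f"95% interval for observed rank {lo} to {hi}")
```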

Do not ignore … the distribution

- A performance indicator is a single number
- Single-number summaries of data can be misleading
- An example from the report:
  - "Number of patients waiting more than 4 hours"
  - The whole distribution needs to be viewed to understand the indicator
  - For example: has progress been achieved by getting most people seen in 3 hours 59 minutes, while some wait 10 hours?
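A hypothetical before-and-after comparison (all waiting times invented) shows how the headline indicator can "improve" while the distribution worsens: most patients are now seen just under 4 hours, and a small group waits far longer.

```python
def deciles(waits):
    """Selected percentiles (10th to 90th) of a list of waiting times in hours."""
    s = sorted(waits)
    return [round(s[int(q * (len(s) - 1))], 1) for q in (0.1, 0.3, 0.5, 0.7, 0.9)]

# Invented waiting times for 1,000 patients in each period
before = [1.0] * 400 + [2.0] * 300 + [3.0] * 200 + [5.0] * 100   # 10% over 4 hours
after = [3.9] * 920 + [10.0] * 80                                 # 8% over 4 hours

for label, waits in (("before", before), ("after", after)):
    over = 100 * sum(w > 4 for w in waits) / len(waits)
    print(f"{label}: {over:.0f}% over 4h, "
          f"10/30/50/70/90th percentiles = {deciles(waits)}")
# The indicator falls from 10% to 8%, yet waits have bunched just under the
# 4-hour limit and the longest waits have doubled.
```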

Do not confuse statistical significance with practical importance

- Statistical significance:
  - depends on sample size: very small differences can be statistically significant in very large samples
  - measures a property of the statistics, not the practical importance of any relationship observed
- A difference can be statistically significant but of no practical importance
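A small worked example (figures invented): with a million cases in each group, a 0.2-percentage-point difference in an indicator is highly "significant" by a two-proportion z-test yet operationally trivial.

```python
import math

n1 = n2 = 1_000_000                 # cases per group (invented)
p1, p2 = 0.800, 0.802               # e.g. share of cases handled within target (invented)

pooled = (n1 * p1 + n2 * p2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value from the normal distribution
print(f"Difference = {p2 - p1:.3%}, z = {z:.2f}, two-sided p = {p_value:.1e}")
# p is far below 0.05, yet a 0.2-point gap may be of no practical importance.
```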

Consider not setting a target until the data are well understood

- The statistical properties of an indicator will be much better understood after one or two rounds of analysis
- It may therefore be sensible to wait before setting a target

Document everything

- Others should be able to replicate procedures
- Establish a 'PM Protocol', including:
  - Objectives
  - Definitions
  - Information about context
  - Survey methods / information about the data
  - Risks of perverse outcomes
  - How the data will be analysed
  - Components of variation
  - Ethical, legal and confidentiality issues
  - How, when and where the data will be published
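One way to make such a protocol replicable is to keep it as a structured, versioned record. The sketch below is purely illustrative: the indicator, field names and contents are invented and simply mirror the checklist above, not any template prescribed by the report.

```python
# Hypothetical PM Protocol record; every value here is invented for illustration.
pm_protocol = {
    "indicator": "Percentage of A&E patients seen within 4 hours",
    "version": "2004-03",
    "objectives": "Reduce long waits without distorting clinical priorities",
    "definitions": {"seen": "first clinical assessment", "wait": "arrival to assessment"},
    "context": "Applies to all major A&E departments",
    "data_source": {"type": "administrative", "fully_auditable": True},
    "perverse_outcome_risks": ["waits bunching just under 4 hours"],
    "analysis_plan": "quarterly, with 95% intervals and the full waiting-time distribution",
    "components_of_variation": ["case mix", "seasonality", "recording error"],
    "ethics_and_confidentiality": "no patient-identifiable data published",
    "publication": {"frequency": "quarterly", "where": "public website"},
}

print(pm_protocol["indicator"], "- protocol version", pm_protocol["version"])
```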