1 Performance indicators: good, bad, and ugly
The report of the Royal Statistical Society working party on Performance Monitoring in the Public Services
http://www.rss.org.uk/PDF/PerformanceMonitoring.pdf

2 Value of Performance Monitoring
“Performance monitoring done well is broadly productive for those concerned. Done badly, it can be very costly and not merely ineffective but harmful and indeed destructive.”
- Public sector PM plays three roles:
  - Research
  - Management
  - Democratic

3 Methodology … adopt a rigorous approach
- Data sources:
  - Sample surveys should be designed, conducted and analysed in accordance with statistical theory and best practice
  - Administrative data should be fully auditable
- Concepts, questions, etc.:
  - should be comparable and harmonised where possible

4 Methodology … adopt a rigorous approach
- Indicators and targets:
  - Precise: accurate enough to show reliably when change has occurred
  - Clear: defining all key concepts used
  - Unambiguous
  - Consistent over time
  - Clear: documenting fully any changes to definitions or methods
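The “Precise” requirement is, in effect, a question about statistical power: an indicator can only show reliably that change has occurred if the change of interest is detectable given the volume of data behind it. The sketch below is a generic illustration of that idea rather than anything taken from the report; the function name, the hypothetical 85%-to-88% shift, and the significance and power settings are all assumptions.

```python
# Minimal sketch (not from the RSS report): how many observations does a
# proportion-based indicator need before a change of a given size can be
# detected reliably?  Standard normal approximation for two proportions;
# the baseline and target values below are hypothetical.
from math import sqrt
from statistics import NormalDist

def sample_size_for_change(p_before, p_after, alpha=0.05, power=0.80):
    """Observations per period needed to detect a shift from p_before to p_after."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_b = NormalDist().inv_cdf(power)           # desired power
    p_bar = (p_before + p_after) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p_before * (1 - p_before) + p_after * (1 - p_after)))
    return int(num ** 2 / (p_after - p_before) ** 2) + 1

# Hypothetical example: indicator moves from 85% to 88% "seen within 4 hours".
print(sample_size_for_change(0.85, 0.88))   # roughly 2,000 cases per period
```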

5 Targets … seek practitioner input
- Motivational but irrational targets may demoralise
- Ambitious but achievable targets require:
  - A good understanding of the practicalities of delivery on the ground … based on consultation with practitioners
  - A good understanding of the data

6 Targets … avoid extreme value targets
- 0% or 100% targets can lead to perverse outcomes, demoralisation, and disproportionate use of resources
- An example from the report:
  - “No patient shall wait in A&E for more than 4 hours”
  - The target becomes irrelevant as soon as one patient does wait more than 4 hours
  - A&E staff may have very sound reasons for making a small number of people wait longer

7 Targets … monitor for perverse outcomes
- Poorly conceived targets can lead practitioners to play the system rather than improve performance
- An example from the report:
  - An indicator for prisons is the number of “serious” assaults on prisoners
  - “Serious” = proven prisoner-on-prisoner assault
  - The indicator would improve if prisons reduced their investigations into alleged assaults

8 Do not ignore … uncertainty or variability
- Single numbers provide simple answers to complex questions
- Natural variability, outliers, recording errors, and statistical error (i.e. the confidence intervals around sample estimates) all need to be considered
- Uncertainty and variability need to be clearly presented
An example taken from the work of David Spiegelhalter …

9 Ranks for 51 hospitals with 95% intervals: mortality after fractured hip
- We cannot be 95% confident that any hospital is in the top quarter or the bottom quarter
- Source: POST/RSS meeting on Performance Monitoring in the Public Services (March 2004)
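The point of the hospital example can be reproduced in miniature: when each unit's rate is estimated from a limited number of cases, the rank of any one unit carries a wide interval. The sketch below uses entirely simulated data rather than the fractured-hip dataset, together with a parametric-bootstrap style resampling; the case counts, rates, and the cut-off of 13 for the "top quarter" of 51 hospitals are illustrative assumptions.

```python
# Minimal sketch of the idea behind the hospital-ranking slide: ranks based
# on noisy rates are themselves uncertain.  All numbers are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_hosp, n_cases, n_sims = 51, 200, 2000

# Hypothetical "true" mortality rates, fairly similar across hospitals.
true_rates = rng.uniform(0.08, 0.12, size=n_hosp)

# Observed deaths in one monitoring period.
deaths = rng.binomial(n_cases, true_rates)
obs_rates = deaths / n_cases

# Parametric-bootstrap style simulation of each hospital's rank.
sim_deaths = rng.binomial(n_cases, obs_rates, size=(n_sims, n_hosp))
ranks = sim_deaths.argsort(axis=1).argsort(axis=1) + 1   # rank 1 = lowest mortality

lo, hi = np.percentile(ranks, [2.5, 97.5], axis=0)
# How many hospitals can be placed in the top quarter (ranks 1-13) with 95% confidence?
print(int(np.sum(hi <= 13)), "of", n_hosp, "hospitals confidently in the top quarter")
```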

10 Do not ignore … the distribution
- A performance indicator is a single number, and single-number summaries of data can be misleading
- An example from the report:
  - “Number of patients waiting more than 4 hours”
  - The whole distribution needs to be viewed to understand the indicator
  - Has progress been achieved by getting most people seen in 3 hours 59 minutes, but some not for 10 hours?
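The question on the slide is easy to make concrete: with invented waiting times for two periods, the sketch below shows the headline count of waits over 4 hours falling while both the typical wait and the experience of those who still breach the target get worse. The distributions, sample size, and the 97%/3% split are hypothetical choices, not figures from the report.

```python
# Minimal sketch with invented waiting times: the indicator "number of
# patients waiting more than 4 hours" improves between periods, yet the
# typical wait and the longest waits both get worse.
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical period 1: most patients seen quickly, a moderate tail.
before = rng.gamma(shape=2.0, scale=1.0, size=n)

# Hypothetical period 2: most patients bunched just under the 4-hour target,
# a small group left waiting 8-12 hours.
after = np.where(rng.random(n) < 0.97,
                 rng.uniform(3.0, 3.99, size=n),
                 rng.uniform(8.0, 12.0, size=n))

for name, waits in [("period 1", before), ("period 2", after)]:
    over = waits[waits > 4]
    print(f"{name}: {over.size} waits over 4h, "
          f"median {np.median(waits):.1f}h, "
          f"mean wait of those over 4h {over.mean():.1f}h")
```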

11 Do not confuse statistical significance with practical importance
- Statistical significance:
  - depends on sample size: very small differences can be statistically significant in very large samples
  - measures a property of the statistics, not the practical importance of any relationship observed
- A difference can be statistically significant but not of practical importance
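A small numerical illustration makes the sample-size point: the sketch below applies a standard two-proportion z-test (normal approximation) to a hypothetical half-percentage-point difference, which is of little practical importance yet comes out highly "significant" once each group contains 500,000 cases. The scenario and counts are invented for illustration.

```python
# Minimal sketch: with a large enough sample, a practically trivial
# difference becomes "statistically significant".  Figures are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(x1, n1, x2, n2):
    """Two-sided p-value for comparing two proportions (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical example: 80.0% vs 80.5% of patients treated within target.
# The half-point difference is of little practical importance ...
print(two_proportion_p_value(400_000, 500_000, 402_500, 500_000))
# ... yet with 500,000 cases per group the p-value is far below 0.05.
```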

12 Consider not setting a target until the data are well understood
- The statistical properties of an indicator will be much better understood after one or two rounds of analysis
- It may therefore be sensible to wait before setting a target

13 Document everything
- Others should be able to replicate procedures
- Establish a ‘PM Protocol’, including:
  - Objectives
  - Definitions
  - Information about context
  - Survey methods / information about data
  - Risks of perverse outcomes
  - How the data will be analysed
  - Components of variation
  - Ethical, legal and confidentiality issues
  - How, when and where data will be published
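One way to make such a protocol concrete, not prescribed by the report, is to hold it as a structured record that travels with each indicator, so every indicator is documented the same way and the documentation can be versioned alongside the data. The sketch below is a hypothetical Python rendering; the class, its field names, and the A&E example entry are assumptions chosen to mirror the list above.

```python
# Minimal sketch (not prescribed by the report) of a 'PM Protocol' held as a
# structured record mirroring the elements listed on the slide.
from dataclasses import dataclass

@dataclass
class PMProtocol:
    objective: str
    definitions: dict[str, str]
    context: str
    data_sources: list[str]
    perverse_outcome_risks: list[str]
    analysis_plan: str
    components_of_variation: list[str]
    ethics_and_confidentiality: str
    publication_plan: str

# Hypothetical example entry for an A&E waiting-time indicator.
ae_waits = PMProtocol(
    objective="Monitor time from arrival to treatment in A&E",
    definitions={"wait": "minutes from arrival to start of treatment"},
    context="All type 1 A&E departments, reported monthly",
    data_sources=["A&E attendance records (administrative, auditable)"],
    perverse_outcome_risks=["bunching of recorded waits just under 4 hours"],
    analysis_plan="Report the full waiting-time distribution, not only the 4-hour count",
    components_of_variation=["between-hospital", "seasonal", "recording error"],
    ethics_and_confidentiality="Patient-level records anonymised before analysis",
    publication_plan="Published quarterly with confidence intervals",
)
print(ae_waits.objective)
```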

