
1 Developing measures to manage quality and safety in integrated care in New Zealand Tom Love 14 November 2013

2 Why are you measuring stuff?
Quality improvement? Performance management?
What do you want to do with measures? Peer review and professional development? Service development and integration?
Are measures linked to incentives/sanctions? For whom? Linked to public information and reporting? Linked to access to services?

3 Measuring performance: some general points
Different kinds of activity have different characteristics for observing performance; each kind of activity has its own inherent limits on what you can observe.

4 Typology of activity
Production (outputs observable, outcomes observable): tasks are generally repetitive and stable, although some of the skills may be specialised. E.g. tax collection: the activity involved in revenue collection is directly observable, and the outcome is easily measured as the total quantity of taxation collected.
Procedural (outputs observable, outcomes not observable): skills are specialised and tasks may be stable, but outcomes are unique or much delayed from the activity. E.g. an army in peacetime: training and capability activities are observable, but there is no way of establishing an outcome measure until a war happens.
Craft (outputs difficult to observe, outcomes observable): a general set of skills is applied to unique tasks, with stable, similar outcomes. E.g. audit: specifying and monitoring every detail of investigative audit activity is difficult and complex, but the outcome in terms of audit results is easy to observe.
Coping (outputs not observable, outcomes not observable): generic skills are applied to unique tasks, and outcomes can’t be evaluated in the absence of alternatives; success is often attained by trial and error. E.g. police maintaining order: the application of effort is complex and difficult to specify in advance, and there is no alternative in the real world against which to assess the outcome.

5 Key points
Outcomes are intellectually desirable measures, but there is often good reason why you can’t or shouldn’t measure them;
The nature of knowledge and evidence around health care activity is highly variable, constantly changing, and often disputed;
Services are often provided to an individual, and it is hard to know what the counterfactual would have been;
At the individual level, numbers can be too small to provide robust statistics.
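To make the last point concrete, here is a minimal sketch (not from the slides; all figures invented for illustration) of how the uncertainty around the same observed 5% rate widens as the denominator shrinks from a whole-practice population to an individual clinician’s caseload, using a 95% Wilson score interval.

```python
import math

def wilson_interval(events: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a proportion (events out of n)."""
    if n == 0:
        return (0.0, 1.0)
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (max(0.0, centre - half), min(1.0, centre + half))

# The same 5% observed rate at practice-level and individual-clinician volumes.
for events, n in [(50, 1000), (5, 100), (1, 20)]:
    lo, hi = wilson_interval(events, n)
    print(f"{events}/{n}: rate {events / n:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
```

At 50/1000 the interval is roughly 4% to 6.5%; at 1/20 it spans roughly 1% to 24%, which is why individual-level rates rarely support robust conclusions on their own.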

6 Back to: why measure?
Both of: quality improvement and performance management, in different ways at different levels.
Decentralised: quality improvement among professionals, both individually and in multidisciplinary teams;
Central view: performance management across the system and its components: the system isn’t working well unless the whole system is working well.

7 We don’t live in a vacuum
Lots of quality improvement goes on throughout the sector, but we suspect that clinical governance may be patchy, done better in some places than in others.
A fair bit of performance management goes on throughout the sector, but we suspect that it could be better aligned to quality improvement and integration.

8 So…
Need to: observe systems and processes across the complex system; have good information across the system; encourage reflection and learning across the system.
While: promoting individual professional participation in safety and quality improvement activities; and collaborative development of integrated services in a way which improves a number of things, including quality and safety.

9 What to measure, how, and who does it?

10 What to measure, and who does it?
System measures: responsibility across the District for elements of those measures; a system isn’t good unless all of its components contribute. Nationally defined, based upon the HQSC Triple Aim with a capacity/capability element, and providing an impetus towards integration.
Contributory measures for improvement: largely locally determined (from a menu, which provides some definition and advice about how to manage the relevant information). Improvement measures should be chosen to contribute towards the goals captured in the system measures, but should reflect local priorities for doing so.

11 Constructing measures
System measures: a small number, nationally determined; several life-cycle based, some capability based; composite measures mixing outcomes and outputs; used to assess performance of systems at District, and potentially regional, level.
Improvement measures: a large number, locally chosen through alliances; largely process/activity measures for local quality improvement; some elements of quality assurance (RNZCGP Foundation Standards). Alliances are the engine.

12 Examples of measures
System measure: Supported end of life. Composite of: percentage of deaths in usual place of residence; average number of hospital days in the last six months of life; average number of urgent ambulance transfers in the last six months of life.
Contributory measures: number of advance care directives in place; pain intensity quantified; plan of care for pain; aperients/laxatives initiated in patients on opioids; polypharmacy.
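How such a composite is scored is not specified in the slides, so the following is only a hypothetical sketch: it assumes equal weights and simple ratio-to-baseline scaling, with the two "lower is better" components inverted so that a score above 1.0 consistently means better than baseline. The baseline values and the example district figures are invented for illustration.

```python
# Hypothetical baseline values for the "Supported end of life" composite
# (invented for illustration; better/worse direction noted for each component).
BASELINE = {
    "pct_deaths_usual_residence": 0.55,   # higher is better
    "avg_hospital_days_last_6m": 12.0,    # lower is better
    "avg_urgent_transfers_last_6m": 1.5,  # lower is better
}

def composite_score(district: dict[str, float]) -> float:
    """Equal-weighted ratio-to-baseline score; values above 1.0 beat the baseline."""
    parts = [
        # Higher-is-better component: district value over baseline.
        district["pct_deaths_usual_residence"] / BASELINE["pct_deaths_usual_residence"],
        # Lower-is-better components: baseline over district value.
        BASELINE["avg_hospital_days_last_6m"] / district["avg_hospital_days_last_6m"],
        BASELINE["avg_urgent_transfers_last_6m"] / district["avg_urgent_transfers_last_6m"],
    ]
    return sum(parts) / len(parts)

# An invented district slightly better than baseline on two of three components.
example_district = {
    "pct_deaths_usual_residence": 0.60,
    "avg_hospital_days_last_6m": 10.5,
    "avg_urgent_transfers_last_6m": 1.7,
}
print(f"Composite score: {composite_score(example_district):.2f}")  # ≈ 1.04
```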

13 What do we expect to happen?
Promote integration through: clear articulation of what constitutes good health care across the system; joint accountability across all participants of the system for achieving good services.
Promote quality improvement through: building on alliance structures to identify local priorities and areas for quality improvement and service development; consistent expectations about the capacity and capability of quality improvement activities.

14 Squaring the circle
Addressing the dilemma of undertaking performance measurement, while supporting quality improvement, while encouraging local collaboration, in a complex environment with limited direct levers for control.
There is no perfect answer to achieving these different goals at the same time, but there are better and worse trade-offs between them.

