RCPath Demand Optimisation Project: update

1 RCPath Demand Optimisation Project: update
Prof Tony Fryer University Hospital of North Staffordshire Keele University

2 Why manage demand?
Meeting the new RCPath proposed KPIs around Demand Management:
'The laboratory shall implement a system of demand management; this shall be designed both to reduce the number of unnecessary tests and to help to ensure that appropriate tests are used.'
Evidence:
1. Published statement of system of demand management.
2. Evidence of audit (by monitoring activity of testing) against agreed statement.

3 The Toolkit structure
6 sections, 27 recommendations
TK1: Where do we start: Developing systems to categorise, define and quantitate 'inappropriate' requests.
TK2: The simple option: Within-laboratory management of requests.
TK3: Prevention is better than cure: Ensuring requests are appropriate before they reach the laboratory.
TK4: The right process: Ensuring that the correct sample is collected.
TK5: The key: Unlocking the potential of electronic requesting.
TK6: Does it make a difference: Assessing the impact of interventions.

4 RCPath National Demand Management Toolkit project
Process:
Start with Demand Management Toolkit recommendations from Fryer AA, Smellie WS (2013).
Assess views from stakeholder group on each recommendation regarding:
- Scope of impact (eg volume/cost of tests affected)
- Clinical importance (clinical benefit if implemented)
- Ease of implementation
Generate list of priority themes to take forward to next stage.

5 TK1 Stakeholder feedback
Where do we start: Developing systems to categorise, define and quantitate 'inappropriate' requests.

Recommendation 1: Agree criteria with stakeholders regarding appropriate test utilisation and definitions for inappropriate use.
Feedback: Impact 3.4, Importance 4.3, Implementation 3.0
Need to be clear on which tests are priorities & standardise these first.

Recommendation 2: Devise a strategy for reviewing new guidance as it is published. Consider using a short proforma to identify impacts on the laboratory.
Feedback: Impact 3.7, Importance 4.1, Implementation 3.0
Consistent support. Considered important, but implementation may be a challenge. May be assisted by the Trainees' Committee template.

Recommendation 3: Develop a consistent strategy for reviewing workload figures.
Feedback: Impact 4.0, Importance 4.3, Implementation 2.3
The question needs to be clear: what figures need to be collected & why? What will they be used for? May need a link to a standardised report to requestors.

Recommendation 4: Develop an audit schedule that includes assessment of inappropriate requesting.
Feedback: Impact 4.0, Importance 4.7, Implementation 2.1
Clearly seen as important, but implementation seen as a challenge. May be possible with Minimum Re-test Interval examples or closer working with Audit groups to define clear topics.

6 TK2 Stakeholder feedback
The simple option: Within-laboratory management of requests.

Recommendation 5: Review repertoire on a systematic basis, perhaps once every 6 months.
Feedback: Impact 3.7, Importance 3.3, Implementation 3.4
Not seen as clinically important, but easier to do. May not be a core theme for demand management, but represents good practice for laboratories.

Recommendation 6: Systematically review the top 20 most expensive tests (cost per test) to determine whether they would benefit from vetting.
Feedback: Impact 3.8, Importance 3.3, Implementation 2.7
While other surveys (eg KUBS, ACB/RCPath Spotlight meeting) suggest that test vetting is the most commonly used demand management tool, and unpublished reports suggest it is cost-effective, stakeholder feedback suggests little impact & difficulties in implementation. This may reflect a lack of standardisation on what should be vetted, as this may vary between laboratories depending on repertoire and IT systems.

Recommendation 7: Explore options to utilise IT to apply reflex testing. Examine the possible added clinical value of reflexive testing. Evaluate the effectiveness of such systems to ensure improved test appropriateness.
Feedback: Impact 4.1, Importance 4.1, Implementation 2.4
Seen as important, but less easy to implement, possibly due to IT-related issues. May require dialogue with laboratory IT service providers & equipment manufacturers.

Recommendation 8: Implement laboratory computer-based IT solutions to maximise the identification of duplicate requests within nationally-agreed minimum re-test intervals. Where possible, application of acceptance/rejection criteria and comments should be automated.
Feedback: Impact 3.0, Importance 3.7, Implementation 3.7
Impact limited as the sample has already been taken, but relatively easy to implement. May be better to focus on order comms (pre-laboratory) aspects & leave laboratories to take this forward in the RCPath MRI programme.
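The duplicate-request check described in recommendation 8 can be sketched as a simple comparison of the request time against the patient's last result for that test. This is a minimal illustration only: the test codes and interval values below are hypothetical placeholders, not the nationally agreed RCPath minimum re-test intervals.

```python
from datetime import datetime, timedelta

# Hypothetical minimum re-test intervals in days -- illustrative values,
# NOT the nationally agreed figures from the RCPath MRI project.
MIN_RETEST_INTERVAL_DAYS = {
    "HbA1c": 90,
    "TFT": 28,
    "Lipids": 28,
}

def is_duplicate_request(test_code, request_time, last_result_time):
    """Flag a request that falls inside the minimum re-test interval
    for that test, given the time of the patient's last result."""
    interval = MIN_RETEST_INTERVAL_DAYS.get(test_code)
    if interval is None or last_result_time is None:
        # No agreed interval for this test, or no previous result:
        # the request cannot be classed as a duplicate.
        return False
    return request_time - last_result_time < timedelta(days=interval)

# Example: an HbA1c requested 30 days after the previous result is
# flagged, because it falls within the (hypothetical) 90-day interval.
print(is_duplicate_request("HbA1c",
                           datetime(2014, 1, 31),
                           datetime(2014, 1, 1)))  # True
```

In a real LIMS or order comms system the same logic would run against the patient record at the point of request acceptance, with the rejection comment automated as the recommendation suggests.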

7 TK3 Stakeholder feedback (1)
Prevention is better than cure: Ensuring requests are appropriate before they reach the laboratory.

Recommendation 9: Cultivate links with all key specialties (including primary care) to allow establishment of suitable educational forums and ensure educational interventions have the backing of clinical teams.
Feedback: Impact 3.4, Importance 4.0, Implementation 2.0
Seen as difficult to implement, but should we not all be doing this anyway? The challenge may be in standardising & becoming systematic.

Recommendation 10: Develop a local educational strategy for requesting, drawing on national guidance and literature-based evidence where possible. Establish an ongoing plan to review its relevance.
Feedback: Impact 3.3, Importance 3.4, Implementation 2.9
Low scores. Similar issues to recommendation 9.

Recommendation 11: Utilise training programmes to address correct sample collection, timing, transport and so on.
Feedback: Impact 4.1, Importance 3.7, Implementation 4.0
Recognised as important and achievable, especially with order comms systems.

8 TK3 Stakeholder feedback (2)
Recommendation 12: Examine the possibility of standardising common profiles in line with national benchmarking data.
Feedback: Impact 3.8, Importance 3.4, Implementation 2.6
May be more challenging to achieve as common profiles become amalgamated into order sets for order comms systems. Leave to Path Harmony to address.

Recommendation 13: Consider the potential of implementing symptom-specific profiles.
Feedback: Impact 3.4, Importance 3.7, Implementation 3.0
May be challenging as different centres cover different patient populations and utilise a range of IT systems, spanning laboratory, order comms and hospital IT systems.

Recommendation 14: Develop a strategy to review all automated interpretative comments to include information on test appropriateness.
Feedback: Impact 4.0, Importance 3.9, Implementation 2.7
May be doable for specific areas, such as those linked to MRIs, but more challenging in other aspects.

Recommendation 15: Consider providing feedback on cost and volume data (and, where possible, inappropriate request data), particularly to primary care, using comparative data (benchmarking) to utilise peer pressure. Consider primary and secondary care feedback on a longitudinal basis to evaluate trends.
Feedback: Impact 3.1, Importance 2.9, Implementation 1.8
No appetite for this, which I find surprising, particularly regarding the ease of implementation. Provision of cost data should be achievable in most laboratories, either as a summative report or as individual costs.

9 TK3 Stakeholder feedback (3)
Recommendation 16: Consider aligning demand management initiatives to local/national financial incentives and/or penalties for requestors.
Feedback: Impact 3.4, Importance 3.4, Implementation 2.7
This is a version of QOF. Informing guidance such as QOF or similar local initiatives may be important, but difficult to standardise.

Recommendation 17: Review specialist tests to determine whether consultant-only and/or specialty-specific requesting would be appropriate.
Feedback: Impact 3.7, Importance 3.7, Implementation 2.3
Similar challenges to those for recommendation 13.

Recommendation 18: Integrate a systematic review of effectiveness into the demand management strategy. Focus on educational approaches that can be monitored and, if necessary, re-emphasised on a regular basis to inform new staff.
Feedback: Impact 3.4, Importance 4.3, Implementation 2.4
It is key that we move to strategies that can be evaluated and reinforced to ensure ongoing effectiveness. This area may need to be built into the other specific priorities, rather than being a focus in itself.

Recommendation 19: Develop local best-practice guidance, allied to national data, on appropriate testing. Implement an ongoing review to ensure that requestors are aware of the guidance.
Feedback: Impact 4.1, Importance 4.4, Implementation 2.6
This may be addressed as a summary of the other priority themes in order to reassure stakeholders about implementation. The Toolkit aim focused on development of a strategic & standardised approach, and this workstream may provide the guidance that fulfils this recommendation.

10 TK4 Stakeholder feedback
The right process: Ensuring that the correct sample is collected.

Recommendation 20: Ensure that electronic requesting (where available) includes accurate information on sample requirements, patient identifiers, and criteria for sampling, storage and transport. Regularly review that this is consistent with other local sources of information, such as the laboratory handbook.
Feedback: Impact 4.3, Importance 4.6, Implementation 3.9
Highest-scoring recommendation overall. Should be achievable using order comms, as per recommendation 11.

11 TK5 Stakeholder feedback
The key: Unlocking the potential of electronic requesting.

Recommendation 21: Utilise electronic requesting, wherever possible, to provide information on retest intervals and previous results with a view to preventing unnecessary phlebotomy. Ensure laboratory information systems and electronic requesting retest intervals are consistent.
Feedback: Impact 4.0, Importance 4.0, Implementation 3.4
Links to the MRI project & its implementation. Seen by stakeholders as an important aspect.

Recommendation 22: Explore a strategy to utilise electronic requesting as an educational tool. Consider any prerequisites to acceptance of requests.
Feedback: Impact 4.4, Importance 4.3, Implementation 2.7
Implementation seen as the major hurdle here, but should be achievable in specific focused areas.

Recommendation 23: Involve all stakeholders in determining the content and implementation of electronic requesting educational tools. Evaluate requestor attitudes to any changes made.
Feedback: Impact 4.1, Importance 4.4, Implementation 3.1
Should be an essential component of implementation of any order comms change. Combine with recommendations.

Recommendation 24: Explore the potential of electronic requesting in the application of symptom-specific test sets. Implement a monitoring strategy to ensure appropriate use.
Feedback: Impact 3.4, Importance 4.3, Implementation 3.0
Variable response. See recommendation 13.

12 TK6 Stakeholder feedback
Does it make a difference: Assessing the impact of interventions.

Recommendation 25: Review the clinical and cost effectiveness of demand management strategies. At a local level, ensure that effectiveness can be regularly monitored.
Feedback: Impact 3.0, Importance 3.4, Implementation 2.6
Currently in the research arena.

Recommendation 26: Integrate assessment of patient views into the laboratory demand management strategy. Consider patient-focused leaflets as a tool to address their concerns.
Feedback: Impact 3.9, Importance 3.7, Implementation 2.7
Variable response. There is, however, a need to engage patients and carers in any strategy of this kind.

Recommendation 27: Consider the wider impacts of demand management strategies. Explore options to audit effects on clinical outcomes or develop research projects to investigate this.
Feedback: Impact 4.3, Importance 3.1, Implementation 2.4

13 Priority themes
1. Implementing systems to ensure correct sample collection (recommendations 11, 20).
2. Maximise the potential of minimum re-test intervals, ideally using order comms (recommendations 20-22).
3. Clearly define key areas/tests (recommendation 1):
   a. Most successful areas evaluated to date include: MRIs; test vetting; order comms order sets/profiles (eg myeloma screen, menopausal screen).
   b. Tests might include: monitoring tests for long-term conditions (HbA1c, TFT, lipids); commonly misused tests (particularly the more expensive/referred tests).
4. Identify a mechanism for collecting key workload data, particularly in those cases listed in 3 above (recommendation 3). This will facilitate identification of the tests that may warrant vetting (recommendation 6). Linked to this, these data collection mechanisms will also allow the development of a standardised report format on test cost/volume data for feedback to users (recommendation 15).
5. Improve interpretive comments to address issues around test appropriateness (recommendation 14). This will link to the order comms project.

14 Cross-cutting themes
These need to be embedded into the priority themes:
Engagement with:
- Requestors (see recommendation 23)
- Patients (see recommendation 26)
- Laboratory equipment manufacturers and IT system suppliers (order comms, LIMS) (see recommendation 7)
Review of Demand Management interventions (see recommendation 18):
- Effectiveness
- Sustainability

15 Links to other projects
- RCPath Toolkit
- ACB Trainees' Committee programme
- Regional Audit Groups
- RCPath Minimum Retest Interval project
- Pathology Harmony project
- NIHR Research agenda

16 Core components
- Data extraction
- Education
- Order comms
- Clinical input
- IT systems
