Out of Hours Benchmark – Christmas period 2009/10 Example PCT – Example provider November 2010 Henry Clay: 07775 696360


1 Out of Hours Benchmark – Christmas period 2009/10 Example PCT – Example provider November 2010 Henry Clay: 07775 696360

2 © Primary Care Foundation Summary
●This round of the benchmark looks at the Christmas period (14th December to 10th January)
●It includes analysis of demand over this busy period, timeliness, priority and clinical productivity. The next benchmark will look at the summer period and a wider range of measures including cost and patient experience
●This summary is intended to illustrate the sort of feedback that each service receives. In most of the slides we have used the average of all services, but slides 7-10, 12 and the specific points near the end refer to a fictitious example service and describe some examples of issues that appear to affect some services
●The following general points apply to many services
●Timeliness – Performance during the busiest period of the year for all out of hours providers, when most other primary care services are closed for much of the holiday, was usually as good on the ‘difficult days’ of high demand as at other times. Most providers still fall short of the quality requirements for definitive assessment, though overall the first attempt to ring back an urgent case is made within 20 minutes on 88% of occasions
●Priority – We highlight this issue in some providers where the level of cases identified as urgent is either very low or very high, or where a significant proportion of less urgent cases subsequently have their priority escalated. We suggest that ensuring consistent assessment of priority is important to minimise the chance of urgent cases being missed
●We suggest that services should also look at variations in productivity (and highlight how the new audit tool in Adastra, the most common out of hours system, supports feeding back information about disposition and priority to clinicians and call-handlers). We believe that a focus on this area will provide greater consistency and result in improved performance against the quality requirements

3 © Primary Care Foundation These slides cover a number of areas which we have grouped into sections
1. Introduction
2. Demand over the Christmas period
●Demand day by day
●Demand hour by hour
●Variation within an hour
●Demand by age
3. Prioritisation and time to definitive clinical assessment
●Time to definitive assessment
●Variation in performance according to % urgent
●Escalation of priority
●Impact of double assessment
●Performance on the ‘difficult’ bank holidays
4. Time to face to face, disposition and speed of dealing with walk-in cases
●Time to the face to face consultation
●Disposition to advice, base or home visits
●Patients going towards hospital
●Completion of walk-in cases
5. Productivity and NHS Direct
●Productivity across clinicians
●Proportion of NHS Direct cases
6. Specifics and response from the service
●Specific points about your service
●Explanation, correction or clarification from commissioner or provider
●Proposed actions from commissioner and provider

4 © Primary Care Foundation Introduction
●These slides provide our findings from the benchmark of services during the Christmas period from 14th December 2009 to 10th January 2010
●Performance is compared between services in various ways. Where a measure is in line with one of the out of hours quality requirements, the relevant QR number is referred to and the requirement is quoted
●Note that full compliance with the quality requirements is defined as 95% of cases or better, and partial compliance as 90% to 95%
●We have set no particular target for areas that are not covered by the quality requirements. Here you may wish to make your own judgment about what is reasonable performance from a patient or safety perspective
●The fourth round of the benchmark, covering the financial year 2010/2011, will look at a wider range of measures including cost and patient perception

5 Demand over the Christmas period

6 © Primary Care Foundation Demand by day is predictable and understood, even for the Christmas period…
●Those involved with out of hours services have long known the pattern of demand over the Christmas period:
●Monday to Friday of the ‘normal weeks’ (clear of bank holidays) are fairly consistent
●Saturday of the ‘normal week’ is usually busier than Sunday, and together they account for over 60% of the demand for the week
●The ‘four-day’ Christmas weekend is usually the busiest of the year – with the Monday bank holiday normally the busiest day
●Christmas day is comparatively quiet, less busy than a normal weekend
●The New Year’s weekend, like any bank holiday weekend, is also busy… though underlying demand may be increased for a period by events such as swine flu

7 © Primary Care Foundation This is reflected in the pictures of demand for all services and for yours

8 © Primary Care Foundation The average pattern of demand by hour during the day is also predictable (red is the weekend, green bank holiday and blue the weekday)
Peak from 8 to 12 at weekends and bank holidays
Demand tails off during the evening, but is similar for all types of day
Low levels of demand in the ‘red-eye’ shift

9 © Primary Care Foundation The day to day fluctuation for one hour (20.00) in your service closely followed the expected Poisson curve
This shows the expected cumulative probability distribution around your average for the hour starting at 20.00 in the evening
This is the same graph with the red showing the actual distribution for your service based on the 28 days in the sample
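The expected curve on this slide can be reproduced with a short sketch. This is a minimal illustration, not the benchmark's own code; the average of 12 calls in the 20.00 hour is an assumed figure (a real service would use its own measured average):

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    """Cumulative probability P(X <= k) of seeing at most k calls in an
    hour when calls arrive independently at an average rate of lam."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

# Hypothetical average of 12 calls in the 20.00 hour
lam = 12.0
expected = [poisson_cdf(k, lam) for k in range(25)]
```

Plotting `expected` against the number of calls gives the smooth curve on the slide; overlaying the actual 28-day distribution shows how closely arrivals follow the Poisson model of patients acting independently.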

10 © Primary Care Foundation The overall split by age follows the expected ‘bath tub’ curve for phone cases, but is comparatively flat for patients that walk in

11 © Primary Care Foundation Looking at the % population for all PCTs compared with % of demand, the higher usage groups can be seen…
Greater usage by the elderly
Usage by women of child-bearing age is higher than by men
Greater usage by children
Note: On this slide (and the next) population by gender and age band is shown as a percentage of the whole, and demand by gender and age band is shown as a percentage of the calls coming in by phone. Male children are less than 7% of the population but represent 13% of demand, whilst women over 70 are just over 7% of the population but make up over 15% of demand.
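The relative usage implied by these figures can be checked with a quick calculation (the percentages below are the approximate figures quoted on the slide, rounded to whole numbers):

```python
# Approximate figures from the slide: share of population vs share of
# phone demand for two of the higher-usage groups
groups = {
    "male children": {"population_pct": 7.0, "demand_pct": 13.0},
    "women over 70": {"population_pct": 7.0, "demand_pct": 15.0},
}

# Usage ratio: how far each group's share of demand exceeds its share of
# the population (1.0 would mean demand exactly in proportion)
ratios = {name: g["demand_pct"] / g["population_pct"] for name, g in groups.items()}
```

On these figures both groups generate roughly twice the demand their population share would suggest, which is the point the chart annotations make visually.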

12 © Primary Care Foundation …and you can compare the picture for your PCT (note that the demand includes only those cases received by phone and the profile for the population of the PCT is based on figures from 2008)

13 © Primary Care Foundation General conclusion about demand: it is exactly in line with expectations
●Demand by day follows the pattern that would be expected:
●Monday to Friday of the ‘normal weeks’ (clear of bank holidays) are fairly consistent
●Saturday of the ‘normal week’ is usually busier than Sunday, and together they account for over 60% of the demand for the week
●The ‘four-day’ Christmas weekend is usually the busiest of the year – with the Monday bank holiday normally the busiest day
●Christmas day is comparatively quiet, less busy than a normal weekend
●The New Year’s weekend, like any bank holiday weekend, is also busy
●Average demand during the day follows a predictable pattern
●The number of patients within a particular hour follows the predictable pattern, with the random fluctuation that is expected from patients acting independently
●The case mix by age and gender follows that for primary care, with most demand coming from children and the elderly and with women of child-bearing age making greater use of the service than men

14 Prioritisation and time to definitive assessment All of the graphs in this section compare cases that were identified as being received by phone

15 © Primary Care Foundation There was significant variation between services in the speed with which definitive clinical assessment begins

16 © Primary Care Foundation Time to definitive clinical assessment – quality requirement 9 Definitive clinical assessment is an assessment carried out by an appropriately trained and experienced clinician (not a call-handler) on the telephone or face-to-face. The adjective ‘definitive’ has its normal English usage, i.e. ‘having the function of finally deciding or settling; decisive, determinative or conclusive, final’. In practice, it is the assessment which will result either in reassurance and advice, or in a face-to-face consultation (either in a centre or in the patient’s own home).

17 © Primary Care Foundation Many services fell short of QR9 to begin definitive assessment of urgent cases in 20 minutes…
So as to compare services on a like for like basis we take no account of patient-attributable delay (for example if the clinician is unable to get through when calling back), which typically accounts for a few percentage points in the green bars above. Where the green is greater than 5% the process almost certainly involves a considerable amount of double assessment

18 © Primary Care Foundation …and considerable numbers fell short of QR9 to begin definitive assessment of less urgent cases in an hour

19 © Primary Care Foundation Explanation of the next slide, testing an expected relationship and looking at how services compare
If all other things were equal, except for variation in levels of % urgent on receipt, we might expect it to be easier to definitively assess a high percentage of cases early with a lower % urgent on receipt
Expected trend line for services that are similar (‘equally good’ at responding promptly) but with varying % of cases urgent on receipt

20 © Primary Care Foundation Even some of those with low levels of urgent on receipt fall well short of the requirement
Those with higher levels of urgent on receipt find it difficult to better 90% definitively assessed in 20 minutes
These have low % urgent on receipt but a low percentage of urgent cases assessed in 20 minutes
Conclusion: Whilst the trend is as expected, there are other reasons for significant variations between services

21 © Primary Care Foundation Explanation of the next slide, testing an expected relationship and looking at how services compare
If all other things were equal, except for variation in levels of % urgent on receipt, we might expect those with low levels of urgent on receipt to have a higher percentage of cases escalated by clinicians later in the process than those with high % urgent on receipt
Expected curve for services that are similar (‘equally good’ at consistent prioritisation) but with varying % of cases urgent on receipt

22 © Primary Care Foundation The relationship is NOT obvious in the data for different services
Conclusion: Not only is there wide variation between services in propensity to define cases as urgent, there is also inconsistency between call-handlers and clinicians (and between individuals)
Should services where more than (say) 6% of less urgent cases have their priority escalated by clinicians review their prioritisation?
Should services with less than (say) 10% urgent on receipt check that potentially urgent cases are not being missed?

23 © Primary Care Foundation Explanation of the next slide, testing an expected relationship and looking at how services compare
Double assessment tends to be more frequent when nurses assess calls, as they often pass a proportion of cases to doctors (the issue also frequently arises when one service carries out clinical assessment and passes the case on to another). Even in ‘doctor only’ services there is usually a small percentage with two assessment calls (for example if the doctor locks the case, rings a hospital clinician and then rings the patient back to tell them what is planned)
This leads to delay in definitive (final) assessment – for example it might take 7 minutes before the nurse calls the patient and 8 minutes to carry out an initial assessment, so that the case is not queued for the doctor until 15 minutes have passed. It is therefore easy to miss the 20 minute ‘window’
If all other things were equal, except for variation in levels of double assessment, we might expect it to be easier to definitively assess cases quickly if the level of double assessment is low
Expected curve for services that are similar (‘equally good’ at responding promptly) but with varying levels of double assessment
We are NOT saying that services cannot operate successfully with nurses or more than one provider, but we are highlighting an issue that needs to be managed with care in these cases
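The worked timings above can be laid out explicitly. A minimal sketch using the illustrative 7 and 8 minute figures from the slide (the variable names are ours):

```python
# Illustrative timings from the slide (minutes)
nurse_pickup = 7       # time before the nurse first calls the patient
nurse_assessment = 8   # length of the initial nurse assessment

# The case is only queued for the doctor once both steps are complete
queued_for_doctor = nurse_pickup + nurse_assessment

# QR9 allows 20 minutes to begin definitive assessment of an urgent case,
# so the doctor is left with very little of the window
qr9_window = 20
minutes_left_for_doctor = qr9_window - queued_for_doctor
```

With 15 of the 20 minutes consumed before the case even reaches the doctor's queue, only 5 minutes remain, which is why double assessment makes the urgent-case window so easy to miss.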

24 © Primary Care Foundation The expected relationship is probably there, but there is plenty of ‘noise’ from other factors that affect speed of assessment

25 © Primary Care Foundation It is of particular interest to look at the performance on time to definitive assessment on the two busy bank holidays…

26 © Primary Care Foundation … in contrast to the overall performance in the four week period – most services remain in the same section

27 © Primary Care Foundation General conclusions about prioritisation and time to clinical assessment – services have more to do
●Whilst a small number of services perform consistently well on time to definitive assessment, many still have progress to make to consistently exceed 85% against the measures for QR9
●Those who do better over the four weeks also performed similarly on the two ‘difficult’ bank holidays
●Some services have so few cases identified as urgent on receipt compared with others that we doubt they can ‘demonstrate that they have a clinically safe and effective system for prioritising calls’
●There is a need to focus effort on consistency in prioritisation – one would expect priorities to be down-graded at clinical assessment, but in many services too many are escalated
●Double assessment extends the time to decision for the patient and makes it more difficult to meet the quality requirement. Services with more than perhaps 10% of cases of this type should look at strategies to reduce the instances of double assessment

28 © Primary Care Foundation Since demand is predictable, why is performance sometimes poor? We believe that there are two main reasons:
●A small number of services still fail to match capacity to the expected demand, but we suspect that the bigger reason is that
●The variation between clinical staff is not closely managed, so that the service performs well when some staff are on duty and less well when others are working. We highlighted typical variation from data collected over six months in one service in the previous benchmark – in this round we have tried (where we can) to look at variation in productivity between clinicians for your service over four busy days (see below)
Because clinicians want to do a good job, they value comparison with their peers. We strongly support the initiative by some software providers to make it much easier for services to feed back productivity information (from a large sample of cases) to individual clinicians (as part of a rounded audit process that also looks in detail at a smaller sample of cases). There is work to do for services to make best use of these new tools

29 Time to face to face consultation and disposition All of the graphs in this section compare cases that were identified as being received by phone except for the last one (which is clearly labelled)

30 © Primary Care Foundation Time to face to face consultation – quality requirement 12 Face-to-face consultations (whether in a centre or in the patient’s place of residence) must be started within the following timescales, after the definitive clinical assessment has been completed: ● Emergency: Within 1 hour. ● Urgent: Within 2 hours. ● Less urgent: Within 6 hours.

31 © Primary Care Foundation >40% of services are partially compliant with QR12 for urgent cases and >70% achieve better than 85%…

32 © Primary Care Foundation …whilst ~75% are fully compliant with the standard for less urgent cases (and almost all at least partially compliant)

33 © Primary Care Foundation There is a very wide variation between services in the disposition of cases into advice, base or home visits…

34 © Primary Care Foundation …but inconsistent recording of outcome by clinical staff means that we cannot rely on the analysis of the proportion going to hospital Note: We have previously estimated that, where recording is reliable, about 12 to 15% of patients typically find their way to hospital after contacting their OOH service. Whilst some services will have lower figures than this, many of those that appear towards the lower end are not recording such instances reliably

35 © Primary Care Foundation Explanation of the next slide
Increasingly, services are becoming more integrated, with out of hours clinical staff seeing patients who came to the walk-in centre, the emergency department or an urgent care service
There is still considerable variation between models, some with a much higher proportion of walk-in cases than others (up to 50%)
In the future (if it is useful to users) we could look at a wider range of measures (providing comparable information can be obtained)
We have chosen just to look at the proportion of walk-in cases that are completed in two hours

36 © Primary Care Foundation The majority of services have discharged over 80% of walk-in cases within two hours of the start of the episode
Note: We focus here on the patient’s journey, so if the patient arrives at A&E and is referred from there to the OOH service we ‘start the clock’ when they first arrived at A&E. In some cases this data has not been available so, if the number of such cases is a significant proportion of the walk-in total, we have omitted them from the graph. Note also that if a patient chooses to leave before the consultation the case is still included in the divisor

37 © Primary Care Foundation General conclusions about time to face to face, dispositions and walk-in patients – services have to meet a variety of needs and do so in different ways
●Over half of services fail to meet QR12 for seeing urgent cases within two hours (there may be a need for clinicians to check that patients have understood the importance of being seen within this time)…
●…but the performance for less urgent cases within six hours is much better
●There is a wide variation between services in the proportions of advice, base and home visits. Some of this is shaped by the commissioner, but we believe that most comes from the historic way that OOH services worked in each area
●There appear to be great variations in the proportion of patients going to hospital – but inconsistent or unreliable recording of outcomes by clinical staff in some services exaggerates the spread
●Some services deal with significant numbers of walk-in patients. In some the response is much slower than in others

38 Productivity and NHS Direct We have previously shown the variation in productivity between services. This time the comparison shows (where the data allows) the variation between individuals in a service. We have also included information about cases transferred from NHS Direct highlighting the different proportions of the OOH case- load

39 © Primary Care Foundation Productivity – the approach
Previously we looked at productivity by comparing the average number of cases each hour with clinical staffing levels. In this round we have chosen to look at it differently (as an experiment):
●To choose a measure that is easier for users to relate to
●To recognise the variation in proportion of advice, base and home visits
●To highlight the wide variation between individuals
The approach involves attributing standard minute values (SMVs) of:
6 minutes for telephone advice/assessment
12 minutes for a face to face consultation in the base
30 minutes for a home visit
The SMV is not counted for telephone advice/assessment if the case is put back into the queue for a further call (so in the case of double assessment only 6 minutes of SMV is ‘earned’ at the assessment stage)
You should not expect better than (say) 40 minutes to be earned per hour on average as, if your staffing is correct to provide a responsive service, there will sometimes be gaps between patients
For a more valuable comparison of all clinical staff, services should, of course, look at a much longer period
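A sketch of how these standard minute values translate into a 'minutes earned per hour' figure. Only the 6/12/30 weights come from the slide; the category names and data layout are our own illustration, not the benchmark spreadsheet's format:

```python
# Standard minute values from the benchmark (minutes per contact type)
SMV = {"phone": 6, "base": 12, "home": 30}

def minutes_earned_per_hour(counts_by_hour):
    """Average SMV minutes 'earned' per hour for one clinician.

    counts_by_hour: one dict per hour worked, mapping contact type to the
    number (or average number) of contacts, e.g. {"phone": 3, "base": 1}.
    """
    totals = [sum(SMV[kind] * n for kind, n in hour.items()) for hour in counts_by_hour]
    return sum(totals) / len(totals)
```

Using the figures from the worked example later in the deck: 2.5 phone cases an hour earns 15 minutes, 3 base consultations an hour earns 36 minutes, and 13.4 phone cases an hour earns roughly 80 minutes.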

40 © Primary Care Foundation There are sometimes good reasons for variations between individuals – but if the variation is too wide or not understood the service may suffer
●Clinicians may be assigned to cases that take much longer (sectioning a patient, full assessment of patients admitted to a community hospital, etc.)
●One clinician may deliberately select the more difficult cases (or be left with them) to support less experienced staff
●A clinical lead may have other responsibilities for overseeing parts of the operation and ‘dip in’ to support others when the service gets behind
●At certain times of the day there will be insufficient work to keep clinical staff busy (less applicable during the busy period chosen for comparison)
●Nurses may follow a systematic process (using decision support software) that takes longer than assessment by a doctor not using the tool
●Some clinicians are more comfortable than others with telephone assessment
●…and you can think of more
On the next slide we have chosen to provide an illustration of how to read the spreadsheet (this is NOT your data, but just demonstrates how it works)

41 © Primary Care Foundation These slides describe how to ‘read’ the productivity spreadsheet sent to providers (at validation stage). The columns show:
●Clinician identifier
●Number of phone consultations by hour for each clinician
●Number of base consultations by hour
●Number of home visits by hour
●Number of contacts by hour
●Weighted SMV ‘earned’ per hour
This is NOT your data – it is illustrative only

42 © Primary Care Foundation In the example above the colours highlight variation in performance as measured by standard minute values earned:
●Both of the orange clinicians carry out only phone consultations/advice. They average 2.5/2.6 cases an hour and earn 15 minutes per hour on average
●The red clinician also carries out only phone consultations/advice but averages 13.4 cases an hour and earns 80 minutes per hour on average
●Particularly if this variation persisted over time, would you be happy with it? Would you be worried about whether the red clinician was devoting enough time to ensure understanding of any advice given? Would the service have kept up with demand if the red clinician had not shouldered the bulk of the advice workload?
●The yellow clinician sees patients only in the base and, at 3 cases an hour, earns 36 minutes per hour, whilst the blue clinician sees 3.3 cases an hour in the base but also manages to carry out some telephone assessment and earns 47 minutes per hour on average
A formal process for comparison and feedback to individuals addresses these sorts of issues – hence our highlighting the new tools available to support this in slide 26 above.

43 © Primary Care Foundation This compares the SMV for the average clinician between services (this compares providers where we had information about all clinical consultations; your service is not included because we do not have the details of all clinical consultations carried out)
Note: Because the graph looks at four days, the total graph above is measured out of 4*60=240 minutes. Colours are chosen to match those used in the feedback-for-validation spreadsheet.

44 © Primary Care Foundation The proportion of OOH cases transferred from NHS D averages close to 10% - but there is a wide spread

45 © Primary Care Foundation We noted that some services change the priority from NHS D – down-grading a significant proportion of cases before assessment

46 © Primary Care Foundation General conclusions about productivity and cases transferred from NHS Direct
●We have previously highlighted the variation in productivity between services
●The figures provided (where we could) as part of the feedback for validation this time show the variation between individuals which, of course, is much wider
●The purpose is to illustrate (to those that are not already doing it) an approach to measuring productivity to be fed back to clinicians as part of the regular audit – the recently developed Adastra audit software supports this process
●We thought it useful to include some information about NHS Direct cases, particularly as it may highlight issues to be considered for the new 111 number
●There are wide variations in the proportion of the workload that comes from NHS D – we suspect a number of factors cause this, for example the extent to which some population or age groups may prefer to access advice by phone and the amount of local promotion of the NHS D service
●We know that some OOH services have long argued that NHS D priorities are not aligned with the time horizons of the quality requirements, so the proportion of urgent and emergency cases passed to OOH services is too high (an issue that we too have raised with NHS D on several occasions)
●Some providers down-grade the NHS D priority through the mapping between systems (or, occasionally, by reviewing the cases over the phone before clinical assessment). Were harm to occur to a patient because of delay, the service would be exposed if the priority had been down-graded

47 Specifics and response from commissioners and providers
This section includes specific comments from the detailed analysis and provides an opportunity for commissioner and provider to describe any actions taken or planned

48 © Primary Care Foundation Objectives for this section
●In the next round of the benchmark results will be available in a form that allows services to see ‘who is who’
●Particularly with that in mind, but even now, users suggested that it was important to allow services to explain specific points and describe actions already taken or planned
●This section gives commissioners and providers that opportunity
●To provide some consistency in the format we have set up two slides, one for any explanations and one to summarise actions taken or planned – but you are welcome to add further slides to provide more detail about this summary information
●To channel your thoughts we have also listed areas that we would expect to be covered and included any more detailed points we noted during the analysis

49 © Primary Care Foundation Areas for attention for the example service (not being involved in the service, these suggestions are made without knowledge of actions that you have already taken)
●The service falls short on time to definitive assessment for both urgent and less urgent cases – the next slide shows that it is largely at weekends that it gets behind
●You have a near-average level of urgent on receipt, yet a considerable number of cases identified as less urgent on receipt subsequently have their priority escalated. You should perhaps review prioritisation with a view to achieving greater consistency and minimising the chance that urgent cases are missed
●We suspect that the record of patients going to hospital is incomplete (we normally provide some illustrative case numbers) – to provide good information in the future it would be worth continuing to remind clinicians about consistent use of outcome codes
●Walk-in cases are a comparatively small part of the case-load – but some wait a long time, so that QR 10 and 12 are missed
●Working on productivity to deliver greater consistency is probably an important part of supporting the improvement of time to definitive assessment, and the new audit tools from Adastra should facilitate feedback of this information to clinicians and call-handlers
●The service appears to stream some calls direct to the centre without clinical assessment and we have included a reminder below about the importance of robust processes and PCT approval

50 © Primary Care Foundation Specific detailed points for example service – it is at the weekend and bank holidays that the service gets behind with assessment
At weekends and bank holidays significant numbers of cases are above the green and are taking more than an hour to definitive assessment
This is the picture by hour for the weekends – the service gets behind in the busy morning period and takes a long time to catch up

51 © Primary Care Foundation Specific detailed points for example service – whilst walk-in is a small part of the workload some wait for a considerable period
On weekends in particular the proportion at the top in light blue or orange (2 to 4 hours and over 4 hours time to completion) becomes significant
44% of walk-in cases wait for more than an hour for clinical assessment

52 © Primary Care Foundation Specific detailed points for example service – we don’t know what the protocols are for call-streaming, but they need to be very robust
About 19% of phone cases appear to be streamed and we are not sure if this process is consistently followed for a narrow group of patients/cases or introduced at the busier times. You need to be certain that no potentially urgent cases are missed and to audit the process
We strongly support the concept of streaming some patients direct to the centre without a telephone assessment – but only if the protocols are robust so that there is no danger of delay in any potentially urgent case, and only with good training and discipline and with formal approval of the process at PCT board level

53 © Primary Care Foundation Explanations – this is for you to provide explanation about specific measures/slides
Reported measure | Slide number(s) | Explanation

54 © Primary Care Foundation Actions taken or planned – this is for you to provide details about actions already taken or planned (copy if necessary and add further slides)
Action planned | Planned completion | Explanation

