National Cancer Peer Review Programme Ruth Bridgeman National Programme Director Julia Hill Acting Deputy National Co-ordinator

Welcome and Introductions

Aims of Today
– To promote an understanding of the Revised National Cancer Peer Review Process
– To enable lead teams within trusts and networks to implement the programme with confidence
– To provide clarity on the requirements for supplementary evidence
– To enable lead teams within trusts and networks to cascade to others within their organisations

Learning Outcomes
– Understanding of the revised process and the national schedule
– Understand the self-assessment requirements
– Understand the Clinical Lines of Enquiry
– Understand how to validate self-assessments
– Be familiar with CQuINS
– Understand the external verification and peer review visit processes
– Have knowledge of the outcomes from the NCPR Programme
– Have confidence to train others in their organisations and know where to access support and advice

Session 1 Introduction to the Revised Process

The New Healthcare Environment

Ensuring Effective Levers
– Ensuring Peer Review outcomes are fed into the Care Quality Commission legal registration requirements
– Embedding Peer Review outcomes into the commissioning process
– Providing evidence that services are meeting the NICE Quality Standards

Reducing the Burden of Peer Review on the NHS
The key actions are:
– Reducing the measures – further reduce the number of measures within the Manual for Cancer Services by 10%.
– Amalgamating reports – where possible, amalgamate measures to reduce the number of reports required, e.g. locality and MDT measures.
– Biennial submission of evidence – evidence for the annual SA should be submitted biennially; each year teams/services should instead complete a commentary in relation to the key questions, along with the SA of compliance against the measures. The exception would be teams performing below 50% compliance or with unresolved immediate risks.

Reducing the Burden of Peer Review on the NHS
– Biennial internal validation – each tumour site will be assigned either an odd or even year so that like teams are verified in the same year. The IV panel will be required to endorse the team's/service's self-assessment compliance and commentary. The exception would be teams performing below 50% compliance or with unresolved immediate risks.
– Further clarification of supporting evidence requirements – provide training in the evidence required for self-assessment and review visits, and provide example materials for those responsible for completing the evidence.

Reducing the Burden of Peer Review on the NHS
– Withdrawal of Earned Autonomy (EA) – no longer required, as all teams/services will now be internally validated biennially rather than annually.
– Amnesty – teams performing at 85% or above and without IRs or SCs will not be required to SA in 2011, unless in the IV cycle or identified for a peer review visit.
– Targeted peer review visits – visits will only be undertaken where a team/service falls into the risk criteria, where there is considered to be an opportunity for significant learning, or as part of a small stratified random sample to assure public confidence in SA and IV.

Implications of the Recommendations
– The peer review visit programme will continue with the comprehensive review of Children's services as planned in 2011/2012.
– The schedule for Peer Review will be revised to move TYA, Acute Oncology and Chemotherapy services into the IV cycle for 2011/2012 rather than comprehensive visits.
– A number of national events will be held to explain the operational detail of the NCPR programme and provide examples of the evidence required for the various stages of the peer review programme.
– The handbook for the NCPR Programme will be revised and published by the end of March 2011 so that the operational details, e.g. which topics are subject to IV each year, can be identified and circulated.

Peer Review Process

What is Cancer Peer Review?
– A quality assurance process for cancer services
– An integral part of Improving Outcomes – A Strategy for Cancer
– Assesses compliance against IOG for NHS patients in England
– A driver for service development and quality improvement
– Supported by a set of measures

Aims of Cancer Peer Review
– To ensure services are as safe as possible
– To improve the quality and effectiveness of care
– To improve the patient and carer experience
– To undertake independent, fair reviews of services
– To provide development and learning for all involved
– To encourage the dissemination of good practice

Outcomes of Peer Review
– Confirmation of the quality of cancer services
– Speedy identification of major shortcomings in the quality of cancer services where they occur, so that rectification can take place
– Published reports that provide accessible public information about the quality of cancer services
– Timely information for local commissioning as well as for specialised commissioners in the designation of cancer services
– Validated information which is available to other stakeholders

The Peer Review Programme
– Peer Review Visits – targeted
– External Verification of Self Assessments – a sample each year
– Internal Validation of Self Assessments – every other year (half of the topics covered each year)
– Annual Self Assessment – all teams/services

The National Schedule

Manual for Cancer Services: Policy Documents, Measures & Cancer Peer Review

Measures Development
– Developed by an expert group
– Aimed to measure areas detailed in national documentation, e.g. NICE Improving Outcomes Guidance, and national reports such as the NCAG and NRAG reports
– 3 month consultation on new measures

Focus for the Measures
– The commissioning of services
– Inter-professional communication
– Co-ordination of care
– User involvement
– User/carer experience
– Information
– Access to services

Characteristics of the Measures: objective; specific; discriminating; clear and unambiguous; developmental; clear about who is responsible; measurable; verifiable; achievable.

Consultation Process
– Measures published on DH website and on CQuINS
– Proforma for comments
– Consultation events:
– Brain & CNS and Sarcoma: 17th March, London, Holiday Inn Bloomsbury; 29th March, Leeds, Queens Hotel
– TYA: 21st March, London, Holiday Inn Bloomsbury; 28th March, Leeds, Queens Hotel

Consultation Process
– All comments collated and considered
– Panel / editing meeting
– Final publication

New Measures

Session 2 Self Assessment

The Self Assessment Process
– Quality Measures
– Evidence Documents
– SA Report

Self Assessment Report
– Forms part of the self assessment
– Short summary report completed by the lead clinician
– Commentary that reflects the level of compliance with the measures, patient experience and clinical outcomes
– Includes development and achievements over the past year

Self Assessment Report – Key Themes
– Structure and Function
– Co-ordination of Care/Pathways
– Patient Experience
– Clinical Outcomes/Indicators

MDT Key Themes

Structure and Function
This can be demonstrated through compliance with:
– any measures that relate to MDT leadership, membership, attendance and meeting arrangements;
– any measures within the operational policies section regarding which patients are reviewed by the MDT;
– the % of time MDT core members devote to this cancer type;
– training requirements of MDT members;
– responsibilities of nurse MDT members;
– MDT workload data and surgical workload data.

Structure and Function
This section of the report requires specific answers to:
– Are all the key core members in place?
– Does the MDT have a clinical nurse specialist?
– What is the compliance with waiting time standards?
– How many patients, by equality characteristic (race, age and gender), were diagnosed/treated in the previous year?

Coordination of Care/Patient Pathways
This can be demonstrated through compliance with any measures that relate to the existence of a coordinated and patient-centred pathway of care, for example:
– any measures relating to agreement of network guidelines and patient pathways;
– recording of treatment planning decisions;
– key worker and principal clinician policies;
– communication with GPs.

Patient Experience
This relates to the collection of information on, and achievement of improvements to, service delivery and patient experience, and gaining feedback on patients' experience. It may include information associated with:
– enhanced recovery programmes;
– communication with and information for patients;
– other patient support initiatives;
– service improvement initiatives such as process mapping and capacity and demand analysis;
– information from the National Cancer Patient Experience Survey.
It is important to demonstrate any measurable change in performance regarding these parameters, compared to previous assessments.

Patient Experience This section of the report requires specific answers to: What are the national patient experience survey results / local patient experience exercise feedback results?

Clinical Outcomes/ Indicators Where available the data from the clinical indicators should be used. You should comment separately on each indicator. Where national clinical indicators for the team’s cancer site have not yet been agreed for the peer review please identify and comment on the top five clinical priority issues for your team. It is important to demonstrate any measurable change in performance regarding these parameters, compared to previous assessments. Relevant measures include any relating to data collection, relevant network audits and research activity.

Clinical Outcomes/Indicators
This section of the report requires specific answers to:
– What are the major resection rates?
– What are the mortality rates within 30 days of treatment?
– What is your recruitment to trials?
– What are the outcomes of any key audit projects?

Network Site Specific Group Key Themes

Structure and Function
This can be demonstrated through compliance with any NSSG measures on:
– membership;
– terms of reference and generic NSSG functions;
– configuration of all that site's MDTs in the network;
– the relationship of MDTs and NSSGs with those in other networks comprising a site-specific supranetwork arrangement.
In addition, there should be a particular focus in the report on any need for compliance with ground rules for networking and progress towards implementing any IOG recommendations on network configuration issues.

Coordination of Care/Patient Pathways
This can be demonstrated through compliance with any NSSG or site-specific network measures involving guidelines and patient pathways across networks and supranetworks. This includes:
– guidelines and pathways between the MDTs of the site specialty under review;
– guidelines and pathways regarding shared care between MDTs and other parts of the network infrastructure, such as children's, TYA and late effects MDTs, MDTs of other site specialties, specialist palliative care MDTs and 'cross cutting' groups;
– information from related initiatives not covered by the measures;
– the range of performance of the MDTs (with special focus on outliers).

Patient Experience
This relates to the collection of information on, and achievement of improvements to, service delivery and patient experience, and gaining feedback on patients' experience. It may include information associated with:
– enhanced recovery programmes;
– communication with and information for patients;
– other patient support initiatives;
– service improvement initiatives such as process mapping and capacity and demand analysis;
– information from the National Cancer Patient Experience Survey.
It is important to demonstrate any measurable change in performance regarding these parameters, compared to previous assessments.

Patient Experience This section of the report requires specific answers to: What are the national patient experience survey results / local patient experience exercise feedback results?

Clinical Outcomes/Indicators
The section should comment on the range of performance of the MDTs regarding their Clinical Indicators/Clinical Outcomes. There should be special focus on:
– outliers in the network and the relative performance of the MDTs and the network in relation to the national range;
– results of some network audit projects;
– clinical research;
– the range of performance of the MDTs (with special focus on outliers).
Where national clinical indicators for the team's cancer site have not yet been agreed for the peer review, please comment on the top five clinical priority issues identified by the MDTs in the network and the MDTs' performance regarding these.

Self Assessment Report
– Will be a public document
– Will form the basis of the Annual Peer Review Report for those teams not subject to internal validation
– Handbook contains guidance on identifying Immediate Risks, Serious Concerns and Concerns

MDT Evidence Documents (only required every other year)
Operational Policy
– Describes how the team functions and how care is delivered across the patient pathway
– Outlines policies/processes that govern safe, high quality care
– Agreement to and demonstration of the clinical guidelines and treatment protocols for the team
Annual Report
– Summary assessment of achievements & challenges
– Demonstration that the team is using available information (including data) to assess its own service: MDT workload & activity data (activity by modality, surgical workload by surgeon, numbers discussed at MDT, MDT attendance), national audits, local audits, patient feedback, trial recruitment, work programme update
Work Programme
– How the team is planning to address weaknesses and further develop its service
– Outline of the team's plans for service improvement & development over the coming year: audit programme, patient feedback, trial recruitment, actions from previous reviews

Network Group Evidence Documents (only required every other year)
Constitution
– The group's terms of reference, including a description of how the group is constituted and how it functions
– Description of how the Network Group links to individual MDTs within the Network
– The current clinical and referral guidelines agreed by the group
– The agreed structure and scope of the service delivered across the Network
Annual Report
– Summary assessment of achievements & challenges
– Demonstration that the group is using available information (including data) to assess Network services
– Summary of patient feedback and audit data
– Summary update on implementation of the previous year's work programme (including progress on implementing actions from previous reviews)
Work Programme
– How the group is planning to address weaknesses and further develop Network services
– Outline of the group's plans for Network-wide service improvement & development over the coming year
– Should include addressing actions from previous peer reviews where relevant

Demonstrating Agreement Where agreement to guidelines and policies is required there should be a statement on the front cover of the document indicating the groups and individuals that have agreed the document and the date of agreement. Evidence Guides will indicate the groups and individuals that need to be documented as agreeing the key evidence documents.

Evidence Guides
– Guidance to help you structure your evidence documents
– Guidance for Compliance
– Additional Guidance
– Always refer to the full measure in making assessments against measures

Clinical Indicators/Outcomes

Development of Clinical Indicators
– Increasing focus on addressing key clinical issues and clinical outcomes
– Clinical indicators developed in conjunction with SSCRGs and relevant tumour-specific national bodies

Rationale
– Increased range of possible diagnostic and treatment interventions
– Subsequent guidance issued by NICE incorporated into peer review discussions
– Supporting the overall aims of Improving Outcomes – A Strategy for Cancer

Principles of Clinical Indicators
– The data should be available nationally or readily available locally
– Not intended to require further audit in themselves
– Metrics which can be used as a lever for change and for reflection on clinical practice and outcomes
– They may be lines of enquiry around clinical practice, or around collection of data items, rather than enquiry focused on the data itself
– May cover key stages along the patient pathway, including diagnosis, treatment and follow up
– There should be some consensus on national benchmarking data which can be used to inform the discussions

Development of Clinical Lines of Enquiry
– Clinical Indicators
– Data in relation to the indicators – national/local
– Clinical Lines of Enquiry – briefing sheet identifying the questions reviewers will ask in relation to the clinical indicators, based on the data

Clinical Lines of Enquiry
Conclusions from clinical discussions with review teams will be supportive in:
– highlighting significant progress and/or good clinical practice
– identifying challenges faced in providing a clinically effective service
– identifying areas where a team/service may require support/development to maximise its clinical effectiveness

Clinical Indicators
– Key clinical issues will be highlighted through discussion and review of existing evidence and information
– Not intended to identify IRs or SCs

Progress to Date
– Pilot with Lung and Breast almost complete – feedback positive
– CLEs developed in Upper GI, Gynaecology, Colorectal and Head & Neck for implementation in 2011–2012 reviews
– CLEs to be developed for Sarcoma, Brain and CNS, Skin and Urology

Lung Clinical Lines of Enquiry
Key headline indicators:
– The % of expected cases on whom data is recorded
– The % histological confirmation rate
– The % having active treatment
– The % undergoing surgical resection (all cases excluding mesothelioma)
– The % of small cell cases receiving chemotherapy

Breast Clinical Lines of Enquiry
Key headline indicators – national data:
– Percentage of women offered access to immediate reconstruction surgery by the MDT or by referral on to another team, and rate of uptake
– Ratio of mastectomy to Breast Conserving Surgery (BCS)
– Each surgeon managing at least 30 new cases per year
– Average length of stay for breast cancer with any surgical procedure
– The one-, two- and five-year survival rates
Key headline indicators – local data:
– Proportion of women tested for HER2 prior to commencement of drug treatment (if undergoing resectional surgery and receiving adjuvant or neo-adjuvant chemotherapy)
– Availability of screening and estimated impact on workload of the extended programme
– Availability of digital mammography

Preliminary Feedback
– The focus of discussion moved from structure and process to more clinically relevant issues
– Many teams have used the figures as the basis for audits of their practice, to understand why they are outliers
– Highlighted issues with completeness of data collection, the process for clinical validation, and whether outcomes are regularly reviewed and acted upon by the MDT
– Driven the impetus for clinical teams to work with their trusts to address the infrastructure needed to support data collection

Session 3 Internal Validation of Self-Assessments

Internal Validation – The Purpose
– To ensure accountability for the self assessment within organisations and to provide a level of internal assurance
– To develop a process whereby internal governance rather than external peer review is the catalyst for change
– To confirm that, to the best of the organisation's knowledge, the assessments are accurate and therefore fit for publication and sharing with stakeholders
– To identify and share areas of good practice

Who Validates? (Service – Responsibility for Validation)
– MDT – Host Trust
– Cross Cutting Service – Host Trust
– Locality Group – Host Trust
– NSSG – Host Network Management Team
– Network Cross Cutting Group – Host Network Management Team

Internal Validation – What We Expect
– The process is agreed within the organisation
– The process adopted has the agreement of the commissioners within the locality and the cancer network
– Accountability for the self assessments is confirmed by agreement of the chief executive of the organisation
– There is commissioner and patient/carer involvement within the process
– The process and outcome of the validation are reported on the nationally agreed proforma

Internal Validation – Suggested Approaches
Desk-Top Review
– Small panel reviews and validates the assessment
Panel Review
– Small panel reviews the assessment
– Meets with representatives of the MDT/NSSG to discuss key issues and finalise the validation

Internal Validation – The Process
– Agreed validation process takes place
– Further clarification may be sought on some issues / opportunity for re-submission of specific evidence
– Validation report agreed
– Validated compliance recorded on CQuINS
– Validation report uploaded

Advice on Involving Patients/Carers
– They should be nominated by either the Network or Locality User Group
– They should not be involved in validation of an MDT that has provided their care or treatment
– They should not be in current treatment and should be at least 2 years post initial diagnosis/treatment
– They must be supported in clearly understanding what is being asked of them

The Internal Validation Report
– Will be a public document
– Will form the basis of the Annual Peer Review Report for those teams not subject to external review
– Handbook contains guidance on identifying Immediate Risks, Serious Concerns and Concerns

Session 5 CQuINS

Using CQuINS V4
– Available via the web site at:
– Secure web-based database supporting each stage of the cancer peer review process
– Records assessments, compliance with the measures and reports
– Provides information for national analysis and reporting

Completing the Self Assessment

1. Upload Key Documents (alternate years only)
2. Enter Compliance on CQuINS
3. Complete Team Report

Completing the Self Assessment

Upload Key Documents

Enter Compliance

Complete Overview Report

Session 6 Evidence Requirements

Self Assessment
– Key Documents – teams/services should ensure the evidence requirement stated for each measure is included in one of the key documents (i.e. operational policy, annual report, work programme); if not, it should be included as an appendix.
– Additional Evidence – if the actual evidence is not included in the documents uploaded to CQuINS, the team should include a statement making clear that this evidence requirement has been checked by the team/service and would be available if a peer review team were to visit.
– Use of Internet Hyperlinks – it is acceptable for teams/services to include internet hyperlinks, but these links must have open access and not be on the closed section of the trust or organisation intranet system.

Internal Validation
– Key Documents – ensure all the evidence required against the measures for a team/service has been checked and is available on the CQuINS database via the key documents.
– Additional Evidence – if any evidence is not available on the CQuINS system, the internal validation panel should confirm they have seen the evidence or give details of the spot checks they have undertaken.
– Confirmation – this should be made clear on the internal validation report form. It is not sufficient to give an overall statement that all evidence has been seen; details of the specific evidence seen against measures should be identified and noted on the compliance spreadsheet.

Peer Review Visit
– Key Documents – a full copy of all evidence uploaded onto CQuINS must be available to reviewers on the peer review visit. This can be either hard copy or electronic.
– Patient Records – peer review zonal teams will normally request 5 sets of patient notes in order to check compliance against the measures. Teams may sometimes require more than 5 sets of patient notes, but this should never exceed 10. Only clinical NHS staff will review patient notes.

General Principles
Personal Details / Patient Information
– It is essential that no identifiable patient data, including hospital numbers, is uploaded to the CQuINS database.
– The personal details of individual staff in a team/service should not be uploaded, e.g. certificates or job plans.
– Individuals should not be identified in reports uploaded onto CQuINS; reports should refer to the roles they carry out.

General Principles
Agreements
– The role of the person indicated on the agreement should include any delegated role they are undertaking for others.
– The front cover of any document uploaded should show the date, version and planned review date.

General Principles
Configuration of the Network
– The configuration of the network is essential to the review of a particular tumour site and to ensuring compliance against the Improving Outcomes Guidance.
– Details of PCT referral pathways and populations are essential.
Membership
– When a measure asks for the membership of a group, the name, role and organisation the individual represents should be indicated in the evidence.

General Principles
Patient Information
– Does not require uploading to CQuINS
– Copies available for the IV panel and the Peer Review Team
– The IV report should confirm that the patient information has been seen and that it covers all the essential elements of the measure
– At self assessment, the team/service should list the patient information they have in the key documents uploaded on CQuINS
Patient Experience Exercise
– A summary of the exercise, including the key points and actions implemented, is sufficient in the key documents
– A copy of the patient exercise should be available for both peer review and IV; the IV assessment should confirm this has been seen
– The national cancer patient survey would be acceptable for this measure

General Principles
Key Worker
– The key documents should confirm that details of the key worker, including a contact number, are given to each patient.
– IV should confirm the details of the key worker can be found in patient notes by completing a spot check.
– Peer review teams will spot check patient notes.

Specific Evidence Requirements
Working Practice of a Team / Spot Checks
– Where measures ask reviewers to ask about the working practice of teams/services or to undertake spot checks, they will do this when on a review.
– IV should mirror this and include comments in the IV report.
– For self assessment, teams/services should state that they have completed a spot check and give the results of the spot check, or give details of the working practice.
Annual Meetings
– It is only necessary to make a statement in the key documents confirming the time/date of the meeting and that a record has been made.
– IV should confirm this meeting has taken place.
– If it is unclear that a meeting has taken place, reviewers on a peer review visit may ask for minutes of the meeting.

Specific Evidence Requirements
Attendance Records / Meeting Dates
– This can often be satisfied by one clear piece of evidence showing: the dates of the meetings; and the name, role and organisation represented of those who attended each meeting.
– The SA report form should comment on any roles not covered or attending appropriately.
– Any summaries of attendance should demonstrate individual attendance at each meeting for all members, as well as the summary.
Policies / Guidelines / Plans
– The date and version should be shown on all policies, guidelines and plans.
– These should be uploaded on CQuINS, either as an internet hyperlink (see above) within the key documents or in the appendix.
– Where national guidelines have been adopted, the local context should be explained.
– Flow charts are an acceptable means to explain details within guidelines.
– If it is unclear that a meeting has taken place to sign off the guidelines, policies and plans, reviewers on a peer review visit may ask for minutes of the meeting.

Specific Evidence Requirements
Network-wide Minimum Dataset
– It should be clearly shown what is collected and who is reporting on which parts of the dataset. This should include data collection for national audits where they exist.
Audit
– Audits should be clinical rather than performance-based (i.e. not two-week waits).
– National audits are acceptable as a network audit, but outcomes against a national audit should still be demonstrated.
– Dates of the meetings where audits have been presented must be clearly shown in the key documents, as should any outcomes.
Clinical Trials
– Lists of trials should show which MDTs are expected to participate in which trials.
– Actions should indicate how recruitment will be improved, not restate the problem.
Distribution Lists
– A list of who is on the distribution list is sufficient for self assessment.
– It is not necessary to see proof of distribution, but a team should confirm whether they have distributed a document or policy.

Session 7 External Verification & Peer Review Visits

External Verification – The Purpose
– Verify that self-assessments are accurate
– Check consistency across organisations
– Ensure that a robust process of self-assessment and internal validation has taken place
– Provide a report on performance against the measures and associated issues relating to IOG implementation
– Identify teams or services who will receive an external peer review visit in accordance with the selection criteria

External Verification – The Process
– Desk-top review of the validated assessment undertaken by the Zonal Quality Team
– Review of accuracy of self-assessment
– Zonal Team may request further information
– Zonal Team will have access to specialist clinical input and patient/carer input

External Verification – The Outcome
– Signed off by the Quality Director and Clinical Lead
– If an organisation is unhappy with the outcome, there will be the opportunity for dialogue with a view to finding a solution
– Verified assessment scores recorded – changes will be explained on CQuINS
– National report uploaded to CQuINS / published

Annual Meeting with Network
– December each year
– The purpose of the meeting will be to:
– inform the Zonal team of key issues within the Network, such as implementation of Improving Outcomes Guidance and service configuration changes
– discuss the teams to be visited and the schedule for the following year

Peer Review Visit Criteria
– Milestones not met for implementation of an IOG as agreed with the CAT
– Immediate Risks identified at previous peer review visits that have not yet been resolved
– Requests from organisations, i.e. SHAs, local and specialist commissioners, PCTs, Networks, Acute Trusts
– % compliance with measures within the lowest performance grouping
– Concerns regarding the rigour of Internal Validation
– Stratified random sample based on % compliance (if capacity is available)

The Peer Review Visits
– Notified in January of each year
– Scheduled between April and March each financial year
– Each Network will be visited at the same point of the visit schedule each year

The Peer Review Visit Plan
– January: notification to teams to be peer reviewed during May–March
– + 2 weeks: deadline for submission of evidence for all teams to be visited
– + 4 weeks: Self Assessment evidence and compliance matrix sent to reviewers and copied to teams (preparation for review)
– May–March: visits. Each Network is allocated one month; a Network can take from 1 to 4 weeks to complete – normally 1 day per Locality
– + 8 weeks: report published 8 weeks after the last review day

The Visit Day
– Maximum of 3 concurrent sessions, am & pm
– Maximum of 6 teams will be reviewed in 1 day
– Example session: 1.5 hours peer review team preparation; 1.5 hours peer review meeting with the team being reviewed; 1.5 hours peer review team report writing

Peer Review Teams
– Between 2 and 5 reviewers per session
– Plus a member of the Zonal Quality Team
– Reviewers should normally include "peers" – people who are trained and working in the same discipline as those they are reviewing

Peer Review Teams May Include: User/carer; MDT Lead Clinician; Clinical Nurse Specialist; Radiologist; Pathologist; Oncologist; Medical Physicist; Therapy Radiographer; Oncology Pharmacist; Chemotherapy Nurse; Palliative Care Consultant; Trust Lead Clinician, Nurse or Manager; Network Lead Clinician, Nurse or Manager; PCT Cancer Lead; Cancer Commissioner; Dietician

Which Team Members Should Attend the Review?
MDT Review:
– Lead clinician and CNS
– with other core members (e.g. surgeon, oncologist, radiologist, pathologist, palliative care)
– not the whole extended team
NSSG Review:
– Chair of NSSG
– Small group of other key NSSG members

Session 8 Outcomes from the Process

Outcomes from the Process
– Annual Network Reports
– National "State of the Nation" Reports
– Joint working between the Care Quality Commission (CQC) and the NCPR Programme
– Information for commissioners

Outcomes of the Process – Network Reports
– Published in January and June each year
– Including SA, IV, EV and PR visit assessments
– Executive summary from the Quality Director
– QD will discuss key issues with the Network

Post Review Actions
– Recommendations from IV, EV or PR visits picked up within work programmes / reported on in annual reports
– Separate process for actions regarding Immediate Risks and Serious Concerns – written notification and written response

Session 9 Next Steps and Close

Next Steps
– Revised measures published on CQuINS
– Revised Handbook
– Evidence guides
– Example evidence documents
– Example reports
– Training days:
– 8th March, London
– 11th March, Leeds
– 1st April, Birmingham
– 8th April, Taunton
– 13th April, London

Schedule of Teams for Internal Validation
2011/12 (Introduction Year): Acute Oncology; Chemotherapy; Teenage and Young Adults; Sarcoma; Brain and CNS; Gynaecology; Urology; Network Service User Partnership Group; Rehabilitation; Complementary Therapy; Psychology
2012/13 (Even Years): Breast; Lung; Colorectal; Upper GI; Head and Neck; Skin; Cancer Research Network; Radiotherapy; Children's; Cancer of Unknown Primary; Specialist Palliative Care; Haematology
2013/14 (Odd Years): Acute Oncology; Chemotherapy; Teenagers and Young Adults; Sarcoma; Brain and CNS; Gynaecology; Urology; Network Service User Partnership Group

The National Schedule

Thank You. Any Questions?