
1 Building Quality Improvement Partnerships in the VA: The Colorectal Cancer Care Collaborative QUERI National Meeting Phoenix, AZ December 2008 George L. Jackson, Ph.D., MHA; Leah L. Zullig, MPH Adam A. Powell, Ph.D., MBA; Diana L. Ordin, MD, MPH Dawn T. Provenzale, MD, MS

2 C4: Colorectal Cancer Care Collaborative –Began in 2005 –To assess and improve the quality of colorectal cancer care from screening and diagnosis through treatment –Phase I (Diagnosis): screening; presentation with symptoms through diagnosis –Phase II (Treatment): period from diagnosis of CRC through treatment & follow-up

3 Time Line
Summer 2005: Collaborative planning
Sept. 2005 – Sept. 2006: Collaborative Phase 1 (screening result → diagnosis)
Oct. 2006 – Present: Spread of lessons from Phase 1
March 2007 – March 2008: Collaborative Phase 2 (diagnosis → treatment)

4 Today’s Workshop How did we get started on the collaboration? Overview of the Colorectal Cancer Care Collaborative Measurement challenges Building a measurement system Spreading lessons to the VA Lessons for QUERI investigators

5 Why C4? Initiated in 2005: Earlier CMO study suggested timeliness problems QUERI research results demonstrated gaps in colorectal cancer diagnosis and treatment OIG report Congressionally-mandated review of cancer care (GPRA – Government Performance and Results Act) –Colorectal, breast, lung, prostate, hematologic

6 Colorectal Cancer –Second leading cause of cancer death –Third most common type of cancer among men and women in the United States –11% of all new cancer cases –90% five-year survival when diagnosed at stage I –5% five-year survival when diagnosed at stage IV Source: VA Colorectal Cancer QUERI Fact Sheet, January 2006

7 CRC Continuum [flow diagram; *CDE = complete diagnostic evaluation; elements: Initial Screen (+/–), Repeat, Signs & Symptoms, CDE*, Adenomas, Cancer, Treatment (Surgery, Chemotherapy, Radiation), Surveillance]

8 Follow-Up of Positive FOBT [bar chart, Facilities A–C, scale 0–250: % of patients with a positive fecal occult blood test scheduled for diagnostic evaluation; % of scheduled patients completing diagnostic evaluation within 1 year; mean scheduling delay, mean appointment time, and mean completion delay]

9 Modifiable Risk Factors for Advanced CRC –549 patients –43% presented with late stage (stage III or IV) colorectal cancer –The only factor associated with presenting with late stage was not having a usual source of health care –Median patient delay – 9 weeks –Median physician delay – 6 weeks –Stage at presentation was not associated with either patient or physician delay Fisher DA, Martin C, Galanko J, Sandler RS, Noble MD, Provenzale D. Risk factors for advanced disease in colorectal cancer. Am J Gastroenterol 2004;99:2019-2024.

10 Modifiable Risk Factors for Advanced CRC Median patient delay – 9 weeks Median physician delay – 6 weeks Stage at presentation was not associated with either patient or physician delay Fisher DA, Martin C, Galanko J, Sandler RS, Noble MD, Provenzale D. Risk factors for advanced disease in colorectal cancer. Am J Gastroenterol 2004;99:2019-2024.

11 OIG Report: CRC Detection and Management in VHA Facilities Feb. 2006 Metrics to evaluate and improve CRC dx timeliness Prioritization process for dx c-scopes Directive addressing timeframes –Pt notification of screening results within 7 working days –Consistent notification and documentation requirement for dx testing

12 OQP Vision Measures and measurement tool development (QUERI/HSR&D) Pilot collaborative project to identify and develop improvement strategies/tools (OQP/SR) National dissemination of project (SR/OQP) –Monitors or Performance Measures to create “pull” for improvement –Ongoing support to facilitate sharing, identification of additional effective strategies/tools

13 OQP Vision Partnership among OQP, researchers, PCS, Advanced Clinical Access/Systems Redesign Strong, ongoing evaluation component

14 Anticipated Challenges Measurement challenges Improvement challenges Dissemination challenges Two phases: diagnosis and treatment Project infrastructure –New partnership model –“Just-in-time” planning Pace and design of project –Sense of urgency –Cultural “clashes” Research vs. operations Anecdote vs. evidence

15 Anticipated Outcomes and Products Measurement –Standardized facility-level approaches for QI measures –Real-time measurement tools –Documentation of barriers to national measurement Improvement tools/strategies Dissemination mechanism –Improvement before external review published Lessons on how to do this better next time –Project organization and partner roles –C4-type collaborative

16 The Partnership Quality Enhancement Research Initiative (QUERI) –CRC expertise in measurement and improvement Office of Quality and Performance (OQP) –Performance measurement expertise –Quality improvement expertise Systems Redesign –Expertise in delay reduction –National infrastructure, experience, and tools Patient Care Services –Clinical expertise –Link to VA clinical constituencies

17 C4 Planning Committee Organizes the collaborative Includes representatives from all partner organizations and other VA collaborative experts Subcommittees –Measurement Issues –Collaborative Operations –Dissemination

18 Optimizing the Partnership Dialogue is critical! –Initial QUERI-provided measures were critiqued by the field –C4 works with the field to develop better measures –Some may inform national data systems and some may remain local improvement tools –OQP, DUSHOM and VISN CMOs provide continued support

19 Changing Systems

20 C4 Learning Collaboratives 21 volunteer facilities (one per VISN) in diagnosis collaborative 28 volunteer facilities (at least one per VISN) in treatment collaborative Collaborative: structured sharing with rapid-cycle improvement Planning and facilitation by partner organizations with the involvement of many VA stakeholders

21 Diagnosis Collaborative 21 Improvement Teams: VISN 1 Providence; VISN 2 Buffalo; VISN 3 New Jersey; VISN 4 Pittsburgh; VISN 5 Washington; VISN 6 Beckley, WV; VISN 7 Columbia, SC; VISN 8 San Juan; VISN 9 Lexington, KY; VISN 10 Columbus; VISN 11 Northern Indiana; VISN 12 Chicago (Hines); VISN 15 St. Louis; VISN 16 Houston; VISN 17 Temple; VISN 18 West Texas; VISN 19 Salt Lake City; VISN 20 Portland; VISN 21 San Francisco; VISN 22 Loma Linda; VISN 23 Black Hills, SD

22 Treatment Collaborative 28 Improvement Teams: VISN 1 Providence, VA Connecticut; VISN 2 Buffalo; VISN 3 New Jersey; VISN 4 Pittsburgh, Lebanon PA; VISN 5 Washington; VISN 6 Beckley WV, Salisbury NC; VISN 7 Columbia, SC; VISN 8 Gainesville; VISN 9 Lexington, KY; VISN 10 Dayton; VISN 11 Northern Indiana; VISN 12 Chicago (Hines); VISN 15 St. Louis; VISN 16 Houston; VISN 17 Temple; VISN 18 West Texas, Albuquerque; VISN 19 Salt Lake City; VISN 20 Portland, Puget Sound; VISN 21 San Francisco; VISN 22 Loma Linda, San Diego; VISN 23 Black Hills SD, Nebraska/W. Iowa

23 C4 Learning Collaborative Process Flow-mapping and initial data collection –QUERI measurement using CPRS data –Local measurement Setting aims Plan-Do-Study-Act (PDSA) cycles Coaches aid in the improvement process Collaborative sharing via in-person meetings, monthly national calls, monthly reports to coaches and senior leaders, updates to VA leadership, website, and listserv

24 Collaborative Process [process diagram]: Team selection and commitment → In-Person Meeting (flow mapping, baseline measures, aim setting) → Plan-Do-Study-Act cycles (changes and measurement) → Structured sharing (e.g., national calls) and reports to C4 and leadership → Dissemination

25 C4 Team Composition –Facility Management: Facilities volunteered for the collaborative; applications signed by the medical center director, chief of staff, and nursing executive; sites chosen to provide diversity in size, complexity, and geography –Team Formation: Teams include physicians, nurses, and other representatives from the involved clinical services; designated project manager; information technology representative

26 C4 Team Activities –Flow-mapping and initial data collection –Setting aims –Plan-Do-Study-Act (PDSA) cycles –Coaches aid in the improvement process –Collaborative sharing via in-person meetings, monthly national calls, monthly reports to coaches and senior leaders, updates to VA leadership, website, and listserv

27 Model for Improvement PDSA – Rapid Cycle Improvement [Plan-Do-Study-Act cycle diagram] What are we trying to accomplish? How will we know that a change is an improvement? What changes can we make that will result in an improvement? PDSA slides courtesy of Jim Schlosser, MD, MBA

28 The PDSA Cycle for Learning and Improvement
Plan: objective; questions and predictions (why); plan to carry out the cycle (who, what, where, when)
Do: carry out the plan; document problems and unexpected observations; begin analysis of the data
Study: complete the analysis of the data; compare data to predictions; summarize what was learned
Act: what changes are to be made? Next cycle?

29 Examples of PDSA Cycles Reduction of appointment types will increase appointment availability → improved access [ramp of PDSA cycle diagrams] Cycle 1: Define a small number of appointment types and test with staff Cycle 2: Compare requests for the types for one week Cycle 3: Test the types with 1–3 providers’ patients Cycle 4: Standardize appointment types and test their use Cycle 5: Implement standards and monitor their use

30 Testing Multiple Changes Overall Aim: improve timeliness, reliability, and patient focus of CRC treatment [parallel PDSA ramps, one per change area] Improve pathology reporting; shorten staging work-up; improve treatment concordance; improve patient education

31 Systems Redesign / Advanced Clinic Access Scientifically based principles System/process redesign Everything improves Requires measurement Delay elimination http://srd.vssc.med.va.gov

32 High Leverage Changes to Eliminate Delay Access –Match supply & demand daily –Reduce the backlog –Decrease appointment types –Develop contingency plans –Reduce demand –Increase supply / optimize the team Office Efficiency –Balance supply & demand for non-appointment work –Synchronize patient, provider, & information –Predict & anticipate patient needs –Optimize rooms & equipment –Manage constraints

33 Service Agreements Purpose –Specialists can’t do everything best –PC can’t do everything best –Best utilization of resources Elements 1. Define the work: it is not “NO” work, it is not “ALL” work; it is the work that only I can do (colonoscopy) 2. The sender agrees to send the right work packaged the right way: referral templates, guideline driven, all the information to safely complete the procedure 3. The receiver agrees to do the work right away

34 CRC Screening Process: Problems All Along the Path [flow diagram: eligible population → annual FOBT screen → positive? → procedure consult → schedule procedure → do colonoscopy → cancer? (yes → refer for treatment → definitive treatment; no → follow-up in 10 years). Problems along the path: patient doesn’t return cards; long waits, backlog; no-shows; inadequate prep]

35 CRC Dx Improvement Strategies Decrease inappropriate screening Strengthen service agreements/consult templates Improve patient colonoscopy prep Track positive screens to ensure follow-up Use fee-basis or contract colonoscopies to eliminate the backlog Add permanent staff Other (LOTS!!!)

36 CRC Tx Improvement Strategies CPRS enhancements (clinical reminders, quick orders, templates) are useful to ensure guideline reliability/timeliness Cancer care coordinators streamline process for patients, monitor care, and can maintain contact for lengthy surveillance

37 C4 Data Collection Phase I –Baseline data –Two process evaluation surveys –Qualitative interviews with site team leaders –Outcome data (to be collected) Phase I Spread –National facility survey –National success case method interviews –Monitor data Phase II –CCQMS dataset –Pre-intervention assessment Organizational readiness to change –Process Evaluation Survey

38 Process Change (N = 128 to 131 Facilities)
Strategy: % Fully Implemented / % In Process of Implementing / % Not Implementing
Strategies to decrease cancellations/no-shows: 82% / 12% / 6%
Create/revise PC/GI service agreement: 64% / 22% / 14%
Consult template revision: 59% / 25% / 16%
Track colonoscopy supply and demand: 56% / 28% / 16%
Form a multidisciplinary improvement team: 56% / 22% / 22%
Revise colonoscopy prep education and/or protocols: 54% / 21% / 25%
Participate in an improvement collaborative: 51% / 21% / 28%
Initiate/increase use of fee-based colonoscopies: 44% / 16% / 40%
Revise CRC screening clinical reminder: 43% / 31% / 26%
Create system for tracking FOBT+ patients: 42% / 38% / 20%
Track number of inappropriate FOBTs: 33% / 28% / 39%
Hire additional nurses/other staff for colonoscopies: 29% / 33% / 38%
Track number of incomplete colonoscopies: 28% / 20% / 52%
Hire additional colonoscopists: 23% / 35% / 42%
Add additional endoscopy suites: 15% / 27% / 58%
Contract additional onsite colonoscopists: 15% / 18% / 67%
(Strategy categories: Process Improvement, QI Infrastructure, GI Capacity Building)

39 What have been the most significant barriers to improvement?

40 Measurement Challenges Develop a timely measure of timeliness –A reasonably short hurdle (% receiving follow-up within 60 days) works better than mean/median time to follow-up, which cannot be calculated promptly while some patients are still awaiting follow-up Ideally the same measure(s) will be useful both within facilities (QI) and between facilities (evaluation)

41 Local FOBT Tracking Tool Features: Ease of input Tracks most relevant indicators of improvement Generates run charts Adaptable to evolving data needs Facilitates reporting of FOBT Follow-up monitor data

42 FOBT Follow-up Monitor Self-reported –Currently, lack of standardization within VistA across facilities makes centralized collection of monitor data difficult –Tradeoff between rigor and data collection burden on sites Generated from local QI tracking system (most use the nationally developed FOBT Tracking Tool) Evolving as definitional issues are encountered

43 FOBT Follow-up Monitor Which FOBTs should be included? Inappropriate screening? (e.g., recently screened, limited life expectancy) Patient refusals? Patients going outside of the VA for follow-up?

44 FY09 FOBT Follow-up Monitor Proportion of patients with a positive colorectal cancer (CRC) screening FOBT who have a diagnostic colonoscopy < 60 days after the positive screening FOBT. Numerator: those in the denominator who had a complete diagnostic colonoscopy < 60 days after the positive CRC screening FOBT Denominator: number of patients with a positive CRC screening FOBT in the measurement month Exclusions: –Patients who refuse follow-up colonoscopy –Patients who choose to have follow-up colonoscopy outside (i.e., neither performed nor paid for by) the VA –Patients determined to be clinically “inappropriate” for colonoscopy –Patients who have had a previous positive FOBT in FY09 –Patients whose FOBT was not performed as a CRC screening FOBT.
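To make the FY09 monitor definition concrete, here is a minimal sketch of how a local tracking tool might compute it. It is illustrative only: the record layout, field names, and example dates are hypothetical and are not drawn from the actual FOBT Tracking Tool or VistA.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class FobtCase:
        """One positive CRC screening FOBT in the measurement month (hypothetical record layout)."""
        fobt_date: date
        colonoscopy_date: Optional[date] = None   # date of completed diagnostic colonoscopy, if any
        refused_colonoscopy: bool = False
        followed_up_outside_va: bool = False      # neither performed nor paid for by the VA
        clinically_inappropriate: bool = False
        prior_positive_fobt_in_fy: bool = False
        not_screening_fobt: bool = False          # FOBT not performed as a CRC screening test

    def excluded(case: FobtCase) -> bool:
        """Apply the monitor's exclusion criteria."""
        return (case.refused_colonoscopy or case.followed_up_outside_va
                or case.clinically_inappropriate or case.prior_positive_fobt_in_fy
                or case.not_screening_fobt)

    def fobt_follow_up_rate(cases: list) -> Optional[float]:
        """Proportion of eligible positive FOBTs followed by diagnostic colonoscopy in < 60 days."""
        denominator = [c for c in cases if not excluded(c)]
        if not denominator:
            return None
        numerator = [c for c in denominator
                     if c.colonoscopy_date is not None
                     and (c.colonoscopy_date - c.fobt_date).days < 60]
        return len(numerator) / len(denominator)

    cases = [
        FobtCase(date(2008, 10, 3), date(2008, 11, 12)),        # colonoscopy at 40 days: counts
        FobtCase(date(2008, 10, 7), refused_colonoscopy=True),  # excluded from the denominator
        FobtCase(date(2008, 10, 20), date(2009, 1, 15)),        # colonoscopy at 87 days: misses
    ]
    print(f"FY09 monitor: {fobt_follow_up_rate(cases):.0%}")    # -> FY09 monitor: 50%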

45 Monitor Validation Planned independent assessments: –C4 Post-intervention Evaluation –Partin grant DETECT – “Determinants of Timely Evaluation Colonoscopy for CRC+ screening Tests” –Powell CDA manual chart review project –EPRP abstraction

46 Colorectal Cancer Care Measurement System These measures, when mapped to NCCN Guidelines, will: a. Identify facility-level gaps in care to patients b. Identify facility-level deviations from established standards of patient care c. Identify systemwide gaps in care to patients d. Identify systemwide deviations from established standards of patient care

47 CCQMS Development Process –Solicited input from VA constituencies Office of Patient Care Services Oncology Field Advisory Committee Team of members at participating sites –Developed specific quality indicators and measures Sample quality indicator: proportion of patients with resected colon cancer with ≥ 12 lymph nodes examined by pathology –Indicators and measures form the basis for computerized measurement and analyses tools
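As an illustration of how the sample quality indicator above might be computed from collected data, here is a minimal sketch; the field names are hypothetical and are not taken from the CCQMS data dictionary.

    from typing import Optional

    def lymph_node_indicator(patients: list) -> Optional[float]:
        """Proportion of patients with resected colon cancer who had >= 12 lymph nodes examined."""
        denominator = [p for p in patients if p.get("resected_colon_cancer")]
        if not denominator:
            return None
        numerator = [p for p in denominator if p.get("lymph_nodes_examined", 0) >= 12]
        return len(numerator) / len(denominator)

    print(lymph_node_indicator([
        {"resected_colon_cancer": True, "lymph_nodes_examined": 15},
        {"resected_colon_cancer": True, "lymph_nodes_examined": 8},
        {"resected_colon_cancer": False},                        # not in the denominator
    ]))  # -> 0.5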

48 CCQMS Development Process –Facilities collect measurement data from VA computer systems –Information provided to C4 participants –During the improvement collaborative, facility and VA-wide reports are being produced –A goal is to increase data extraction capabilities over the course of the collaborative –Potential to serve as a model for other cancer care quality measurement efforts

49 CCQMS Operational Design [diagram: Individual Site and National Host (HSR&D / OQP), connected through the CCQMS reporting feature]

50 CCQMS Data Entry

51 CCQMS Reporting Feature –Gives sites immediate feedback on concordance with NCCN guidelines and on their progress in meeting the quality indicators –Displays de-identified facility data for reference and comparison
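A minimal sketch of the de-identified comparison idea, under the assumption that the requesting facility sees its own result by name while other facilities appear only as anonymous letters; the site names, rates, and report layout are invented for illustration and are not the actual CCQMS report.

    import random

    def comparison_report(rates: dict, own_site: str) -> str:
        """Show the requesting site its own rate; label all other sites anonymously."""
        others = [site for site in rates if site != own_site]
        random.shuffle(others)                     # hide any ordering that could identify a site
        lines = [f"Your facility: {rates[own_site]:.0%}"]
        lines += [f"Facility {chr(ord('A') + i)}: {rates[s]:.0%}"
                  for i, s in enumerate(others)]
        return "\n".join(lines)

    print(comparison_report({"Site 1": 0.82, "Site 2": 0.75, "Site 3": 0.68}, own_site="Site 1"))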

52 3 CCQMS Quality Indicator Reports

53

54 CCQMS Timeliness Reports

55 Colorectal Cancer Care Quality Measurement System Next Steps Discussions with VA leadership regarding national dissemination of the tool Partnership with Department of Defense – Collaborators at Walter Reed Army Medical Center Survey Component – University of Minnesota – Patient/Family Experiences Explore options for use in private health care systems

56 Working with the Teams Collaborative work is different from traditional research –Considerable “people factor” in dealing with multidisciplinary groups –Communication and flexibility are key –Balance between rapid-cycle improvement and rigor of data collection –End goal is impact & improvement rather than publication

57 National Dissemination DUSHOM Monitor –Local measurement tool –Quarterly feedback of aggregate results (FOBT measure) –Provide baseline data (CRC treatment) Improvement facilitation –Improvement guide from collaborative –Monthly phone calls –Listserv –Coaching (not yet available)

58 FOBT Follow-Up Monitor Year 1 (FY2007) –Q1: flow map –Q4: % FOBT-positive patients with follow-up within 60 days; improvement progress report Year 2 (FY2008) –Q2-Q4: % FOBT-positive patients with follow-up within 60 days (revised); improvement self-assessment Year 3 (FY2009) –Q2-Q4: % FOBT-positive patients with follow-up within 60 days (revised) –Q4: improvement self-assessment

59 DUSHOM FY2009 CRC Diagnosis and Treatment Monitor Opportunity to spread improvement nationally –Facilities begin to look at their processes for colorectal cancer care –Medical centers identify improvement opportunities, collect data on an indicator in their area of opportunity, begin improvement work Tools/lessons available from the collaborative

60 CRC Treatment Monitor – FY2009 Q2: team, aim, flow map to SR POC Q3: improvement plans for targeted area, including measures to SR POC –Measures menu and tool available Q4: improvement progress report

61 C4 Products Models for improving cancer treatment Measurement systems Models for developing surveys Protocols for collaborative development National dissemination of cancer diagnosis and care improvement

62 Research Working with Operations: C4 Lessons Learning who is who in the VA –Central Office, VISNs, facilities Integrating C4 and changes into facility workflow Tremendous differences among VA facilities –Organization of care, information technology, services provided Managers at different levels have different needs Information technology changes must be considered

63 Comparing Research & Operations Research –Primary goal is to improve the care of veterans and others –Focus on generalizable knowledge –Pressure to publish and get grants –Generally smaller number of players Operations –Primary goal is to improve the care of veterans and others –Focus on implementation in real time –Pressure to respond to organizational demands and stakeholders –Huge number of players

64 Lessons for Investigators Wonderful opportunity to make a difference in patient care Allows for an extensive network to be built Can take up a great deal of time –Often responding to the needs of many different stakeholders Researchers have different career goals than collaborators Can require creativity to produce academic products –These can still take longer than usual

65 C4 Planning Committee Jacki Bebb, BSB/M Hanna E. Bloomfield, MD, MPH Deborah Cortez, MPH, CHES Cody Couch Michael Davies, MD Carrie Dekorte, PharmD Jill Edwards, NP Fabiane Erb, BA David Haggstrum, MD, MAS Theresa Hellings, RD Janis Hersh, MBA John Inadomi, MD George L. Jackson, Ph.D., MHA Michael Kelley, MD Laura Kochevar, Ph.D. Nancy Koets, PsyD Odette Levesque, RN, MBA Heidi L. Martin, MPH Irrma McCaffrey, BA Peter Mills, Ph.D., MS Jeffrey Murawsky, MD RimaAnn O. Nelson, RN, MPH Dede Ordin, MD, MPH Renee Parlier, RN, MPA George Ponte, Ph.D. Adam A. Powell, Ph.D., MBA Dawn T. Provenzale, MD, MS James Schlosser, MD, MBA Leah L. Zullig, MPH

66 CCQMS Financial Support –HSR&D grant (CRT 05-338) –VA CDA (MRP 05-312) –NCI-VA IAAs (Y1-PC-8218-01; V246S-00054) –Sole Source Contract (HHSN261200800504P)

67 CCQMS Development Team
Durham VAMC, Center for Health Services Research in Primary Care (HSR&D)
Dawn T. Provenzale, MD, MS – Principal Investigator
George L. Jackson, PhD, MHA – Co-Investigator & Project Director
Leah L. Zullig, MPH – Project Manager
Bryan Paynter, BS – Lead Programmer
Radhika Khwaja, MD – Clinical Coordinator
Adam Powell, PhD, MBA – Evaluation Coordinator*
Yousuf Zafar, MD – Medical Oncologist
Ziad Gellad, MD, MPH – Gastroenterology Fellow
Melissa Garrett, MD – Gastroenterology Fellow
Natia Hamilton, BA – Research Assistant
Chris Newlin, MPH – Research Assistant
Michelle van Ryn, PhD – Co-Investigator (Survey Component)*
* Minneapolis VAMC, Center for Chronic Disease Outcomes Research (HSR&D)

68 Contact Information George Jackson - george.l.jackson@duke.edu Leah Zullig – leah.zullig@va.gov Adam Powell – adam.powell@va.gov Dede Ordin – diana.ordin@va.gov Dawn Provenzale – prov002@mc.duke.edu

