Presentation transcript:

Metrics for 211 Centers and Systems
Policy Dialogue with Maribel Marin, Executive Director, 211 LA County – CAIRS President
29th I&R Annual Training and Education Conference, May 7, 2007

Policy Objectives
Identify a specific set of metrics that all 211 centers can measure and track.
Provide a clear definition of each metric so that comparisons are apples to apples.
Communicate a clear scope of 211 service to the public and to funders.
Enable regional, statewide, and national calibration, evaluation, and assessment of 211 services.

What is a metric and what is its value?
A metric is a measure of activity or performance that enables the assessment of outcomes. Metrics can help answer key questions about operational effectiveness:
Are long-term goals and objectives being achieved?
What does success look like?
How satisfied are callers with services?
How important is the service to the community?
How effective are managers and specialists?

What are the Benefits?
Enhanced decision making – goals can be set for desired results, results can be measured, and outcomes can be clearly articulated.
Improved internal accountability – more delegation and less micro-management when individuals are clear about responsibilities and expectations.
Meaningful goals and strategic objectives – tracking progress enables the evaluation of planning efforts and can help determine whether a plan is sound.
Source: Wayne Parker, Strategic Planning 101 – Why Measure Performance? Workstar Library, 2003

Why Metrics Matter for 211
Most 211s collect, or have the ability to collect, voluminous amounts of data through their Call Management Systems (CMS), Automated Call Distribution systems (ACDs), and/or their I&R software.
Much of this data is collected to comply with AIRS standards, particularly for agency accreditation, or for reporting to boards, funders, and contracts.
We have yet to realize the full value of this data for benchmarking (comparing performance against goals/mission and/or industry peers) or its potential for aggregation at regional, state, and national levels.

Why Metrics Matter for 211 (continued)
Through benchmarking and aggregation of data across the field, a story can be told about the value of 211 service as a social safety net for the entire nation, both day to day and during times of crisis and disaster.
The national 211 business plan clearly calls out the need to create a unified system through the development of industry standards, in order to avoid misuse of 211 (defined too broadly or too narrowly) and to make the total system sustainable.
Funding, funding, funding – aggregate data and demonstrable performance outcomes enable the pursuit of system-wide funding strategies.

AIRS Data Requirements: Reports and Measures
Service Requests
Referrals Provided
Service Gaps
Demographic Data: Zip Code, City, Age, Gender, Language, Target Population
First-Time/Repeat Caller
Follow-up
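
To make the slide above concrete, here is a minimal sketch of how these data points might be captured per call. The field names are illustrative only, not an AIRS-mandated schema.

```python
# Minimal per-call record covering the AIRS-style reports and measures listed above.
# Field names are illustrative, not an official AIRS schema.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CallRecord:
    service_requests: List[str]        # needs expressed by the caller
    referrals_provided: List[str]      # programs/agencies the caller was referred to
    unmet_needs: List[str]             # service gaps: needs with no available referral
    zip_code: Optional[str] = None
    city: Optional[str] = None
    age: Optional[int] = None
    gender: Optional[str] = None
    language: Optional[str] = None
    target_population: Optional[str] = None
    first_time_caller: Optional[bool] = None
    follow_up_requested: bool = False  # flagged for a follow-up contact
```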

AIRS Standards: National Reporting
Total Calls Answered/Handled
Services Requested

Commercial Call Center Metrics
Call center trade journals* consistently identify the following Key Performance Indicators (KPIs) as leading performance metrics or benchmarks:
Service Level (% of calls responded to within a specified timeframe, such as 80% in under 60 seconds)
Speed of Answer (time a caller waits in queue before live answer)
First-Call Resolution (% of callers helped with one call, not requiring a repeat call on the same issue)
Adherence to Schedule (actual vs. scheduled work by time of day and type of work – handling calls, attending meetings, coaching, breaks, etc.)
Forecasting Accuracy (measured on two levels: actual vs. forecasted call volume, for hiring/recruiting and for existing staff schedules)
Handle Time (includes talk time, hold time, and after-call work)
Customer Satisfaction
Cost per Call
Abandonment Rate (calls in queue disconnected by the caller)
* ICMI Call Center Magazine, Contact Professional, Benchmark Portal, Customer Operations Performance Center Inc. (COPC Inc.), incoming.com
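
As an illustration of how several of these KPIs are typically computed, the sketch below derives them from simple per-call records. The record fields (handled, wait_sec, and so on) are assumptions for the example, not the schema of any particular ACD or I&R package; the formulas follow the parenthetical definitions above.

```python
# Illustrative KPI calculations from per-call records exported by an ACD/CMS.
# Each call is a dict; the keys used here are assumptions for this sketch.

def service_level(calls, threshold_sec=60):
    """Percent of answered calls picked up within threshold_sec (e.g. 80% in 60 sec)."""
    answered = [c for c in calls if c["handled"]]
    within = sum(1 for c in answered if c["wait_sec"] <= threshold_sec)
    return 100.0 * within / len(answered) if answered else 0.0

def avg_speed_of_answer(calls):
    """Average seconds a caller waits in queue before a live answer."""
    answered = [c for c in calls if c["handled"]]
    return sum(c["wait_sec"] for c in answered) / len(answered) if answered else 0.0

def abandonment_rate(calls):
    """Percent of calls that entered the queue and were disconnected by the caller."""
    return 100.0 * sum(1 for c in calls if not c["handled"]) / len(calls) if calls else 0.0

def avg_handle_time(calls):
    """Average of talk + hold + after-call work time, for handled calls only."""
    answered = [c for c in calls if c["handled"]]
    total = sum(c["talk_sec"] + c["hold_sec"] + c["wrap_sec"] for c in answered)
    return total / len(answered) if answered else 0.0

def first_call_resolution(calls):
    """Percent of handled calls that did not generate a repeat call on the same issue."""
    answered = [c for c in calls if c["handled"]]
    resolved = sum(1 for c in answered if not c["repeat_of_prior_issue"])
    return 100.0 * resolved / len(answered) if answered else 0.0
```

Whether service level is computed against answered calls or against all offered calls (including abandons) is exactly the kind of definitional choice the standardization discussion later in this presentation addresses.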

Govt/Non-Profit Gold Standard

Benchmark Portal (2005), Govt & Non-Profit Industry Benchmark Report: Best-in-class call center performance

Metric | Avg | Best
Service level (80% of calls answered) | 62.8 sec | 21.3 sec
Avg Speed of Answer | 59.4 sec | 18.6 sec
Avg Handle Time | 7.1 min | 6.9 min
First Call Resolution | 49.1% | 65.3%
Abandonment Rate | 9.18% | 5.46%
Schedule Adherence | 70.1% | 73.3%
Cost per Call | $6.31 | $3.52

Benchmark Portal (2003), Govt & Non-Profit Industry Benchmark Report

Metric | Avg | Gold
Speed of Answer | 40 sec | 31 sec
Avg Call Length | 7.04 min | 4.44 min
% Very Satisfied | 52% | 75%

Best Practices Study on Customer Service – City of Los Angeles

Common 211 Metrics
Quantity of Service: Number of Calls Handled/Answered; Referrals Provided to Callers; Call Length; Service Gaps
Quality of Service: Speed of Answer/Service Level; Abandonment Rate/Dropped Calls; Follow-up Rate; Caller Satisfaction
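
The quality-of-service items overlap with the KPI sketch shown earlier; for the quantity-of-service side, a monthly roll-up might look like the following. The record keys are illustrative, assuming each record is a row exported from I&R software.

```python
# Illustrative monthly quantity-of-service roll-up from exported I&R records.
# Record keys are assumptions for this sketch, not a specific vendor's export format.
from collections import Counter

def quantity_of_service(records):
    handled = [r for r in records if r["handled"]]
    return {
        "calls_handled": len(handled),
        "referrals_provided": sum(len(r["referrals"]) for r in handled),
        "avg_call_length_min": (sum(r["call_length_min"] for r in handled) / len(handled))
                               if handled else 0.0,
        # service gaps: how often each unmet need came up, most common first
        "service_gaps": Counter(need for r in handled for need in r["unmet_needs"]),
    }
```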

211 Recommended Best Practices
Agency Accreditation (AIRS Standards compliance)
CIRS/CRS certification
24/7 service
Universal Access – cell, TTY, cable, web
Service provided by a live, trained I&R Specialist, without being required to leave a message or hang up and dial a separate number
Multilingual service
Call monitoring

211 LA County Metrics

Metric | Target | Actual (1st Qtr 2007)
Service Level/Speed of Answer | 80% in 60 sec | 92%
Abandonment Rate | < 10% | 3.5%
Satisfied with Services | 95% | 93%
Follow-up Rate (non-crisis 211) | 3 calls/CRA per month | 3+/CRA/mo
Average calls monitored | 2 calls/CRA per week | 2/CRA/wk
New programs/services added to database each year (FY 06-07) | 10% increase per year | 3.6%
Annual Survey Response Rate (June-July fiscal year; 1 qtr remaining in process) | 1st Mailing: 60%; 2nd Mailing: 20%; Phone Contact: 20% | 49.96%; 18.44%; 9%
Agency site visits per year | 50 | 20
% of eligible CRAs AIRS certified | 100% | 81%
Employee turnover rate | < 10% | 7.7%
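
A scorecard like this lends itself to a simple automated target-vs-actual check. The sketch below uses a few of the figures above as an example; the metric names, targets, and comparison directions are written out for illustration only and are not 211 LA County's actual tooling.

```python
# Illustrative target-vs-actual scorecard check; names and thresholds are examples.
TARGETS = {
    "service_level_pct":       (">=", 80.0),   # 80% of calls answered in 60 sec
    "abandonment_rate_pct":    ("<=", 10.0),
    "caller_satisfaction_pct": (">=", 95.0),
    "employee_turnover_pct":   ("<=", 10.0),
}

def scorecard(actuals):
    """Return {metric: (actual, target, met)} for every metric we have a figure for."""
    ops = {">=": lambda a, t: a >= t, "<=": lambda a, t: a <= t}
    return {name: (actuals[name], target, ops[op](actuals[name], target))
            for name, (op, target) in TARGETS.items() if name in actuals}

if __name__ == "__main__":
    # 1st Qtr 2007 figures from the table above
    print(scorecard({"service_level_pct": 92.0, "abandonment_rate_pct": 3.5,
                     "caller_satisfaction_pct": 93.0, "employee_turnover_pct": 7.7}))
```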

Standardizing 211 Metrics
Attempts to generate national reports on performance and service outcomes to support funding requests have been hampered by the lack of common definitions for basic measurements. Many 211 systems are attempting to bridge the differences through data standardization processes; a few are referenced here:
Texas I&R Network: Dr. Sherry Bame, Texas A&M University
United Ways of Ontario, Canada: Michael Milhoff, consultant
WIN 211 (Washington and Oregon): Karen Fisher, Associate Professor, University of Washington Information School (211 Outcomes Registry project)
IN 211 (Indiana): through its 211 operations manual development process
AIRS Accreditation Committee: through the 2007 revised standards, led by Faed Hendry, chair and Manager of Training and Outreach at Findhelp Information Services of Toronto

Challenges to Standardization
Limited data collection/reporting capability
Lack of common terms and definitions
Lack of standard call types/needs lists
Inconsistent measurement/reporting frequencies
Too much variation in data collection fields among I&R software systems
Varying data sources: I&R software vs. CMS/ACD
Too much data collection, not enough data analysis

Key Consistency Questions
How is a call defined?
Are demographics taken on the caller or the client?
Are multiple clients on one call counted as a single transaction? As a single call?
Is age data collected by number of years, age range/group, or birth date?
Is location data related to the caller or the client? Is it the location of the call or the location of residence?
Is data reported daily, monthly, quarterly, or annually?
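
One way to pin these answers down is to record them as an explicit reporting profile that travels with any shared figures, so two centers comparing numbers know they counted the same things. The sketch below is illustrative; every field name and option is invented for the example rather than drawn from an existing 211 standard.

```python
# Illustrative "reporting profile" capturing the definitional choices listed above.
# All field names and option strings are invented for this sketch.
from dataclasses import dataclass

@dataclass(frozen=True)
class ReportingProfile:
    call_unit: str                    # "phone_call" or "client_transaction"
    demographics_of: str              # "caller" or "client"
    multiple_clients_counted_as: str  # "one_call" or "one_per_client"
    age_captured_as: str              # "years", "range", or "birth_date"
    location_of: str                  # "caller" or "client"
    location_meaning: str             # "call_origin" or "residence"
    reporting_frequency: str          # "daily", "monthly", "quarterly", "annually"

# Example profile a center might publish alongside its metrics
EXAMPLE_PROFILE = ReportingProfile(
    call_unit="phone_call",
    demographics_of="client",
    multiple_clients_counted_as="one_call",
    age_captured_as="range",
    location_of="client",
    location_meaning="residence",
    reporting_frequency="monthly",
)
```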

Why Metrics Matter
1. What you don't measure doesn't count.
2. What gets measured gets done.
3. If you don't measure results, you can't tell success from failure.
4. If you can't see success, you can't reward it.
5. If you can't reward success, you're probably rewarding failure.
6. If you can't see success, you can't learn from it.
7. If you can't recognize failure, you can't learn from it.
8. If you can demonstrate results, you can win public support.
Sources: Aiming to Improve, Audit Commission; Reinventing Government, Osborne and Gaebler

Next Steps
Join the 211 North America Metrics Project.
Share findings of your region's work.
Weigh in on which metrics should be collected by 211s.
Voice your position on how the scope of 211 service should be defined.
Participate in discussions on: common definitions and terms; standard call types/needs lists; measurement/reporting frequencies; demographic data best practices.

Sources and References: Websites
Queue Tips
Customer Operations Performance Center Inc. (COPC Inc.) – a leading authority on operations management and performance improvement for buyers and providers of customer contact center services
The Interagency Working Group (IAWG) on U.S. Government-Sponsored International Exchanges and Training – created in 1997 to improve the coordination, efficiency, and effectiveness of United States Government-sponsored international exchanges and training
Section on 211 Standards

Sources and References: Reports
National 211 Benchmark Survey – conducted by 211 San Diego (in 211 standards section)
Devising, Implementing, and Evaluating Customer Service Initiatives (1/30/2007) – Office of Citizen Services & Communications, US GSA
Agency Experiences with Outcomes Measurement – United Way of America (2000)
WIN Performance Evaluation and Cost-Benefit Analysis of I&R Systems – University of Washington, Information School – Karen Fisher/Matt Saxton (2005)
National 211 Business Plan – AIRS/UWA (2002)
211 Data Entry & Database Coding for TIRN – Texas A&M University – Dr. Sherry I. Bame
211 Across California by 2010 – Business Plan: 211 California Partnership with Sadlon & Associates, Inc.
AIRS Standards for Professional Information and Referral, Version 5.1 (2006)
Operations Manual – Indiana 211 Partnership, Inc. – first approved 7/09/02; last revision 5/9/06
COPC-2000 CSP Gold Standard Release 4.1 (January 2007)

Contact Information
Maribel Marin, Executive Director, 211 LA County, (626)
Real People. Real Answers. Real Help.
INFORMATION AND REFERRAL FEDERATION OF LOS ANGELES COUNTY
Serving Los Angeles County since 1981