1 Linking Performance Measures to Benchmarks in the Budget Process March-April 2002 Department of Administrative Services Oregon Progress Board www.econ.state.or.us/opb.


1 Linking Performance Measures to Benchmarks in the Budget Process March-April 2002 Department of Administrative Services Oregon Progress Board

2 Overview Why measure performance? Why Oregon Benchmarks? What makes a good performance measure? What is required in the budget process? Getting started

3 Handouts Logic models –Logic model worksheets (yellow) –Logic model examples (ochre) Submission forms –Links to Oregon Benchmarks (blue) –Performance Measure Data Summary (green) Evaluation forms –PM criteria worksheet (off-white) –Today’s training evaluations (purple)

4 Why measure performance?
It's at the core of results-based management.
–Provides greater accountability: Is the ship on course?
–Fosters internal learning and improvement: Is the ship running well?
AND it has been required — see Appendix B.

5 Why link to Oregon Benchmarks?
–They articulate Oregon's hopes and expectations.
–They are "high-level outcomes": measures of societal well-being.
–They are beacons for the "ship" and the "fleet".
–For the budget, link only to those that relate to your core mission and goals ("primary linkages").

6 Oregon has ninety benchmarks in seven categories: Economy, Education, Civic Engagement, Social Support, Public Safety, Community Development, and Environment.

7 What happens if your agency does not link to an Oregon Benchmark? That's OK. You have two options:
–You may submit other high-level outcomes to gauge how Oregon is doing relative to your mission.
–Small agencies: if this is not feasible, you can "look up" to your mission and/or mandate.
All high-level outcomes should pass the "so what" test: Do Oregonians care?

8 Logic models define the links.
Agency Inputs and Activities → Output Measures → Intermediate Outcome Measures → High-level outcome(s) (measurable) → Goal (generally unmeasurable)
The measures in the middle of the chain are the agency's performance measures; the arrows represent impact.

9 A logic model embeds a continuum of measures in a "so that" chain.
(Increase) % of offenders with intake assessments [Output]
…so that…
% of offenders engaged in work, training, education and/or treatment is increased [Intermediate Outcome]
…so that…
% of offenders showing a measurable improvement in behavior and/or skill level is increased [Intermediate Outcome]
…so that…
% of paroled offenders convicted of a new felony within three years is decreased [High-Level Outcome, Benchmark #61]
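The "so that" chain can be modeled as a linked list of measures, each pointing to the higher-level measure it supports. This is a minimal illustrative sketch — the `Measure` class and helper names are assumptions for teaching purposes, not an OPB or DAS tool:

```python
# Sketch of a "so that" chain: each measure links upward to the
# measure it supports, ending at the high-level outcome.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Measure:
    description: str
    kind: str                            # "output", "intermediate outcome", "high-level outcome"
    so_that: Optional["Measure"] = None  # the higher measure this one supports

# Build the offender-services chain from the slide, top-down.
hlo = Measure("% of paroled offenders convicted of a new felony within three years",
              "high-level outcome")
io2 = Measure("% of offenders showing measurable improvement in behavior/skills",
              "intermediate outcome", so_that=hlo)
io1 = Measure("% of offenders engaged in work, training, education and/or treatment",
              "intermediate outcome", so_that=io2)
out = Measure("% of offenders with intake assessments", "output", so_that=io1)

def chain(measure: Measure) -> list:
    """Walk the 'so that' links from an output up to the high-level outcome."""
    steps = []
    node = measure
    while node is not None:
        steps.append(node.kind)
        node = node.so_that
    return steps

print(chain(out))
# → ['output', 'intermediate outcome', 'intermediate outcome', 'high-level outcome']
```

A well-formed chain always ends at exactly one high-level outcome, mirroring the slide's diagram.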

10 What makes a good performance measure? BASIC criteria (required). Performance measures should:
1. Use GASB* terms and definitions
2. Gauge progress towards agency goals and benchmarks or other high-level outcomes
3. Focus on a few key indicators
4. Have targets
5. Be based on accurate and reliable data
*Governmental Accounting Standards Board

11 Basic criteria #1. Use GASB definitions
OUTCOME = Result (the best kind of measure)
–High-level (societal): OBM #11, per capita income
–Intermediate: average wage of agency job placements
OUTPUT = Product or service ("widget")
–# of job placements per quarter
INPUT = Time, money, material or demand
–FTEs in the "Job Placement Unit"
–Dollars allocated to the "Job Placement Unit"
–Case load or number of complaints
–INPUTS ARE NOT STAND-ALONE PERFORMANCE MEASURES
EFFICIENCY = Input per output
–# of days required to process a job application
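The GASB definitions above lend themselves to two quick checks: efficiency is simply input divided by output, and inputs are not valid stand-alone performance measures. A hedged sketch (the helper names are assumptions, not DAS terminology):

```python
# Illustrative helpers based on the slide's GASB measure types.

# Per the slide, inputs are NOT stand-alone performance measures.
VALID_STANDALONE = {"outcome", "output", "efficiency"}

def is_valid_performance_measure(kind: str) -> bool:
    """Return True if this measure type may stand alone in a budget submission."""
    return kind.lower() in VALID_STANDALONE

def efficiency(input_units: float, outputs: int) -> float:
    """EFFICIENCY = input per output, e.g. staff-days per job application."""
    if outputs == 0:
        raise ValueError("no outputs recorded")
    return input_units / outputs

# Hypothetical example: 120 staff-days spent processing 60 job applications.
print(efficiency(120, 60))           # → 2.0 days per application
print(is_valid_performance_measure("input"))  # → False
```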

12 Two kinds of intermediate outcomes: chunks and stepping stones
EXAMPLE: Benchmark #18, Ready to Learn
1. A "chunk" of the population is measured for the high-level outcome (HLO): % of children of served families who are ready to learn (versus % of all children in the county who are ready to learn).
2. A "stepping stone" toward the HLO is measured: % of trained parents who read regularly to their children (reading to kids is a stepping stone to being ready to learn).

13 Basic criteria #2. Measure progress towards agency goals and benchmarks
The goal "reduce repeat offenders" is UNMEASURABLE; MEASURES gauge progress:
# of intake assessments completed [Output] → % of offenders engaged in work, training, education and/or treatment [Intermediate Outcome] → % of offenders showing a measurable improvement in behavior and/or skill level [Intermediate Outcome] → % of paroled offenders convicted of a new felony within three years [High-Level Outcome, Benchmark #64]

14 Basic criteria #3. Focus on a few key measures.
–Represent the scope of agency responsibility.
–Number no more than 30 (except for mega-agencies).
–Include the best measures for "Is the ship on course?" and "Is the ship running well?"
Additional measures internal to your agency can provide more detailed management information.

15 Agencies should decide how "high up" to go for their key measures.
Consider the level of agency INFLUENCE: agency influence is greatest toward the output end of the chain; policy intent is greatest toward the high-level outcome.
# of intake assessments completed [Output] → % of offenders engaged in work, training, education and/or treatment [Intermediate Outcome] → % of offenders showing a measurable improvement in behavior and/or skill level [Intermediate Outcome] → % of paroled offenders convicted of a new felony within three years [High-Level Outcome, Benchmark #64]

16 Basic criteria #4. Performance measures should have targets.
–TARGET = desired level at any given point in time.
–Targets should be ambitious but realistic.
–Target setting is an art and a science based on trend data, comparisons, and expert opinion.
–Targets are not required until Jan.
(Chart: recidivism now vs. recidivism TARGET.)
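One of the "science" ingredients named above is trend data. A minimal sketch of projecting a trend to inform a target — the least-squares approach and the recidivism numbers are illustrative assumptions, not an OPB method:

```python
# Illustrative target-setting aid: fit a least-squares line to historical
# values and project it forward as a starting point for a target.

def linear_trend_target(history, periods_ahead):
    """Fit a line to (period, value) history and project `periods_ahead` forward."""
    n = len(history)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + periods_ahead)

# Hypothetical recidivism rates (%), trending down one point per period.
history = [30.0, 29.0, 28.0, 27.0]
print(linear_trend_target(history, 2))  # → 25.0
```

Comparisons with peer states and expert opinion would then adjust the projection up or down to keep the target "ambitious but realistic."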

17 Basic criteria #5. Accurate and reliable data.
–Without trustworthy data, the system is meaningless. Example: verifiable employment records are better than estimated job creation.
–Each measure should have at least one data point, preferably several.
–Data should describe what is being measured.
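The "at least one data point, preferably several" rule is easy to automate when screening submissions. A small illustrative check (the function and its labels are assumptions, not a DAS tool):

```python
# Illustrative screen for basic criterion #5: flag measures with too little data.

def data_quality_note(data_points: list) -> str:
    """Classify a measure's data history per the slide's guidance."""
    if not data_points:
        return "unusable: no data"
    if len(data_points) == 1:
        return "minimal: one data point"
    return "good: multiple data points"

print(data_quality_note([55, 62]))  # → good: multiple data points
print(data_quality_note([]))        # → unusable: no data
```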

18 Performance measure criteria: ADVANCED (required for the next biennium). Performance measures should:
6. Link to an organizational unit
7. Cover organizational outcomes like efficiency and customer satisfaction
8. Allow comparisons
More training on Advanced Criteria later.

19 Budget Timeline for Performance Measures
–Budget Instructions; TA & Training on Performance Measures
–Submit Links to Oregon Benchmarks (March – August 2002)
–Criteria-based review (April – Aug. 2002)
–Adjustments, optional (April – August 2002)
–Comments & measures accompany Governor's Recommended Budget (November 2002)
–Performance Measure Data Summary to Ways & Means (January – June 2003)
–Agencies adjust measures and targets per legislature (June 2003)
–Annual Performance Reports submitted to DAS/LFO (annually in September)
See Guidelines pp. 10 & 11.

20 Hypothetical example #1
GOAL: Reduce juvenile crime.
HLO (impact): Juvenile arrests (OBM #61).
AGENCY INPUT/ACTIVITY: Award grants to local contractors to conduct "best practice" juvenile crime prevention (JCP) programs.
Agency performance measures:
–OUTPUTS: # grants awarded by county; # days of TA delivered by county.
–INTERMEDIATE OUTCOMES: % of juveniles in JCP programs with significantly mitigated risk factors.

21 Hypothetical example #2
GOAL: Healthy, thriving children.
HLO (impact): % of kindergarteners ready to learn (OBM #18).
AGENCY INPUT/ACTIVITY: Award grants to local contractors to design/deliver "best practice" parent education classes.
Agency performance measures:
–OUTPUTS: # grants awarded by county; "best practice" guidelines done by
–INTERMEDIATE OUTCOMES: % of children from participating (trained) families entering school ready to learn.

22 Hypothetical example #3
GOAL: Citizen involvement (C.I.) in land use planning.
HLO (impact): % of cities with neighborhood organizations.
AGENCY INPUT/ACTIVITY: Jointly sponsor, with cities, quarterly regional educational events for private citizens.
Agency performance measures:
–OUTPUTS: # citizens trained; # C.I. guidelines distributed.
–INTERMEDIATE OUTCOMES: % of participating citizens with improved understanding; customer satisfaction ratings.

23 Links to Oregon Benchmarks Form (example)
Pertinent Benchmark or High-level Outcome(s): HLO 1 - Percent of cities with active neighborhood organizations.
| Agency Goal | OBM# / HLO# | Key Performance Measure | PM # | PM Since | New or Mod.? | 2000 Value | 2005 Target | Lead Division or Unit (Optional) |
| Citizen involvement in land use planning | 1 | Percent of participants with improved understanding | Ag# | | New | 55% | 70% | Communications |

24 Performance Measure Data Summary (for Ways and Means)
One row per measure, numbered Agency # - 1 through Agency # - 8; each row gives the performance measure definition, data, and targets.
Example row: Percent of participants with improved understanding — 55%, 62%, 70%, 60%, 65% (data and targets as shown on the form).
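Agencies assembling this summary could generate it from their own records. An illustrative sketch producing a CSV row — the field names are assumptions loosely based on the form's columns, not the official layout:

```python
# Illustrative generation of a Performance Measure Data Summary as CSV.
import csv
import io

fields = ["measure", "data_2000", "target_2005"]
rows = [
    {"measure": "Percent of participants with improved understanding",
     "data_2000": "55%", "target_2005": "70%"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fields)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

The same structure extends to all eight agency measures by appending one dict per measure to `rows`.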

25 Helpful websites
–Governmental Accounting Standards Board: GASB home page
–National Center for Public Productivity, Rutgers: A Brief Guide to Performance Measurement in Local Government (1997)
–John F. Kennedy School of Government, Harvard: An Open Memorandum to Government Executives - Get Results Through Performance Management (2001)

26 Additional resources
Books and reports:
–Measuring Up, Jonathan Walters (1998)
–The Reinventor's Fieldbook, David Osborne and Peter Plastrik, Chapter 7 (2000)
–Making Results-Based State Government Work, The Urban Institute (2001)
Oregon Progress Board:
–Technical assistance
–Training
–Strategic planning

27 DAS/Oregon Progress Board contacts
–George Dunford, Performance Measure Manager, DAS (503)
–Jeffrey L. Tryens, Executive Director, Progress Board (503)
–Rita Conrad, Senior Policy Analyst, Progress Board (503)