Beyond Compliance: Informative Reporting 2008 AmeriCorps Conference.

Presentation transcript:

Beyond Compliance: Informative Reporting 2008 AmeriCorps Conference

Objectives
- Find out how you're doing
- Identify reporting challenges
- Discuss informative reporting
- Review a sample report

Why Do Reporting?
- Track progress toward achieving your results
- Identify and address problems
- Determine whether changes are needed
- Determine which activities & processes are effective before committing resources to sustain them

How do you use your reports?

What happens from the very beginning of a project determines its eventual impact, long before a final report is issued

Program Stages and Reporting Progress

1st Period (90 days from start date): Initial Review
Address whether components are in place as anticipated/designed for each PMW. Review progress on what needs to be in place for the program design to work. Review and report progress or status as appropriate on:
a) Pre-service data preparation [appropriate instrument(s) in place, or access to existing data].
b) Process to select the high-need target population is in place.
c) Performance Measurement Plan (WBRS) for each instrument complete.
d) Status of the PM Plan: how & when pre-service data will be/has been collected, analyzed, & reported.
e) Report on pre-service data analysis [e.g., # assessed, # in high-need group, etc.], or the date it will be available.
f) Changes: staff, member type, member/staff/site training, dosage, targeted beneficiaries, partner commitments, sites, supervision, etc.
g) Analysis of the initial review process to determine readiness to meet results [outputs, outcomes].
h) Training & technical assistance.
i) Other issues of concern.

2nd Period (180 days from start date): Implementation Review
Report whether components are functioning as anticipated/designed for each PMW. Review progress on each PMW to confirm activities and processes are being implemented as designed. You should report:
a) PM Plan data collection & analysis during this period [pre, mid, and/or post].
b) Status & projection on output targets.
c) Status of established benchmarks for outcome targets.
d) Members' activities: have they changed, are they on course?
e) Status of member training: has it changed, is it on course?
f) Challenges or barriers that affected or may affect the ability to meet targets, such as changes in staffing, member configuration, training [member/staff/site], dosage, targeted beneficiaries, partner commitments, sites, supervision, etc.
g) Plan of action for identified challenges or barriers.
h) Training & technical assistance needed.

3rd Period: Implementation Review
Address results [outputs & outcomes]. Review progress on PMWs to confirm activities & processes are implemented as designed. At this point in the program there should be evidence of outcomes. You should report the 2nd Period items plus:
a) Successes and achievements within the targeted areas.
b) Program or intervention adjustments based on analysis of pre, mid, and/or post data.
c) Accuracy of instruments and data.
d) Member performance.
e) Training & technical assistance needed.

4th Period: Results Review
Report the final results [outputs and outcomes] of each PMW and of the overall program. At this point in the program there should be a final outcome data analysis. You should report:
a) Final results for the targets as stated in the PMW. Results should address the exact target.
b) Outcome targets include the percent of people or things that changed, what changed, and the amount of change (see the sketch after this table).
c) Highlights of the program partner meeting to discuss program outcomes and continuous improvement strategies.
d) Plan of action for identified challenges or barriers.
e) Training & technical assistance needed.
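The 4th Period outcome item above asks for the percent of people or things that changed and the amount of change. The following is a minimal sketch of that arithmetic, assuming hypothetical pre- and post-service assessment scores, an improvement threshold, and an outcome target; none of these names or figures come from an actual PMW.

```python
# Minimal sketch: summarize matched pre/post assessment data for an outcome report.
# All scores, thresholds, and targets are hypothetical illustrations.

pre_scores  = {"b01": 42, "b02": 55, "b03": 38, "b04": 61}   # pre-service scores by beneficiary
post_scores = {"b01": 58, "b02": 57, "b03": 52, "b04": 60}   # post-service scores

IMPROVEMENT_THRESHOLD = 10      # minimum gain counted as "changed"
OUTCOME_TARGET_PERCENT = 50     # hypothetical PMW target, e.g. "50% of beneficiaries improve"

assessed = [b for b in pre_scores if b in post_scores]        # only matched pre/post pairs
gains = {b: post_scores[b] - pre_scores[b] for b in assessed}
improved = [b for b, g in gains.items() if g >= IMPROVEMENT_THRESHOLD]

percent_improved = 100 * len(improved) / len(assessed)
average_gain = sum(gains[b] for b in improved) / len(improved) if improved else 0

print(f"{len(assessed)} beneficiaries assessed pre and post")
print(f"{len(improved)} ({percent_improved:.0f}%) improved by {IMPROVEMENT_THRESHOLD}+ points "
      f"(average gain {average_gain:.1f}); target was {OUTCOME_TARGET_PERCENT}%")
```

Reporting the count assessed, the percent that changed, and the size of the change addresses the exact target rather than dumping raw data.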

Initial Review (1st Period)
Are components in place and started? Is it starting as planned? Talk about:
- Selection, training, and placement for members, staff, & sites
- Amount & findings of pre-service data [& outcomes if known]
- Beneficiary selection
- Member enrollment
- Partners
- Problems to address

Implementation Review (2nd & 3rd Periods)
Are things being implemented as designed? Talk about progress or changes in:
- Activities
- Data collection & analysis [pre, mid, post]
- Outcome benchmarks
- Staffing and member configuration
- Beneficiaries
- Training [member/staff/site]
- Sites & partner commitments
Outcomes should be reported at this stage. Include your plan for addressing challenges or barriers.

Results Review (4th Period)
How did it all turn out?
- Report final results [outputs & outcomes] and overall program findings. Address the exact target.
- Talk about the partner meeting to discuss outcomes & continuous improvement strategies.

How’s reporting going for your program?

Typical Reporting Problems
1. Faulty logic on the PMW
- Unclear or unfocused results
- Activity doesn't produce the result
- Targets not specific [amount of change]
- Inappropriate measures

Typical Reporting Problems
2. Data issues
- Inadequate data collected: no baseline data, or it was collected too late; no data collected, or not enough
- Inappropriate instrument
- Data doesn't address the target

Typical Reporting Problems
3. What's actually reported
- Processes not assessed in time to avoid problems
- Data not analyzed and/or summarized: no discussion or breakout of the [pre] data; a data dump that is too much to decipher
- Results/findings not summarized: not concise, unfocused, no analysis

What Makes a Strong Report?
- Conveys the status of the program
- Reflects program planning and refining
- Based on a strong PMW / logic model
- Clear & focused on relevant information
- Paints a picture; says something
- Reflects a plan for data collection & reporting

A Data Plan Establishes:
- Due dates and timelines
- Adequate and sufficient data collection
- Timely data collection
- Who will aggregate & analyze the data, and when
- That you have a plan!
Data collection and analysis has to be seen as essential to program operation. A sketch of such a plan appears after this slide.
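The following is a minimal sketch of writing a data plan down as structured data so that due dates and responsibilities are explicit. The instruments, dates, and roles are hypothetical illustrations, not part of any actual PMW or reporting system.

```python
# Minimal sketch of a data plan: what is collected, by when, and who handles it.
# All instruments, dates, and roles are hypothetical.
from datetime import date

data_plan = [
    {"instrument": "pre-service reading assessment",  "collect_by": date(2008, 10, 15),
     "aggregated_by": "site supervisor", "analyzed_by": "program director"},
    {"instrument": "mid-year reading assessment",     "collect_by": date(2009, 2, 15),
     "aggregated_by": "site supervisor", "analyzed_by": "program director"},
    {"instrument": "post-service reading assessment", "collect_by": date(2009, 6, 15),
     "aggregated_by": "site supervisor", "analyzed_by": "program director"},
]

today = date(2009, 3, 1)  # hypothetical "today" for the example
for item in data_plan:
    status = "past due" if item["collect_by"] < today else "upcoming"
    print(f'{item["instrument"]}: due {item["collect_by"]}, '
          f'aggregated by {item["aggregated_by"]}, analyzed by {item["analyzed_by"]} [{status}]')
```

Even a simple listing like this makes it obvious before each reporting period which data should already be in hand.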

Areas of Concern
- Identify a high-need target group. Who really benefits from your service?
- How much service is needed to achieve the results? What's the dosage?
- The importance of periodic assessment
- The importance of being curious about the impact of your services
- Develop a data plan

Reporting Challenges to Avoid
1. Losing sight of the purpose of the report; it's not just a funding requirement
2. Uninformative reporting about the process
3. A report that doesn't really reflect the program
4. Weak or missing findings about results
5. Weak information about outcomes

Small Group
1. Review the sample report
2. Compare your report to the sample
3. Discuss how you might improve reporting:
   - Do you have a data plan?
   - Are reports informative to you?
   - Do you report on program promises?
4. Be prepared to report back to the large group

Assessment Sayings
- What gets measured gets done.
- If you can demonstrate results, you can win public support.
- If you can't recognize what's not working, you can't correct it.
Source: Osborne & Gaebler, Results-Oriented Government, 1992.

Tips
- Don't delay analysis of pre-service data
- Compare actual to anticipated (see the sketch below)
- Discuss the comparison. What does it mean?
- Interpret results so you know what to do
- Be sure you can really get the data
- Results should not be a surprise
- Say something!
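As a minimal sketch of "compare actual to anticipated," the example below assumes hypothetical output targets and year-to-date counts; neither comes from a real PMW, and the measure names are illustrative only.

```python
# Minimal sketch: compare actual performance to anticipated targets and flag gaps.
# Targets and actuals are hypothetical illustrations.

anticipated = {"students tutored": 200, "hours of service": 5000, "volunteers recruited": 75}
actual      = {"students tutored": 162, "hours of service": 5150, "volunteers recruited": 40}

for measure, target in anticipated.items():
    achieved = actual.get(measure, 0)
    pct = 100 * achieved / target
    note = "" if pct >= 100 else "  <-- behind target; discuss why and give a plan of action"
    print(f"{measure}: {achieved} of {target} ({pct:.0f}%){note}")
```

Running a comparison like this before each report is due means the results are never a surprise, and the narrative can focus on what the gap means and what you will do about it.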