

NASCSP March 1, 2012

Guidance About Evidence

"Success should be judged by results, and data is a powerful tool to determine results. We can't ignore facts. We can't ignore data." (President Barack Obama, July 24, 2009)

"The test of a performance management system is whether it's actually used…. Federal managers and employees at all levels must use performance goals and measures to set priorities, monitor progress, and diagnose problems." (Chief Performance Officer Jeffrey Zients, October 29, 2009)

Two key points:
Results are supported by data.
Data from performance measurement systems must be put to use.

In recent OCS guidance, several models and references to evidence have been identified. Here is a quick review of these terms.

Approaches to prevention or treatment that are validated by some form of documented scientific evidence. This could be findings established through scientific research, such as controlled clinical studies, or other comparable and rigorous methods. This designation identifies proven effectiveness, supported by objective and comprehensive research and evaluation.

Approaches that use the best available research and practice knowledge to guide program design and implementation within context. This informed practice allows for innovation and incorporates the lessons learned from the existing research literature.

An innovative and consistently applied policy, process, practice, or procedure that takes a comprehensive approach to developing and implementing activities, using strategies that are related to the intended service recipients and community. This practice model is culturally competent, data-driven, measurable, and replicable, and incorporates a method for documenting programmatic results.

A practice with at least preliminary evidence of effectiveness in small-scale interventions, or for which there is potential for generating data that will be useful for making decisions about taking the intervention to scale and replicating the results in diverse populations and settings.

A program, activity, or strategy that has been shown to work effectively and produce successful outcomes. This is supported to some degree by subjective and objective data sources.

What do we know about our measurement tools? Are they:
Accessible
Reliable
Valid (appropriate)
Affordable
Scalable (able to measure change)

Bank accounts
Copy of diploma or certificate
Employment records
Escrow accounts
Financial reports
Health or nutrition records
Inspection results
Lease agreements
Legal documents
Loan monitoring reports
Mortgage documents
Observation logs
Pre-/post-tests
Progress reports
Questionnaires
Rent receipts
Scales and matrices
Surveys
Testing results

Agency database
Case notes
Centralized database
Computer spreadsheets
File cabinets
Individual case records
Manual tallies
Public database
School records
Specialized database
Tax assessor database
Training center
Work plan reports

If no one is assigned the task, no one will do it.
Consider various agency staff responsibilities.
Include partner reports.

Home visits
Office appointments
Phone follow-up
Review of case notes/progress reports (scales?)

Daily
Weekly
Monthly
Quarterly
Biannually
Annually
Upon incident


Data about community needs and resources
This can identify the scope of community issues.
Used to prioritize community needs for agency intervention.

Data used during strategic planning
Identify resources ($, facilities, staff, etc.) the agency has or needs.
What results does the agency expect to achieve?

Data collected during implementation of services
How many individuals, families, and communities are projected to be served? How many are projected to achieve results?
How many individuals, families, and communities were actually served? Who were they?

Data collected to identify achievement of results
How many made movement toward their goals?
How many achieved the end results previously identified?

Evaluation Data
Used in comparing CAA performance with a control group to determine the quality of the results.
Done by an independent reviewer.
Used to validate that results actually have an impact on identified community needs.

(Slide graphic omitted.) Source: Child Trends

The collection of NPI data for a report to OCS is a comprehensive effort by the CSBG Network to identify the results of the efforts of local CAAs. This is sometimes called the "ROMA report," but we have just seen that ROMA data includes more than these indicators.

Measure and report performance
Identify patterns and relationships
Consider new actions based on analysis
Create new outcome-focused goals
Develop new resources

When you have your data, remember that you have to turn them into information; they must be interpreted to become evidence. Consider this piece of data and the questions that arise: "35 individuals got a job." Is that good? What is the unemployment rate? What is the opportunity for employment? What are the characteristics of those who got jobs? And so on.
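The slide's "35 individuals got a job" example can be made concrete with a small, hypothetical calculation. Only the 35 comes from the slide; the enrollment figure, the local unemployment rate, and the function name are invented for illustration.

```python
# Hypothetical sketch: the same raw count reads very differently once it
# is placed in context. Only "35 placed" comes from the slide; enrollment
# and the local unemployment rate below are assumed for illustration.

def employment_outcome_rate(placed: int, enrolled: int) -> float:
    """Share of enrolled participants who obtained employment."""
    return placed / enrolled

placed = 35                 # reported outcome (from the slide's example)
enrolled = 50               # assumed program enrollment
local_unemployment = 0.12   # assumed local unemployment rate

rate = employment_outcome_rate(placed, enrolled)
print(f"Placement rate: {rate:.0%}")                    # prints "Placement rate: 70%"
print(f"Local unemployment: {local_unemployment:.0%}")  # prints "Local unemployment: 12%"
```

Even with this context, whether 70% is "good" still depends on the questions the slide raises: local labor-market conditions, who was served, and what comparable programs achieve.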

What are the top 10 services provided by your agency?
What are the top 10 outcomes achieved by your customers?
What is the connection between these two lists?

Build on ROMA principles to identify ways to collect credible evidence in systematic ways (making the connections among mission, community need, agency strategies, and well-documented results).
Find ways to compare CAA performance with established standards and the performance of other similar programs.

"We must recognize the value of evidence-based or evidence-informed interventions… but our definition of what counts as credible evidence should be expanded to allow for continuing improvement and innovation. Evidence-based does not have to mean experimental-based. We draw on evidence from many kinds of research, including program evaluations, and practice." (From Lisbeth Schorr)

Lisbeth Schorr tells us to identify the clear, measurable results sought by a complex intervention as the essential first step toward both successful implementation and successful evaluation. We have a start on this with the NPIs.

It is essential to have some way of comparing results to establish that the observed change has a high probability of resulting from the practices. Consider how you will compare results from your programs with results from those who have not been involved in your programs. The community-specific nature of place-based interventions makes it very hard to find a comparison group that would allow for a clinical control group.

It is possible to compare outcomes among the populations served by a specific initiative to (a) similar populations in the geographic area before the intervention began, for whom baseline data are available; (b) current populations that did not receive similar services and supports, but for whom data are already available (i.e., data that do not have to be collected as part of the evaluation); and (c) national, state, or local norms.
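The three comparison strategies above can be sketched as a short, hypothetical calculation. Every count and the state norm below are invented for illustration; none come from the presentation.

```python
# Hypothetical comparison of an initiative's outcome rate against the
# three benchmarks listed above: (a) the same area before the intervention,
# (b) a similar current population that was not served, and (c) a published
# state norm. All figures are invented.

def outcome_rate(achieved: int, total: int) -> float:
    """Fraction of a population that achieved the measured outcome."""
    return achieved / total

served_rate = outcome_rate(42, 60)      # population served by the initiative
baseline_rate = outcome_rate(30, 60)    # (a) same area, before the intervention
comparison_rate = outcome_rate(33, 60)  # (b) similar current population, not served
state_norm = 0.55                       # (c) published state norm (assumed)

for label, r in [("Served", served_rate), ("Baseline (a)", baseline_rate),
                 ("Comparison (b)", comparison_rate), ("State norm (c)", state_norm)]:
    print(f"{label}: {r:.0%}")
```

The point is not the arithmetic but the framing: an outcome rate only becomes evidence when it is set against one or more of these benchmarks.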

The elements of ROMA principles and practices can help agencies establish evaluation frameworks that include the evidence that will be used to support their results.
Establish CSBG industry standards.
Remember what baseball did that we haven't yet done:

Clearly identified the indicators to be measured
Collected data consistently
Analyzed the data against records of wins
Published the data and the analysis, so the public recognizes success

Support knowledge collection, analyses, and evidence syntheses that yield a more complete body of evidence.
Develop network tools and capacities to gather knowledge at greater scale.
Expand the menu of available evaluative techniques that can be matched to different types of interventions.
Combine findings from research, theory, and practice for informed decision-making.
Promote use of a results framework to strengthen measurement for accountability and learning.

Real-time learning can be achieved by using a well-developed results framework to track progress toward intended results and then using that data to adjust course as the work proceeds.

As Peter Drucker has pointed out, "The greatest danger in times of turbulence is not the turbulence; it is to act with yesterday's logic." We do not want to rely on fads, hunches, anecdotes, or good intentions. Nor should we be reluctant to identify and end support for efforts that are ineffective.

Barbara Mooney National ROMA Training Project 243 E. High St. Waynesburg, PA