Building Evidence of Effectiveness

Building Evidence of Effectiveness
AmeriCorps Advantage: CaliforniaVolunteers Grantee Training Conference, July 2017
Dylan Moore, Program Officer, CaliforniaVolunteers

Learning Outcomes
- Describe and identify your program's place on the evidence continuum
- Understand the evaluation requirements specified by CNCS
- Develop and apply research questions as part of becoming a learning organization
- Understand the value of being a learning organization and how evaluation can strengthen your program

The Continuum
- Evidence at all stages is valuable; even negative findings may help to improve the quality of the program.
- The process is not always linear and may evolve or cycle back depending on factors such as staff turnover or community changes.

The Continuum
No Evidence, Pre-Preliminary, Preliminary, Moderate, Strong
- A majority of applicants for AmeriCorps grants fall under pre-preliminary evidence.
- 100% of applications backed by moderate or strong evidence have been funded in recent grant competitions, at both the state and CNCS levels.

Evidence at all stages is valuable; even negative findings may help to improve the quality of the program. The process is not always linear and may evolve or cycle back depending on factors such as staff turnover or community changes. The 100% statistic also reflects other aspects of those programs, such as capacity and/or experience.

The Continuum
- Your evidence base places you at a point along the continuum.
- Knowing where you are on the continuum helps you set appropriate evaluation goals.
- Moving along the continuum allows you to demonstrate the effectiveness of your program design and intervention.
- Advancing evaluation adds to the richness of your program's story.

While the continuum is used by CNCS to score applications, programs should not try to exaggerate their location. Knowing where your program stands and developing an evaluation plan are seen as highly valuable by CV and CNCS regardless of placement on the continuum. Programs with $500,000 or more in grants are required to work with an external evaluator. Advancing evaluation builds understanding of your program's strengths and weaknesses, informs process and intervention changes, and adds to the richness of your program's story and its value to the community.

Foundational Elements
- Becoming a Learning Organization
- Program Design and Implementation
- Strong Data Collection Systems
- Capacity and Responsibility
- Evaluation Planning

Becoming a Learning Organization
- Has a culture of inquiry
- Uses data and evidence to inform program growth and change
- Follows a plan for evaluation

Becoming a learning organization is an important part of setting up a successful evaluation process. While an evaluation asks specific questions about the program, becoming a learning organization allows you to sustain a holistic style of inquiry across your program. Everything your organization does should either be evidence-informed or be set up in a way that allows for future evaluation. This involves setting up systems, having conversations with all staff, and keeping your eyes and ears open for valuable opportunities for evaluation and data collection.

Program Design and Implementation
- Lean on your logic model
- Refer to your theory of change
- Conduct a process evaluation
- Amend and adjust your program design as needed

Logic model: Your logic model should provide a clear outline of your program's design and implementation. It should be clear enough to be easily communicated to and understood by a layperson, to the point where your design would be replicable. Adhere closely to your logic model when making changes to program implementation. Your logic model will also help you understand your staff capacity and responsibilities.

Theory of change: Your theory of change is the basis for your program design; it should also be the basis for your evaluation. Theories of change articulate what change is happening and how. By looking at your theory of change, you can determine which part of the process you want to develop a research question around.

Process evaluation: It is a good idea to conduct a process evaluation prior to an evaluation of your intervention. This allows you to determine where your program's strengths and weaknesses are and where changes should occur.

Program design: Don't be afraid to make adjustments as needed. While you want to maintain the integrity of the program design for which you were awarded a grant, if you need to make adjustments to strengthen the quality and effectiveness of that design, you should do so. Some changes are more reasonable than others; do your best not to make too many changes too quickly, as that would impact your ability to evaluate your process and/or your intervention.

Data Collection Systems
- Identifying data
- Instruments
- Collection
- Management
- Analysis

Identifying data: What data are you collecting? What data do you already have?

Instruments: As much as possible, use existing, validated instruments. Poor instruments, or instruments that have not been thoroughly vetted and validated, are one of the most common and detrimental weaknesses in research and evaluation. It is incredibly challenging to develop a strong instrument without the time and capacity to thoroughly vet it. Use CNCS resources, external evaluators, professors or other academics, and/or experts in your field to determine which instrument will be best for your data collection. Use your performance measures as a tool when planning your instruments and collection systems.

Collection: Fidelity to the data collection process is key. Train your evaluators to collect the data using the exact same process and model across time, sites, and beneficiaries.

Management: Don't do a disservice to your hard work by mismanaging data. Ensure that you collect and store data according to specific time periods. A strong system for managing your data will prevent double counting or loss of data and make it easier to report data during the evaluation and in progress reports to CV.

Analysis: Think about how you will analyze the data before beginning an evaluation; analysis should be considered when developing your research question. If you can't analyze the data in a way that allows you to answer your research question, the evaluation becomes a useless exercise. Exercise caution when analyzing data: depending on the type of analysis desired, it may be better to rely on an external evaluator. Analysis can be the most complicated piece of data collection, and it is easy to make mistakes that lead to faulty or misleading conclusions. Without a background in research or evaluation, stick to descriptive statistics. As you move further along the continuum, the analysis will become more sophisticated and an external evaluation may become a necessity.
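The notes above recommend a data-management system that prevents double counting and, absent research expertise, limiting analysis to descriptive statistics. As a rough illustration only (not part of the original presentation), the following Python/pandas sketch deduplicates survey records by participant and collection period and then summarizes scores; the column names, IDs, and values are hypothetical placeholders.

```python
# Minimal sketch: deduplicate survey records, then report descriptive statistics.
# All column names and rows below are illustrative assumptions, not real program data.
import pandas as pd

# Placeholder records standing in for data exported from a survey instrument.
records = pd.DataFrame({
    "participant_id": ["A01", "A01", "A02", "A03", "A03"],
    "collection_period": ["pre", "pre", "pre", "pre", "post"],
    "score": [12, 12, 15, 9, 14],
})

# Guard against double counting: keep one record per participant per period.
deduped = records.drop_duplicates(subset=["participant_id", "collection_period"])

# Descriptive statistics (count, mean, spread) by collection period.
summary = deduped.groupby("collection_period")["score"].describe()
print(summary)
```

A simple routine like this, run on the same schedule as each collection period, is one way to keep reporting consistent without moving beyond descriptive statistics.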

Staff Capacity and Responsibilities
- Start from the beginning
- Staff training
- External experts

Assigning staff roles and responsibilities from the beginning is key to a smooth evaluation process. Don't wait until evaluation implementation rolls around; begin developing the skills and knowledge needed at the start of the grant. There is no shame in recognizing that you need to reach outside your organization for help. There are a number of resources that can be used, and it is important to budget for them from the beginning. In-kind match in the form of consultation services is one way to lean on community resources while strengthening your evaluation capacity.

Evaluation Planning
Evaluation planning consists of the following elements:
- Introduction
- Program background
- Research questions
- Evaluation design
- Data collection
- Analysis plan
- Timeline
- Budget

While this outline follows the formal plan format required by CNCS, using it can help you develop a well-thought-out evaluation plan that can be communicated clearly and effectively to relevant partners. Thinking about the other elements we've discussed today, especially the logic model, theory of change, and research questions, will help you create a strong evaluation plan.

Q&A
Let's use this time to go over any additional questions you might have, as well as to share any valuable insight or experience you want to offer.