Gathering Evidence of Impact: A Continued Conversation Jan Middendorf Cindy Shuman Office of Educational Innovation and Evaluation.


Using a logic model to describe your program
Inputs: program investments (what you invest)
Outputs:
– Activities: what you do
– Participation: with whom
Outcomes:
– Short-term: learning
– Medium-term: behavior change (what your clients do as a result)
– Long-term: impact; condition change

Four Levels of Evaluation (Kirkpatrick Model of Evaluation)
Level 1: Reaction
Level 2: Learning/Skill Building – short-term outcome; ask at the end of the meeting, lesson, workshop, etc.
Level 3: Transfer/Behavior Change – medium-term outcome; ask some time after, perhaps 6 months later
Level 4: Results/Impact – long-term outcome

Why Evaluate? Evaluation of Outcomes: How will you know what your clientele did as a result of your educational program? What is the evidence that they used the information?

4 main purposes of evaluation
– Improvement: to improve the program; to enhance quality; to manage more effectively and efficiently. The effort to enhance programs.
– Accountability: to assess merit or worth; to assess effects; to assess costs and benefits. The effort to make judgments about the value of a policy or program.
– Knowledge development: to gain new insights. The effort to add to the knowledge base about effective practice or to add to policy debate.
– Oversight and compliance: to assess the extent to which a program follows rules, regulations, mandates, or other formal expectations.
Source: Mark, M. M., Henry, G. T., & Julnes, G. (2000). Evaluation: An integrated framework for understanding, guiding and improving policies and programs. San Francisco: Jossey-Bass.

Indicators: Evidence of Achieving Outcomes If the outcome is achieved, how will you know it? What would it look like? What is the evidence? If I were a visitor, what would I see, hear, read, and/or smell that would tell me this “thing” exists?

Indicators – Evidence The information needed to answer your evaluation questions. Example: Did participating landowners or managers improve their land management practices? Evidence:
– # or % of acres managed according to guidelines
– # or quality of conservation plans implemented

Indicators – Evidence The information needed to answer your evaluation questions. Example: Did participants increase their ability to achieve financial self-sufficiency? Evidence:
– #, % who increased financial knowledge
– #, % who reduced debt
– #, % who established an emergency fund

Have the pets been fed today? How would you know that the animals have been fed? What is the evidence?

Let’s practice… What is the evidence of…
– High blood pressure?
– A clean neighborhood?
– A popular movie?
– A good carpenter?
– Learning at the workshop?
Would the evidence be different for young people vs. seniors, high- vs. low-income neighborhoods, rural vs. urban residents, or by ethnicity?

Evidence is often expressed as numbers or percentages (number of…, percent of…, ratio of…, incidence of…, proportion of…). However, not all evidence is numbers; qualitative evidence may be important. Remember, "Not everything that counts can be counted."
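As a small illustration, counts and percentages like “#, % who reduced debt” can be tallied directly from participant records. This sketch uses hypothetical data and field names; the `indicator` helper is not part of any evaluation toolkit, just an assumption for the example:

```python
# Hypothetical participant records from a post-program follow-up survey.
participants = [
    {"name": "A", "reduced_debt": True,  "emergency_fund": True},
    {"name": "B", "reduced_debt": False, "emergency_fund": True},
    {"name": "C", "reduced_debt": True,  "emergency_fund": False},
    {"name": "D", "reduced_debt": True,  "emergency_fund": True},
]

def indicator(records, key):
    """Return (count, percent) of records where `key` is True."""
    count = sum(1 for r in records if r[key])
    percent = 100.0 * count / len(records)
    return count, percent

print(indicator(participants, "reduced_debt"))    # (3, 75.0)
print(indicator(participants, "emergency_fund"))  # (3, 75.0)
```

The same tally works for any yes/no indicator; qualitative evidence (open-ended responses, observations) would be summarized separately rather than reduced to a number.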

Work on Evaluation Plans Develop a question or two to understand what clientele did differently as a result of your educational program. Develop a plan for how you will gather the evidence:
– Get names and phone numbers and call participants 6 months later
– Ask a third party what they see that is different
– Observe differences yourself
– Other?

How good is the indicator? A good indicator is tangible – you can “touch”/know the information in some way:
– See it (observation)
– Read it (surveys, records, etc.)
– Hear it (from individuals, others)
Tips: direct, specific, useful, practical, culturally appropriate, adequate, clearly defined

What is an indicator? An indicator is the specific information, or evidence, that represents the phenomenon you are asking about. Indicator of fire = smoke Indicator of academic achievement = grades

How good are your questions? Can the questions be answered given the program? Are the questions truly important? Will the questions provide new insights? Can the questions be answered given your resources and timeline? Have the concerns of key users been included?

Identify key evaluation questions Who wants to know what about this program?

Evaluation Questions Clarify your evaluation questions and make them specific. Distinguish what you need to know from what you would like to know, and prioritize. Check: Will answers to these questions provide important and useful information?

Components of a program
– Situation
– Resources (Inputs)
– Outputs: Activities, Participants
– Outcomes: chain of outcomes from short- to long-term
– External Factors and Assumptions

What is your purpose for evaluating? We are conducting an evaluation of _____ (program name) because ______ in order to __________. Example: We are conducting an evaluation of the Money Quest Program because we want to know to what extent youth who participate learn and use recommended money management skills, in order to report program outcomes to our funder.