OBE: Outcomes-Based Evaluation ALA Midwinter Meeting January 24th, 2014 Presented by Kit Keller

Workshop goals
• Provide an understanding of outcomes-based planning and evaluation so grantees may refine Smart investing@your library® evaluation and marketing plans.
• Set the stage for effective program implementation.

Workshop schedule
Part 1 – What is outcomes-based evaluation?
Part 2 – What data should I collect? When?

What is outcomes-based evaluation?
• What are outcomes?
• Will I know it when I see it?
• How can I track them?

What are outcomes?
Outcomes are benefits to people.
Outcomes are changes in skills, knowledge, attitude, behavior, condition, or life status.

Like what?
• Learned how to read a credit report. [Knowledge]
• Learned how to compare mortgage loan offers. [Knowledge and Skill]
• Can balance my checkbook. [Skill]
• Know how to safely pay bills online. [Skill]
• Feel in control of finances. [Attitude]

What else?
• Refinanced mortgage based on content of presentation. [Behavior]
• Established a personal savings account. [Behavior]
• Paid off a credit card balance in full. [Behavior]
• Deposited money in a retirement account. [Behavior]
Behavior changes are the BEST!

Behavior changes … lead to changes in life status.

Why measure outcomes?
• See if programs really make a difference in the lives of people.
• Improve programs.
• Improve planning.
• Improve accountability.
• Ensure best use of funds.
• Demonstrate impact.
• Satisfy funders.

What can OBE do?
• Improve program effectiveness.
• Demonstrate success.
• Facilitate program adjustment.
• Inform future project planners.

Process and impact
• Evaluating the process allows you to see how efficient your program is.
• Evaluating the impact allows you to know whether your program is making a difference.

Grant program elements
• Inputs: resources used by the program
• Activities: actions of the program
• Outputs: counts of activities
• Outcomes: measures of participant change

Examples of inputs, activities, services, and outputs
• Inputs: Resources dedicated to or consumed by the program. Examples: staff, time, computers, facilities, materials, money, consultants, website, software, Internet, instructors.
• Activities: Program actions that are management related. Examples: recruiting, coordinating, promoting, purchasing, scheduling, and evaluating activities.
• Services: Program actions that directly involve end users. Examples: conducting workshops, mentoring, online offerings, following up with customers.
• Outputs: Numbers of direct program products. Examples: number of participants served; materials developed and used; workshops offered; website usage counts.

Outputs and outcomes
• Outputs are not outcomes.
• Outputs tell us how much we’ve done.
• Outcomes tell us how much difference we’ve made.
• Outcomes do not replace outputs; they complement them.
• Both are important to provide a full picture of a program’s results.

Programs and services
• Workshops
• Classes
• Marketing
• Collection development
• Partnership development

Smart investing@your library® program goals
Community members will:
• view the library as a reliable place for unbiased financial and investment information. [Attitude]
• make increased use of library programs and resources. [Behavior]
• be more knowledgeable about key financial and investment issues. [Knowledge]

Individual project goals – examples
• Increase access to financial literacy materials.
• Improve reference skills of staff in areas of finance and financial literacy.
• Increase investment knowledge of target audience.

Project outputs – examples
• 250 participants attended 5 workshops
• 45 participants attended counseling sessions
• 1,200 children participated in kick-off program
• Purchased 38 eBooks and 41 audiobooks
• In 4 months each new book was borrowed an average of 8.78 times
• 2,600 brochures were distributed

Outputs vs. outcomes
Outputs:
• Three programs held
• Website developed
• Print, electronic materials increased 10%
• 100 PSAs run
Outcomes:
• Participants know about several types of investment vehicles.
• Patrons regularly use investment website to help with personal finance decisions.
• Patrons use materials from investment collection to inform decision-making.
• Participants report PSAs motivated attendance.

Outcomes defined
Outcome: A target audience condition changed or improved – a change in skills, attitudes, knowledge, behaviors, status, or life condition brought about (partly or wholly) by experiencing a program.
• Knowledge – what someone knows. Example: Participants will increase their understanding of credit scores.
• Skill – what someone can do. Example: Participants will create a household budget.
• Attitude – what someone feels or thinks about something. Example: Workshop attendees are interested in learning more about controlling their own finances.
• Behavior – how someone acts. Example: High school students research options for college scholarships and other funding sources.
• Status – someone’s social or professional condition. Example: More high school seniors will apply for scholarships.
• Life condition – someone’s financial condition. Example: Rate of foreclosures will drop as a result of more financially informed community members.

Get ready
Smart investing goals at the library level:
• Increased requests for investment materials
• Increased visits to library website
• Increased awareness of library resources
• Increased staff competencies

Sample outcome statements
• All reference staff can use key financial and investment resources.
• Staff coordinate financial literacy training for patrons.
• Patrons know where to find credible, unbiased financial information online.

Outcomes categorized
• Immediate (short term): likely to be changes in attitudes, skills, and knowledge; occur during the program cycle.
• Intermediate (medium term): likely to be changes in behavior or decision making; can occur a few months into the program cycle and a few months after program completion.
• Permanent (long term): likely to be changes in life status or condition; occur sometime after the program cycle.

Get ready
Smart investing program elements:
• Classes/programs/training/exhibits
• Staff training
• Partnerships
• Collection development and positioning in physical/virtual library
• Web presence
• Marketing/outreach

Building outcome statements
• Focus on the audience
• Identify the anticipated change
• Keep it simple
• Check that they are SMART
Examples:
• Staff learn about key financial and investment resources
• Staff provide financial literacy training to patrons
• Patrons know where to find credible, unbiased financial information online

S – M – A – R – T?
• Specific
• Measurable
• Achievable
• Relevant
• Time-specific

Example…
Five members of the reference team will increase their knowledge of the Morningstar database by 20% after completing three training workshops offered in the spring.

How will you know?
Five members of the reference team will increase their knowledge of the Morningstar database by 20% after completing three training workshops offered in the spring.
• In order to demonstrate change, you have to establish a starting point.
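The baseline requirement above can be made concrete with a little arithmetic. The following is a minimal sketch, not from the presentation, of how the "20% knowledge gain" outcome might be scored from pre- and post-training quizzes; all names and scores are invented for illustration.

```python
# Hypothetical sketch: checking the "20% knowledge increase" outcome against a
# baseline. Each staff member takes the same short quiz before and after the
# training workshops; names and scores below are invented.

pre_scores = {"staff_a": 50, "staff_b": 60, "staff_c": 55, "staff_d": 40, "staff_e": 65}
post_scores = {"staff_a": 68, "staff_b": 70, "staff_c": 72, "staff_d": 55, "staff_e": 80}

met_target = 0
for person, pre in pre_scores.items():
    post = post_scores[person]
    pct_change = (post - pre) / pre * 100  # change relative to the baseline score
    if pct_change >= 20:
        met_target += 1
    print(f"{person}: {pre} -> {post} ({pct_change:+.1f}%)")

print(f"{met_target} of {len(pre_scores)} staff met the 20% knowledge-gain target")
```

Without the pre-training scores, the post-training numbers alone could not demonstrate any change at all, which is the point of the slide.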

Learn from the best!
[Chart: Most Effective Evaluation Techniques]

Challenges to Evaluating Effectiveness

“Other”?
• Older adults had difficulty with lots of paper. They seemed to complete front pages only or skipped many questions.
• Getting those pesky longer-term outcomes that you know are out there.
• Once the activity is complete, so is their obligation to participate. You can tell a program is effective while it's happening. Are people showing up to each meeting? Are they engaged, asking questions? That's how we know.
• Even though we prepare participants to let them know we will follow up after a program to capture outcomes, we don't get good responses on follow-up surveys or phone calls. Bringing teens together at the end of a series of programs, in an informal focus group, helped to provide us with some really good information.

Bringing it home…
• Involve staff, early & often.
• Embrace the project.
• Involve the community, early & often.
• Make friends with local media.
• Ensure privacy – continually.

In their own words…
Key components of effective financial literacy instruction & programming:
• Knowledge
• Objectivity
• Privacy
• Empathy

Hands-on
During the break:
• Write a goal for your target audience.
• Identify outcomes, indicators, and potential data sources for at least one outcome.

Workshop schedule
Part 2 – What data should I collect? When?

Choose the outcomes you want to measure
Smart investing outcomes at the library level:
• Users demonstrate increased skills and/or knowledge
• Users take action with new skills (e.g., start investing, reduce debt)
• Program partners report a positive experience working with the library
• Users participate in programs as a result of PR/marketing activities

How do you know?
Outcome: Participants know how to use financial databases (knowledge/skill). Indicator: Data-usage statistics show a 20% increase in four months. Source/Method: Establish baseline data use; measure use over the four months following training.
Outcome: Participants establish regular savings activities (behavior). Indicator: Increased number of participants with savings accounts. Source/Method: Pre/post-survey responses.

Staff example
GOAL: Increase staff competency in providing financial/investor education information.
• Outcome: Reference staff are comfortable providing financial/investor information.
• Outcome: Reference staff know sources of accurate and unbiased information on investing.

Identify indicators for your outcomes
Indicators are measurable conditions or behaviors that show an outcome was achieved:
• What you hoped or intended to see or know
• Observable evidence of accomplishment, changes, gains
For each outcome, generate a list of possible indicators, then narrow it to at most three that best show the outcome was achieved.

Specify indicators for your outcomes
• Be SMART with indicators.
• Use the formula: number and/or percent of a specific target population who report, demonstrate, or exhibit an attitude, skill, knowledge, behavior, status, or life condition in a specified quantity in a specified timeframe and/or circumstance.
Examples:
• Outcome: Reference staff will know sources of accurate and unbiased information on investing. Indicator: 25 (50%) of staff will be able to name 3 online resources that provide financial/investor education information after attending a training.
• Outcome: Users take action to better their personal finances. Indicator: The number and percent of users who report they made one or more lifestyle changes from a list of 10 key personal finance factors in the last six months.
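As a small illustration of the "number and/or percent" formula, the sketch below tallies a hypothetical indicator like the second example above; the survey responses are invented, and the threshold ("one or more changes") is the one stated in the indicator.

```python
# Hypothetical sketch of the indicator formula: number and percent of a target
# population meeting a threshold in a timeframe. Each value below is the count
# of lifestyle changes (out of 10 key personal finance factors) one user
# reports making in the last six months; the data is invented.

changes_reported = [0, 2, 1, 0, 3, 5, 0, 1, 2, 4]

population = len(changes_reported)
met = sum(1 for c in changes_reported if c >= 1)  # "one or more" changes
percent = met / population * 100

print(f"{met} of {population} users ({percent:.0f}%) made one or more lifestyle changes")
```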

Prepare to collect data on your indicators
Data sources: tools, documents, and locations for information showing what happened to the target audience.
Data source options:
• Feedback forms/short surveys
• Point-of-use inquiry by staff
• Focus groups
• Interviews
• Skills assessments
• Observation
• Instructor assessments
• Library use statistics

Prepare to collect data on your indicators
• Pre-survey and post-survey: You can’t measure success without a baseline. What is the “current state of affairs” — what do people know, perceive, and do before the program, and how does the program move the audience forward? Include all stakeholders.
• Retrospective surveying: a post-observation interview to clarify learning that occurred during project activities.

Prepare to collect data on your indicators
Other considerations:
• When will you collect data?
• How often will you collect data?
• Will you include all participants or a sample?
• Who will collect the data?
• Who will record/compile the data?
• How will confidentiality be protected?
• How will participants be informed about the data collection process?

Test your measurement system
Pilot or “beta test” your surveys or questionnaires:
• Clarity of questions
• Ease of use
• Are you measuring what you intended to measure?
• Are you asking the most appropriate questions?
REVISE the instrument as needed!

Analyze and report findings
• Review feedback from participants
• Collect and input data at regular intervals
• Get familiar with the data
• Look for and note oddities in reporting
• Peruse the data and identify patterns
• Substantiate patterns — do data sources corroborate each other?
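For the "input data at regular intervals" and "note oddities" steps, a short script can do the first pass before any interpretation. The sketch below is a hypothetical example with invented monthly database-session counts; the flagging rules are illustrative, not part of the presentation.

```python
# Hypothetical sketch: review interval data, flag oddities, and surface
# possible patterns worth corroborating against other sources. The monthly
# session counts are invented.

from statistics import mean

monthly_sessions = {"Oct": 120, "Nov": 135, "Dec": 0, "Jan": 160, "Feb": 175}

baseline = mean(v for v in monthly_sessions.values() if v > 0)  # ignore suspect zeros
for month, count in monthly_sessions.items():
    if count == 0:
        note = "  <- oddity: zero records; check data entry"
    elif count > baseline * 1.1:
        note = "  <- above baseline; possible pattern worth corroborating"
    else:
        note = ""
    print(f"{month}: {count}{note}")
```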

Analyze and report findings
• Organize data logically (tables/charts)
• Analyze and interpret data to develop the narrative for the final report
• Document findings
• Maintain files or a database of outcomes and activities
• Determine which outcomes you want to continue monitoring

Use your findings
• Tell your story!
• Marketing
• Accountability and long-term assessment
• Improved services and/or programs
• Resource (re)allocation
• Include data and anecdotes

Additional resources
• IMLS
• Shaping Outcomes
• United Way

Questions?
Kit Keller