Outcome Evaluation Basics: An Introduction. Prepared for OLA Super Conference 2015 by Cindy Poon, Manager of Public Services, and Cindy Kimber, Coordinator of Branch Services, Ajax Public Library.

Agenda: Definition of Outcome Evaluation; Outcomes vs. Outputs; Difference between Assessment and Evaluation; Benefits of Outcome Evaluation; Examples of APL's Outcome Evaluation; Lessons we've learned.

Definition of Outcome Evaluation. Outcomes are benefits or changes influenced by a program's outputs. United Way of America defines outcomes as the “benefits or changes for individuals or populations during or after participating in program activities. They are influenced by a program's outputs. Outcomes may relate to behaviour, skills, knowledge, attitudes, values, conditions, or other attributes. They are what participants know, think, or can do; or how they behave; or what their condition is, that is different following the program.” Evaluation determines whether a program has achieved the desired result: Was it successful? What impact did it have?

Outcomes. Define the expected results or outcomes in advance, and include outcomes when developing a program. Outcomes are measurable and predictable changes (or improvements) in skills, knowledge, attitudes, behaviours, status, or life condition. The effectiveness of results is the impact, which includes positive or negative findings and unintended consequences.

Outputs. “The direct products of program activities and usually are measured in terms of the volume of work accomplished… have little inherent value… important because they are intended to lead to a desired benefit for participants.” (United Way of America) Outputs = numbers: books circulated, programs presented, reference questions answered, participants attending storytime programs, flyers distributed.

Evaluation, not Assessment. Assessment is a judgment or decision about a learning outcome: “an act of judging or deciding the amount, value, quality or importance of something,” as defined by the Cambridge Dictionary online. Evaluation is a broader concept covering program outcomes as well as program inputs (resources and activities).

Benefits of Outcome Evaluation. “From the user in the life of the library to the library in the life of the user,” as cited by Rhea Joyce Rubin of the California State Library. Why do we evaluate? Decision making: effectiveness of the length of a session, format, date/time, etc. Endorsement: use the data (impact) for funding proposals. Tell a story to stakeholders. Advocacy tool: show that a library program makes a significant difference and enhance the library's public image. Share the impact.

Develop an Outcome Evaluation Plan. Choose or create a program. Identify goals with predictable outcomes; limit the objectives; prepare a clear statement. Create outcomes you hope to achieve and that have an impact on the participants. Set indicators. Design questionnaires. Measure the inputs, activities, outputs and outcomes. Was the program a success? What impact did it have? Test the idea with an "if-then" statement.
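The plan elements above (inputs, activities, outputs, outcomes) can be sketched as a simple data structure. This is a minimal illustration in Python; every entry in it is a hypothetical example for a generic storytime program, not APL's actual plan:

```python
# Minimal sketch of a Logic Model for a hypothetical storytime program.
# All entries are illustrative examples, not actual APL data.
logic_model = {
    "inputs": ["storytime leader", "program room", "picture books"],
    "activities": ["weekly storytime sessions", "marketing flyers"],
    "outputs": ["number of participants", "materials circulated"],
    "outcomes": ["increased attention span", "more frequent library visits"],
}

# The "if-then" test of the theory behind the program:
if_then = ("IF children attend weekly storytimes, "
           "THEN their early literacy skills improve.")

for element, examples in logic_model.items():
    print(f"{element}: {', '.join(examples)}")
print(if_then)
```

Writing the model down this explicitly, even informally, makes it easy to check that every planned outcome has at least one input and activity behind it.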

Selected Program: Story Stretchers (talking, singing, reading, writing, playing).

Goals and Objectives. Staff and data; inputs, activities, outputs, outcomes. We use surveys to gather information.

Inputs: resources required for success. Human resources: who is delivering the program? Fiscal resources: do we have sufficient funding? Facilities and equipment: program room, projector, room set-up. Knowledge base for the program: training, knowledge, skills. Involvement of collaborators: volunteers, community partners.

Activities: different actions to ensure success. Planning: a clear understanding of goals and sufficient planning time. Promotion: marketing plan, communication strategy. Spin-off activities: promoting other programs.

Outputs: Stats. Stats gathering: number of participants, circulation of display material. Customer satisfaction: a rating of 4 or more out of 5 is our benchmark for success. Compare results to benchmarks.
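As a sketch of how the satisfaction benchmark might be checked, here is a short Python example; the 4-out-of-5 threshold comes from the slide, but the ratings themselves are invented:

```python
# Hypothetical ratings from a customer satisfaction survey (1-5 scale).
ratings = [5, 4, 4, 3, 5, 4, 5, 4]

# Benchmark from the slide: an average rating of 4 or more out of 5
# counts as success.
average = sum(ratings) / len(ratings)
meets_benchmark = average >= 4.0
print(f"Average rating: {average:.2f}; benchmark met: {meets_benchmark}")
```

With these sample numbers the average is 4.25, so the benchmark is met.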

Outcomes: changes in participants or their behaviours over the length of the program. Changes in participants: increased attention span, increased participation, knowledge of rhymes. Changes in library use: coming more often, checking out more books, selecting different types of materials. Changes in parent/child interaction: asking the child to predict what comes next, defining new words, relating the story to the child's real-life experiences.

Questionnaires
Dear Parents: The library is conducting an evaluation of our Story Stretchers program. We would appreciate your completing the following questionnaire. We will have a second questionnaire at the end of the session.
1. Have you attended the Ajax Library's Story Stretchers storytime program in the past? yes / no
2. What are your expectations when coming to the Story Stretchers program? (tick all that apply) Develop my child's love of reading; Enjoy quality time with my child; Help my child develop literacy skills; Help my child get ready for school; Provide an opportunity for socialization for my child; Meet other parents and develop friendships; Other _______
3. Please rate our Story Stretchers storytime program based on your first impressions, where 1 is needs improvement and 5 is excellent: Storytime Room (clean, suitable, welcoming); Storytime Leader (trained, enthusiastic, welcoming); Books/Materials Displayed (inviting, age appropriate)
Comments: Family (if you wish to receive notifications from the library): __________

Questionnaires: what went wrong. Surveyed parents may not be the same at the beginning and end of the session (it is a drop-in program). There was a drop-off in parents completing the survey, from 24 to 14. The staff survey asked for too much detail: staff did not track the number of books on display or taken out. A staff retirement meant lost information, and new staff were not in a position to comment on changes in the children. The participants were multicultural, but the survey was written and in English only.

Data Collection
1. Has your child's love of reading increased? yes / no / stayed the same
2. Please rate our storytime, where 1 is needs improvement and 5 is excellent: Stories (variety, age appropriate) 1 2 3 4 5
Questions 1 and 2 are easy to collate and translate into a report: "92% of participants reported…"
3. What do you like best about storytime? Harder to collate, but can provide vital information.
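Collating question 1 into a report-ready figure can be as simple as counting responses. A minimal Python sketch, with invented responses (the "92%" figure on the slide is not reproduced here):

```python
# Hypothetical responses to question 1:
# "Has your child's love of reading increased?"
responses = ["yes", "yes", "no", "yes", "stayed the same", "yes", "yes"]

# Count the "yes" answers and turn them into a percentage for the report.
increased = responses.count("yes")
pct = round(100 * increased / len(responses))
print(f"{pct}% of parents reported an increased love of reading")
```

Open-ended answers like question 3 need manual coding into common themes before they can be tallied this way.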

What We Learned. The evaluation validated that the program was working well. 70% of parents indicated they wanted to prepare their children for Kindergarten, which led to the development of the Ready, Set, Kindergarten program. We learned to focus our future outcome evaluations: clearly define the information we are looking for, narrow down the data collected, and think in terms of outcomes. We also need to share this knowledge.

Planning to Achieve Outcomes: TD Summer Reading Club. Set goals and objectives more thoughtfully, thinking about desired outcomes. We want reading skills to be maintained or improved over the summer, so minutes read at the child's reading level is a better measure than the number of books read below reading level.

Outcomes, Outcomes Everywhere: Battle of the Books. Comments from students, provided by the Whitby TL: “I felt included.” “I made new friends.” “I wasn't the only freak who loves to read.” “It allowed me to move on from my old school.” These were unexpected outcomes.

And the Gold: “I wasn't a reader before Battle of the Books.” Share comments with the funder. Use outcomes when speaking to community members about the program. Strengthen the relationship with our partner.

Additional Resources. California Library Association: Outcomes Based Summer Reading. United Way of America. John Carver: Process-in-Carver-Policy-Governance.pdf

Summary. Choose a program to evaluate. Determine the outcomes. Use the Logic Model: inputs, activities, outputs, outcomes. Analyze the data. Communicate the results with everyone: stakeholders, funders, etc. Improve, change, expand, or scrap the program.

Lessons we've learned. The program needs a beginning and an end. The wrong choice of program to evaluate will not produce a great impact. Other pitfalls: no clear goal defined, data that are not measurable, no clear statement.

Words of caution. John Carver: “a crude measure of the right thing beats an elegant measure of the wrong thing.” There may be a lack of experience in identifying and measuring outcomes, staff costs to analyze the data, or a lack of clear goals and objectives, so test with a small program first. We need to be honest with ourselves, no matter what the outcomes are: accept the data with open arms and make changes according to the results.

Questions / Comments. Thank You! Cindy Poon: Cindy Kimber: