




1 Outcome Evaluation Basics: An Introduction. Prepared for OLA Super Conference 2015 by Cindy Poon, Manager of Public Services, and Cindy Kimber, Coordinator of Branch Services, Ajax Public Library.

2 Agenda
- Definition of outcome evaluation
- Outcomes vs. outputs
- Difference between assessment and evaluation
- Benefits of outcome evaluation
- Examples of APL's outcome evaluation
- Lessons we've learned

3 Definition of Outcome Evaluation
Outcomes: benefits or changes influenced by a program's outputs. United Way of America defines outcomes as the "benefits or changes for individuals or populations during or after participating in program activities. They are influenced by a program's outputs. Outcomes may relate to behaviour, skills, knowledge, attitudes, values, conditions, or other attributes. They are what participants know, think, or can do; or how they behave; or what their condition is, that is different following the program."
Evaluation: determining whether a program has achieved the desired result. Was it successful? What impact did it have?

4 Outcomes
- Define the expected results (outcomes) in advance; include outcomes when developing a program
- Look for measurable and predictable change (or improvement) in skills, knowledge, attitudes, behaviours, status, or life condition
- Outcome: effectiveness of results = impact
- Findings may be positive, negative, or unintended consequences

5 Outputs
"The direct products of program activities and usually are measured in terms of the volume of work accomplished... have little inherent value... important because they are intended to lead to a desired benefit for participants." (United Way of America)
Outputs = numbers:
- Books circulated
- Programs presented
- Reference questions answered
- Participants attending the storytime programs
- Flyers distributed

6 Evaluation, Not Assessment
Assessment: a judgment or decision about a learning outcome. "An act of judging or deciding the amount, value, quality or importance of something" (Cambridge Dictionary online).
Evaluation: a broader concept, covering program outcomes as well as program inputs (resources and activities).

7 Benefits of Outcome Evaluation
"From the user in the life of the library to the library in the life of the user" (article cited by Rhea Joyce Rubin of the California State Library).
Why do we do evaluation?
- Decision making: effectiveness (length of a session, format, date/time, etc.)
- Endorsement: use the data (impact) in funding proposals
- Telling a story to stakeholders
- Advocacy: showing that a library program makes a significant difference; enhancing the library's public image
- Sharing the impact

8 Develop an Outcome Evaluation Plan
- Choose or create a program
- Identify goals with predictable outcomes; limit the objectives; prepare a clear statement
- Create outcomes you hope to achieve that will have an impact on participants
- Set indicators
- Design questionnaires
- Measure the inputs, activities, outputs, and outcomes: Was the program a success? What impact did it have?
- Test the idea with an "if-then" statement

9 Selected Program: Story Stretchers
Talking, singing, reading, writing, playing

10 Goals and Objectives
Logic model flow: inputs, activities, outputs, outcomes.
We use staff data and surveys to gather information.

11 Inputs: Resources Required for Success
- Human resources: who is running the program?
- Fiscal resources: do we have sufficient funding?
- Facilities and equipment: program room, projector, room set-up
- Knowledge base for the program: training, knowledge, skills
- Involvement of collaborators: volunteers, community partners

12 Activities: Actions to Ensure Success
- Planning: a clear understanding of goals, sufficient planning time
- Promotion: marketing plan, communication strategy
- Spin-off activities: promoting other programs

13 Outputs: Statistics
Stats gathering:
- Number of participants
- Circulation of display material
- Customer satisfaction: a rating of 4 or more out of 5 is our benchmark for success
- Compare results to benchmarks
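The benchmark on this slide (an average satisfaction rating of 4 or more out of 5) is the kind of output measure that is easy to automate. A minimal sketch of the check, assuming ratings are collected as a simple list of 1-to-5 scores; the function name and sample data are illustrative, not from the presentation:

```python
# Hypothetical sketch: testing the slide's benchmark (average customer
# satisfaction of 4 or more out of 5) against collected survey ratings.

def meets_benchmark(ratings, threshold=4.0):
    """Return True if the mean of the 1-5 ratings meets the benchmark."""
    if not ratings:
        return False  # no responses collected, so the benchmark cannot be met
    return sum(ratings) / len(ratings) >= threshold

session_ratings = [5, 4, 4, 3, 5, 4]  # example ratings from one storytime session
print(meets_benchmark(session_ratings))  # average is about 4.17, prints True
```

The same comparison works for any numeric output (circulation counts against a target, attendance against capacity) by swapping in a different threshold.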

14 Outcomes: Changes in Participants or Behaviours Over the Length of the Program
- Changes in participants: increased attention span, increased participation, knowledge of rhymes
- Changes in library use: visiting more often, checking out more books, selecting different types of materials
- Changes in parent/child interaction: asking the child to predict what comes next, defining new words, relating the story to the child's real-life experiences

15 Questionnaires
Dear Parents: The library is conducting an evaluation of our Story Stretchers program. We would appreciate your completing the following questionnaire. We will have a second questionnaire at the end of the session.
1. Have you attended the Ajax Library's Story Stretchers storytime program in the past?  [ ] yes  [ ] no
2. What are your expectations when coming to the Story Stretchers program? (tick all that apply)
   - Develop my child's love of reading
   - Enjoy quality time with my child
   - Help my child develop literacy skills
   - Help my child get ready for school
   - Provide an opportunity for socialization for my child
   - Meet other parents and develop friendships
   - Other: _______________
3. Please rate our Story Stretchers storytime program based on your first impressions, where 1 is "needs improvement" and 5 is "excellent."
   - Storytime room (clean, suitable, welcoming): 1 2 3 4 5
   - Storytime leader (trained, enthusiastic, welcoming): 1 2 3 4 5
   - Books/materials displayed (inviting, age appropriate): 1 2 3 4 5
Comments: _______________
Family email (if you wish to receive email notifications from the library): _______________

16 Questionnaires: Challenges
- Surveyed parents may not be the same at the beginning and end of the session (drop-in program)
- Drop-off in parents completing the survey, from 24 to 14
- Staff survey asked for too much detail: staff did not track the number of books on display or taken out
- Staff retirement: information was lost, and new staff were not in a position to comment on changes in children
- Multicultural participants: the survey was written and in English only

17 Data Collection
1. Has your child's love of reading increased?  [ ] yes  [ ] no  [ ] stayed the same
2. Please rate our storytime, where 1 is "needs improvement" and 5 is "excellent."
   - Stories (variety, age appropriate): 1 2 3 4 5
Questions 1 and 2 are easy to collate and translate into a report: "92% of participants reported..."
3. What do you like best about storytime?
Open-ended answers are harder to collate but can provide vital information (e.g., "stories").
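Collating the closed-ended questions into report-ready percentages, as the slide describes, can be sketched in a few lines. This is a hypothetical illustration: the answer wording and the helper name are assumptions, not part of the library's actual workflow.

```python
from collections import Counter

# Hypothetical sketch: collating closed-ended survey answers into the
# percentages quoted in a report ("92% of participants reported...").

def collate_percentages(answers):
    """Map each distinct answer to its rounded percentage of all responses."""
    counts = Counter(answers)
    total = len(answers)
    return {answer: round(100 * n / total) for answer, n in counts.items()}

# Example: 14 parents (the end-of-session count from slide 16) answering
# "Has your child's love of reading increased?"
responses = ["yes"] * 11 + ["stayed the same"] * 2 + ["no"] * 1
print(collate_percentages(responses))
# prints {'yes': 79, 'stayed the same': 14, 'no': 7}
```

Open-ended answers (question 3) resist this kind of tallying, which is why the slide treats them separately.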

18 What We Learned
- Validated that the program was working well
- 70% of parents indicated they wanted to prepare their children for Kindergarten, which led to the development of the Ready, Set, Kindergarten program
- Learned to focus our future outcome evaluations: clearly define the information we are looking for, narrow down the data collected, and think in terms of outcomes
- We need to share our knowledge

19 Planning to Achieve Outcomes: TD Summer Reading Club
- Set goals and objectives more thoughtfully; think about desired outcomes
- We want reading skills to be maintained or improved over the summer, so minutes read at reading level is a better measure than number of books read below reading level

20 Outcomes, Outcomes Everywhere: Battle of the Books
Comments from students, provided by a Whitby TL:
- "I felt included."
- "I made new friends."
- "I wasn't the only freak who loves to read."
- "It allowed me to move on from my old school."
These were unexpected outcomes.

21 And the Gold
"I wasn't a reader before Battle of the Books."
- Share comments with the funder
- Use outcomes when speaking to community members about the program
- Strengthen the relationship with our partner

22 Additional Resources
- California Library Association, Outcomes Based Summer Reading: www.cla-net.org/?81
- United Way of America: www.national.unitedway.org/out
- John Carver, "The Monitoring Process in Carver Policy Governance": http://www.curling.ca/w2/files/2014/05/The-Monitoring-Process-in-Carver-Policy-Governance.pdf

23 Summary
- Choose a program to evaluate
- Determine the outcomes
- Use the logic model: inputs, activities, outputs, outcomes
- Analyze the data
- Communicate the results with everyone: stakeholders, funders, etc.
- Improve, change, expand, or scrap the program

24 Lessons We've Learned
Common reasons an evaluation fails:
- The program has no clear beginning and end
- The program was not the right one to evaluate and did not produce a great impact
- No clear goal was defined
- The data is not measurable
- There is no clear statement

25 Words of Caution
John Carver: "a crude measure of the right thing beats an elegant measure of the wrong thing."
- There may be a lack of experience in identifying and measuring outcomes
- Staff time is a real cost of analyzing the data
- Watch for a lack of clear goals and objectives
- Test with a small program first
- Be honest with ourselves, no matter what the outcomes are; accept the data with open arms and make changes according to the results

26 Questions / Comments
Thank you!
Cindy Poon: cindy.poon@ajaxlibrary.ca
Cindy Kimber: cindy.kimber@ajaxlibrary.ca
