
1 Knowing What Audiences Learn: Outcomes and Program Planning
Institute of Museum and Library Services, Washington, DC, www.imls.gov
Association of Children’s Museums, 2003

2 Overview
We will:
- Distinguish Outcome-Based Planning and Evaluation (“OBE”) from other kinds of evaluation
- Talk about choosing outcomes
- Talk about the basic elements of a Logic Model (a project or program plan)
- Talk about measuring outcomes
- Review and summarize

3 What are Outcomes?
Outcomes are achievements or changes in:
- Skill: painting, basketball
- Knowledge: zoology, state capitals
- Behavior: visits museums, reads daily
- Attitude: “I like science,” “I love animals”

4 What are Outcomes?
Outcomes can also be achievements or changes in:
- Status: in school, citizen
- Life condition: overweight, healthy

5 Ella goes to the Zoo’s summer program and...
- (Immediate) Takes more interest in animals
- (Medium term) Gets better biology grades
- (Long term) Becomes a veterinary technician

6 Outcomes: Where do they come from?
- Social services and the United Way
- OMB: the Government Performance and Results Act (GPRA, 1993) and the Program Assessment Rating Tool (PART)
- Trends in funding
- The need to focus on audience
- The need to communicate museum value
- IMLS

7 Forms of Evaluation: Formative Evaluation
You want to know which brochure works best to bring school tours to the zoo. You develop two or three prototypes, test them on a sample of your target audience, and decide which worked best.

8 Forms of Evaluation: Process Evaluation
You want to see how efficiently your conservation education program is run. You count the number of participants in program components and examine how the components are delivered, how long they take, and what they cost.

9 Forms of Evaluation: Summative Evaluation
You want to know if your exhibit program made visitors aware of how they can protect their environment. You run the program, then interview visitors to learn how many might use “green” products more in the future.

10 Forms of Evaluation: Impact Evaluation (aggregated outcomes create impact)
You want to know if your programs helped increase recycling in your area. You find out the current rate of recycling, run the program, then see if the recycling rate has increased.

11 Forms of Evaluation: Outcome-Based Program Evaluation
You want to know if your conservation education program changes participants’ behaviors. You identify audience needs, plan services to provide participant-oriented outcomes, and assess program results on a regular basis.

12 What can OBE achieve?
- Increase program effectiveness
- Provide a logical, focused framework to guide program design
- Inform decision-making
- Document successes
- Communicate program value

13 What are its limitations?
- OBE is a management tool
- OBE may suggest cause and effect; it doesn’t try to prove it
- OBE settles for contribution, not attribution
- OBE uses some of the same methods as research, but...

14 What are its limitations?
OBE is not the same as research. It:
- Doesn’t try to compare your program with another, similar program
- Doesn’t try to compare your methods with different methods that create a similar result
- Accepts “good enough” data

15 How to develop an outcome-based program: Example
What need do you see?
- Many kids don’t understand basic science principles; many girls are intimidated by or uninterested in science
- Head Start is an opportunity for early science experience, but many teachers have no science education and don’t know how to make it fun
- Parents in at-risk families often have little science knowledge, limited child-development skills, and limited reading skills

16 How to develop an outcome-based program: Example
Who has the need (target audience)?
- Kids need science experience
- Head Start teachers could provide that experience if they had the knowledge and skills
- Families might encourage science learning if we made it easy and fun
Who could you work with most easily and effectively? Head Start teachers.

17 How to develop an outcome-based program: Example
What could your museum do?
- Provide science education resources for Head Start teachers: learning kits and training to use them
- Provide take-home science kits
- Partner with public libraries to provide age- and education-appropriate books on science subjects
The result: MESS (Marvelous Explorations through Science and Stories!)

18 Planning for Outcomes: The Logic Model
The Logic Model:
- Articulates the process and outcomes of your program
- Clarifies each element of your program
- Identifies indicators of change to be measured
- Identifies targets for program impact

19 What goes into outcomes planning
Mission + Influencers → Program Purpose → Inputs, Activities, Services, Outputs → Outcomes
The Logic Model then defines how the outcomes will be measured:
- Indicators: observable and measurable behaviors or conditions
- Data Sources: sources of information about the conditions being measured
- Applied to (Who): the population to be measured
- Data Interval (When): when data is collected
- Targets (Goals): the amount of impact desired
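Because the rest of this deck fills in one logic-model row element by element, a small sketch may help make the structure concrete. The Python below is an editor’s illustration, not part of the original deck; the field values are drawn from the MESS example on the following slides.

    # Editor's sketch (not from the deck): one logic-model row as plain data.
    # Field values come from the MESS example used later in this presentation.
    logic_model_row = {
        "program_purpose": "Increase teachers' ability to make science learning fun",
        "audience": "Head Start teachers in Alachua Co.",
        "outcome": "Teachers will be more confident helping students learn about science",
        "indicator": "# and % of teachers reporting at least 'some' confidence (5-point scale)",
        "data_source": "Teacher surveys",
        "applied_to": "All participating teachers (N = 445)",
        "data_interval": "End of program year 1",
        "target": "50%",
    }

    # Completeness check: every element of the plan should be filled in
    # before the program starts.
    missing = [name for name, value in logic_model_row.items() if not value]
    print("Missing plan elements:", missing or "none")

Keeping the plan as structured data makes this kind of completeness check trivial, which is one practical payoff of writing the logic model down element by element.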

20 Influencers
Influencers include staff, participants, agencies, funding sources, competition, community groups, and professional associations. For each influencer, ask what they want to know and how they will use the information:
- Project Partners want to know: Is responsibility equal? Which services produce outcomes? They will use the information to change process, add partners, or change responsibilities.
- Funders want to know: Who does the program serve? Is the program effective? They will use the information to fund the program, increase funding, or promote replication.
- Your Organization wants to know: Is the program meeting target audience needs? It will use the information to improve the program, end the program, or start another program.

21 Logic Model: Mission
- How do the organization’s mission and the program connect?
- Is the link between mission and program purpose reasonable?
- What action words connect the mission to the program?

22 Logic Model: Program Purpose
We do what, for whom, for what outcome or benefit?
Example: The Science Museum, Public Libraries, and School District will partner to provide science kits, training, reference materials, and take-home kits for Head Start teachers to increase teachers’ ability to make science learning fun, increase kids’ science interest and knowledge, and engage parents in science play.

23 Logic Model: Activities/Services
- Program activities make it possible to deliver services to program participants (most “activities” are managerial or administrative)
- Program services engage participants and produce outcomes
- Program services and activities are driven by audience characteristics

24 A Museum/Library/School Collaboration
Activities:
- Create kits
- Design training
- Workshop logistics
Services:
- Teacher training
- MESS kits and books
- Take-home kits

25 Target Audience
- Head Start teachers in Alachua Co.
- 3- to 5-year-olds from at-risk families in Alachua Co.
- Families in Alachua Co.
Knowing if you are reaching the intended audience is critical; you must decide:
- What information is critical to know?
- How will I get the information?
- What are the confidentiality issues?

26 Logic Model: Outcomes
Outcomes:
- State how you expect people to benefit from your program
- State the intended results of your services
- Describe changes in skills, attitudes, behaviors, and knowledge

27 Sample Outcomes
- Outcome 1: Teachers will be more confident helping students learn about science
- Outcome 2: Teachers will include science experiences in their classrooms

28 Logic Model: Indicators
Indicators:
- Are measurable conditions or behaviors that can show an outcome was achieved
- Say what you hope to see or know
- Are observable evidence of accomplishments, changes, or gains
- “Stand for” the outcome

29 Indicators
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The number and percentage of teachers who say they have at least “some” confidence on a 5-point scale (none, a little, some, a lot, complete confidence)
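As a worked illustration of turning such an indicator into numbers, the Python sketch below (an editor’s addition, not from the slides; the response data is invented) tallies survey answers on the 5-point scale and computes the number and percentage of teachers at or above “some.”

    # Editor's sketch with invented responses: tally the confidence indicator.
    # The scale is ordered, so "at least 'some'" means index >= SCALE.index("some").
    SCALE = ["none", "a little", "some", "a lot", "complete confidence"]

    responses = ["some", "a lot", "none", "some", "a little", "complete confidence"]

    threshold = SCALE.index("some")
    hits = [r for r in responses if SCALE.index(r) >= threshold]

    count = len(hits)
    percent = 100.0 * count / len(responses)
    print(f"{count} of {len(responses)} teachers ({percent:.0f}%) report at least 'some' confidence")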

30 Logic Model: Data Sources
Data sources are tools and locations for information that will show what happened:
- Pre/post test scores
- Program records
- Assessment reports
- Records from other organizations
- Observations
- Other forms of information

31 Data Sources
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The number and percentage of teachers who say they have at least “some” confidence on a 5-point scale
Data Source: Teacher surveys

32 Logic Model: Applied to (Who)
- Decide if you will measure all participants, completers of the program, or another subgroup
- Special characteristics of the target audience can further clarify the group to be measured

33 Applied to
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The number and percentage of teachers who say they have at least “some” confidence on a 5-point scale
Data Source: Teacher surveys
Applied to: All teachers (N = 445)

34 Logic Model: Data Intervals (When)
- Outcome information can be collected at specific intervals (end of program, every 6 months)
- Data can be collected at the end of an activity or phase and as follow-up (after 3 workshops, after 2 years)
- Data is usually collected at program start and end for comparison when increases in skill, behavior, or knowledge are expected
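For the start-and-end comparison just described, here is a minimal sketch (an editor’s illustration; the teacher IDs and scores are invented) that pairs each teacher’s pre- and post-program confidence rating and summarizes the change.

    # Editor's sketch with invented scores: pre/post comparison on a 1-5
    # confidence rating, keyed by an anonymous teacher ID.
    pre  = {"t01": 2, "t02": 3, "t03": 1, "t04": 4}
    post = {"t01": 4, "t02": 3, "t03": 3, "t04": 5}

    improved = [t for t in pre if post[t] > pre[t]]
    avg_change = sum(post[t] - pre[t] for t in pre) / len(pre)

    print(f"{len(improved)} of {len(pre)} teachers increased in confidence")
    print(f"Average change on the 5-point scale: {avg_change:+.2f}")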

35 Data Interval
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The number and percentage of teachers who say they have at least “some” confidence on a 5-point scale
Data Source: Teacher surveys
Applied to: All participating teachers (N = 445)
Data Interval: At end of program year 1

36 Logic Model: Targets (Goals)
- Targets (goals) are chosen expectations for the outcomes a program hopes to achieve, usually a percentage and/or a number
- Influencer expectations affect targets, which can also be based on a program’s past performance

37 Target
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The number and percentage of teachers who say they have at least “some” confidence on a 5-point scale
Data Source: Teacher surveys
Applied to: All participating teachers (N = 445)
Data Interval: At end of program year 1
Target: 50%
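To close the loop from measurement back to the plan, this tiny sketch (an editor’s illustration; the count of 260 is invented, while N = 445 and the 50% target come from the slide above) checks an achieved result against the target.

    # Editor's sketch: compare a measured result against the planned target.
    n_measured = 445          # all participating teachers (from the slide)
    n_at_least_some = 260     # hypothetical count meeting the indicator
    target_pct = 50.0         # target from the logic model

    achieved_pct = 100.0 * n_at_least_some / n_measured
    status = "met" if achieved_pct >= target_pct else "not met"
    print(f"Achieved {achieved_pct:.1f}% vs. target {target_pct:.0f}%: target {status}")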

38 What should reports say?
- We wanted to do what?
- We did what?
- So what?
Above all:
- Reports should meet influencer needs for information and program results
- Reports should guide program staff to improve outcomes for program participants

39 What will it cost?
Assume 7-10% of program costs for program evaluation (non-research).
What will you get?
- Low cost: know numbers, audience characteristics, and customer satisfaction
- Low to moderate cost: know changes in audience skills, knowledge, behaviors, and attitudes

40 What will you get? (continued)
- Moderate to high cost: comparison groups can show attribution of short-term changes to the program
- High cost: long-term follow-up can attribute long-term changes in the audience to program services (research)
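As quick arithmetic for the 7-10% budgeting rule of thumb on the previous slide (the program cost here is a made-up figure for illustration):

    # Editor's sketch: the 7-10% evaluation budget rule of thumb.
    # The program cost is an invented figure, not from the deck.
    program_cost = 200_000  # dollars
    low, high = 0.07 * program_cost, 0.10 * program_cost
    print(f"Plan roughly ${low:,.0f} to ${high:,.0f} for (non-research) evaluation")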

41 For more information
Karen Motylewski
Institute of Museum and Library Services
1100 Pennsylvania Avenue, NW, Washington, DC 20506
202-606-5551
http://www.imls.gov
kmotylewski@imls.gov

