Evaluation 101: Everything You Need to Know to Get Started Evaluating Informal Science Education Media

Presentation transcript:

Evaluation 101: Everything You Need to Know to Get Started Evaluating Informal Science Education Media. Saul Rockman and Jennifer Borse, Rockman et al

Disclaimer: This presentation is largely based on the NSF Framework for Evaluating Impacts of Informal Science Education Projects (Friedman, 2008) and our website, Evaluationspringboard.org/science.

Two perspectives: the P.I. and the Evaluator.

Why do an evaluation? Ensure that your product/program is successful… and prove that it is successful! (Chapter 1: Intro to Evaluation)

What you used to think was a necessary evil… still is!

What is an evaluation? Formative Evaluation: ensure that your product/program is successful… and Summative Evaluation: prove that it is successful! (Chapter 1: Intro to Evaluation)

“Evaluation is not just for preparing good proposals. It is also an integral part of running good projects.” Lynn Dierking, in Framework for Evaluating Impacts of Informal Science Education Projects (2008)

Formative Evaluation (Chapter 1: Intro to Evaluation)
Focused on the development and improvement of a project:
- Are components of the project being carried out as intended? If not, what has changed and why?
- Is the project moving according to the projected timeline?
- What is working well? What are the challenges?
- Is the budget on track?
- What needs to be done to ensure progress according to plan?

Summative Evaluation (Chapter 1: Intro to Evaluation)
Measures the outcomes and impacts of a project:
- Were the project's goals met?
- What components of the project were most effective?
- What specific impacts did the project have on intended audiences (as well as secondary audiences)?

Summative Evaluation: Informal Education and Outreach Framework

Impact Category | Public Audiences | Professional Audiences
Awareness, knowledge or understanding (of) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Engagement or interest (in) | STEM concepts, processes, or careers | Advancing the informal STEM education/outreach field
Attitude (towards) | STEM-related topics or capabilities | Informal STEM education/outreach research or practice
Behavior (related to) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Skills (based on) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Other | Project specific | Project specific

Source: Friedman, A. (Ed.) (March 12, 2008). Framework for Evaluating Impacts of Informal Science Education Projects [On-line] (p. 11).

Three Big Questions (Chapter 1: Intro to Evaluation):
1. Are you doing what you said you were going to do?
2. How well is it (the project, program, or initiative) going?
3. Does what you are doing have an impact? (See NSF Framework Chapter 3 for more about "impact.")

Now you're ready to start! (Chapter 1: Intro to Evaluation)

Step 1: Prepare (Chapter 2: Doing an Evaluation)
Set the stage: gather background information.
Develop a Logic Model: what's going to happen and why?
And don't forget your stakeholders!

Logic Model for the ISE Program

INPUTS: Grant recipient; collaborators and consultants; other stakeholders; NSF.
ACTIVITIES: Activities that target public audiences (mass media, exhibits, learning technologies, youth/community programs) and activities that target professional audiences (seminars/conferences, professional development, materials/publications).
OUTPUTS: Number of attendees, members, users, viewers, visitors, and participants.
OUTCOMES: Awareness, knowledge, or understanding of STEM concepts, processes, or careers; engagement or interest in STEM concepts, processes, or careers; attitude towards STEM-related topics; new skills based on engagement; behavior resulting from engagement; new knowledge practices that advance the informal education field.
STRATEGIC IMPACTS: Intended audience and strategic impact.
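
For teams that want to keep their logic model machine-readable, it can be captured as a simple data structure so each stage stays linked to what it feeds. The sketch below is a minimal, hypothetical Python rendering of the columns above; the variable and key names are our own, not part of the NSF framework, and the entries are abbreviated.

    # A minimal sketch of the ISE logic model as a plain Python dict.
    # Keys mirror the five columns above; entries are illustrative, not exhaustive.
    logic_model = {
        "inputs": ["grant recipient", "collaborators and consultants",
                   "other stakeholders", "NSF"],
        "activities": {
            "public audiences": ["mass media", "exhibits",
                                 "learning technologies",
                                 "youth/community programs"],
            "professional audiences": ["seminars/conferences",
                                       "professional development",
                                       "materials/publications"],
        },
        "outputs": ["number of attendees", "number of viewers", "number of users"],
        "outcomes": ["awareness/knowledge of STEM concepts",
                     "engagement or interest in STEM",
                     "new skills, attitudes, and behaviors"],
        "strategic_impacts": ["intended audience and strategic impact"],
    }

    # Walking the model gives a quick completeness check before an evaluation:
    for stage, entries in logic_model.items():
        print(f"{stage}: {len(entries)} element(s)")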

Step 2: Design (Chapter 2: Doing an Evaluation)
Plan by framing questions and organizing tools:
- Identify key questions that will guide the evaluation. What do you want to know? (Remember the 3 Big Questions.)
- Consider: audience, resources, and time.
- Constructs: concepts that can be measured.
- Indicators: examples of success that can be measured.
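
One lightweight way to keep constructs and indicators paired with the question they serve is a small table in code. This is a hypothetical Python sketch; the example question, construct, and indicators are invented for illustration.

    # Hypothetical example: one evaluation question, the construct it targets,
    # and measurable indicators of success for that construct.
    evaluation_plan = [
        {
            "question": "Does the program increase interest in STEM careers?",
            "construct": "engagement/interest in STEM",
            "indicators": [
                "self-reported interest on a pre/post survey",
                "number of follow-up website visits",
            ],
        },
    ]

    for item in evaluation_plan:
        print(item["question"])
        print("  construct:", item["construct"])
        for ind in item["indicators"]:
            print("  indicator:", ind)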

Intended Impacts, Indicators and Evidence

Question/Method Matrix
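
The slide's matrix image is not reproduced in the transcript, but the idea is a grid with evaluation questions as rows and data-collection methods as columns, marking which method will help answer which question. A hypothetical sketch, with invented questions and method names:

    # Hypothetical question/method matrix: True marks a method that will be
    # used to answer a given evaluation question.
    methods = ["survey", "interview", "observation", "web analytics"]
    matrix = {
        "Are activities delivered as planned?": {"observation": True,
                                                 "interview": True},
        "Did audience interest in STEM grow?":  {"survey": True,
                                                 "web analytics": True},
    }

    print("question".ljust(40) + " ".join(m.ljust(13) for m in methods))
    for question, marks in matrix.items():
        row = question.ljust(40)
        row += " ".join(("X" if marks.get(m) else "-").ljust(13) for m in methods)
        print(row)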

Step 3: Select Methods (Chapter 2: Doing an Evaluation)
What you measure with depends on what you are measuring.

Step 3: Select Methods (Chapter 2: Doing an Evaluation)

Quantitative methods: surveys or questionnaires; objective tests of comprehension; gate counts, television ratings, website hits; time spent in exhibits; number of posts to a website or comments/questions. These let you analyze a large quantity of data, and findings are more generalizable.

Qualitative methods: in-depth interviews; focus groups; observations; analysis of authentic data and user/visitor-created products. These give a more in-depth understanding and help with interpretation of quantitative data.

Tips (Chapter 2: Doing an Evaluation): Think about what you want to know and be able to say at the end of your evaluation.

Tips (Chapter 2: Doing an Evaluation): Consider what data you may already have access to, e.g., existing data (see the sketch after this list):
- Gate counts
- Website hits and tracking data
- Television ratings
- User-created products
- Data from past research or evaluation efforts
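
Existing data often needs only light aggregation before it becomes useful. As a hypothetical example, a web server log can yield daily hit counts with a few lines of standard-library Python; the file name and log format here are assumptions, so adjust the parsing to whatever your server actually produces.

    from collections import Counter

    # Assumption: a log file where each line starts with an ISO date,
    # e.g. "2012-05-07 /exhibits/space ...". Adjust parsing to your format.
    daily_hits = Counter()
    with open("website_hits.log") as log:
        for line in log:
            date = line.split()[0]
            daily_hits[date] += 1

    for date, hits in sorted(daily_hits.items()):
        print(date, hits)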

Tips (Chapter 2: Doing an Evaluation): Don't limit yourself to one method. Consider using multiple methods: mixed-methods evaluations.

Step 4: Collect Data (Chapter 2: Doing an Evaluation)
Surveys:
- Types of questions
- Word choice
- Sampling
- Pilot!
- Anonymity
- Personal questions
- Keep it short
- Strategies for collection
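
If you sample respondents rather than surveying everyone, even a simple random draw beats an ad hoc one. A minimal sketch using Python's standard library; the visitor roster and sample size are invented for illustration.

    import random

    # Hypothetical visitor roster; in practice this would come from
    # registration or ticketing data.
    visitors = [f"visitor_{i:03d}" for i in range(1, 201)]

    random.seed(42)                         # fixed seed so the draw is reproducible
    sample = random.sample(visitors, k=30)  # simple random sample of 30

    print(len(sample), "visitors selected for the survey")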

Types of Survey Questions
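
The slide's examples are not reproduced in the transcript, but common survey question types include multiple-choice, Likert-scale, and open-ended items. A hypothetical sketch of how each type might be defined for an online form; all wording is invented.

    # Hypothetical examples of common survey question types, modeled as
    # plain dicts.
    questions = [
        {"type": "multiple-choice",
         "text": "How did you hear about the exhibit?",
         "options": ["TV", "website", "friend", "other"]},
        {"type": "likert",
         "text": "I learned something new about science today.",
         "scale": ["strongly disagree", "disagree", "neutral",
                   "agree", "strongly agree"]},
        {"type": "open-ended",
         "text": "What was the most memorable part of your visit?"},
    ]

    for q in questions:
        print(f"[{q['type']}] {q['text']}")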

Step 4: Collect Data (Chapter 2: Doing an Evaluation)
Assessments: get your #2 pencils ready!
- Objective assessment: only one right answer
- Subjective assessment: more than one correct answer
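
Objective items can be scored automatically precisely because each has exactly one right answer. A minimal scoring sketch; the answer key and responses are invented.

    # Hypothetical answer key and one respondent's answers for a five-item
    # objective (multiple-choice) assessment.
    answer_key = {"q1": "A", "q2": "C", "q3": "B", "q4": "D", "q5": "A"}
    responses  = {"q1": "A", "q2": "C", "q3": "D", "q4": "D", "q5": "B"}

    # Count the items where the response matches the key.
    score = sum(responses.get(q) == correct for q, correct in answer_key.items())
    print(f"score: {score}/{len(answer_key)}")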

Step 4: Collect Data (Chapter 2: Doing an Evaluation)
Interviews & Focus Groups:
- Structured vs. in-depth
- Interviews: one-on-one or pairs
- Focus groups: small groups (8-12); find group consensus; encourage diversity in responses

Step 4: Collect Data (Chapter 2: Doing an Evaluation)
Observations: who, what, where, when, how? Use a rubric or structured protocol to ensure consistent data collection.

Step 5: Analyze Data (Chapter 2: Doing an Evaluation)
Quantitative: prepare the data (code, enter, and check for errors), then run analyses: what differences and patterns do you see?
Qualitative: coding, starting general… then getting more specific; use instruments and goals to guide analysis.
Integrate/Synthesize: use data from different sources to get the big picture and draw conclusions.
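
As a concrete illustration of both halves of this step, the sketch below computes a simple pre/post comparison and tallies qualitative codes using only the Python standard library; all data and code labels are invented for illustration.

    from collections import Counter
    from statistics import mean

    # Quantitative: hypothetical pre/post interest ratings (1-5 scale)
    # from the same seven participants.
    pre  = [2, 3, 3, 2, 4, 3, 2]
    post = [3, 4, 4, 3, 5, 4, 3]
    print(f"mean pre: {mean(pre):.2f}, mean post: {mean(post):.2f}")

    # Qualitative: hypothetical codes applied to interview excerpts,
    # starting general ("engagement") before getting more specific.
    coded_excerpts = ["engagement", "engagement/curiosity", "barriers/time",
                      "engagement/curiosity", "barriers/cost"]
    print(Counter(code.split("/")[0] for code in coded_excerpts))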

Step 6: Take Action! (Chapter 2: Doing an Evaluation)
Report: be clear and concise; provide adequate evidence for claims and enough explanation to make sure readers understand your interpretation of the data. You don't have to report on every piece of data or every finding: know when to say when!
Make recommendations/changes: be specific, and plan for further evaluation after changes are made.

Wide-Ruled Paper

Take a look at our website: Evaluationspringboard.org/science