How’s it Working? Evaluating Your Program
MAAPS Conference, 7 May 2010
Debra Smith & Judah Leblang
Program Evaluation & Research Group, School of Education, Lesley University

PERG
- Founded 1976
- Over 600 program evaluation and research studies in various educational settings
- Also offers professional development and consultation

Session participants will:
- Be introduced to the basics of program evaluation through an example
- Define a question or questions about their own program
- Identify methods for collecting data that would help to answer their question(s)
- Discuss next steps

What is program evaluation? A type of applied research focused on systematically collecting and analyzing data to help answer questions about a program, or some aspect of a program, in order to make decisions about it.

Purposes
- Accountability
- Program development
- Generating knowledge

Formative vs. Summative
Formative evaluation offers feedback along the way to improve a program. Summative evaluation “sums up” the results of a program at the end of a period of development or implementation.

Audiences
- Funders
- Program leaders
- Program participants
- Organizational partners
- Others

Evaluation process (a cycle centered on the program):
1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

An example: Evolutions
An after-school program begun in 2005, connected with the Peabody Museum of Natural History at Yale University; it initially involved approximately 40 low-SES/minority students.

Evolutions program goals
To provide opportunities for students to:
- Prepare for post-secondary (college) education
- Learn about scientific and other careers
- Expand their knowledge of and interest in science (science literacy)
- Develop transferable skills for the future
- Learn about the Peabody Museum and museum careers

Logic models
Map a coherent chain of connections between goals, resources, activities, and what you expect (short term), want (over an intermediate period), and hope (in the long term) to happen. They also reflect your assumptions and theory of action or change.

Logic Model Key Concepts
- Resources or inputs: staff, funds, materials, space, etc.
- Activities: what we plan to do and who we will do it for
- Outputs: the results of our program (direct outputs)
- Short-term outcomes: outcomes (changes) at completion of the project year or soon after
- Long-term outcomes: outcomes (changes) several years beyond completion of the project

An EVO example
- Resources or inputs: a full-time project director; funds from the Peabody Museum and other funders; classroom space, etc.
- Activities: in-depth exploration of science topics; tours of Peabody collections; visits to Yale scientists' labs
- Outputs: students will meet at least 6 scientists; students will visit no fewer than 2 natural history museums
- Short-term outcomes: students will learn skills associated with producing a museum exhibition and understand key science themes
- Long-term outcomes: students will understand different types of careers within disciplines, understand the college application process, and be inspired to pursue a career in the sciences
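
Where a program tracks its logic model electronically, it can help to store the model as structured data. Below is a minimal, hypothetical Python sketch of the EVO logic model above; the variable name and abbreviated entries are illustrative, not part of any evaluation tool.

```python
# A hypothetical sketch: the EVO logic model captured as a plain dictionary.
# Category names follow the table above; entries are abbreviated examples.
evo_logic_model = {
    "inputs": ["full-time project director",
               "funds (Peabody Museum and other funders)",
               "classroom space"],
    "activities": ["in-depth exploration of science topics",
                   "tours of Peabody collections",
                   "visits to Yale scientists' labs"],
    "outputs": ["students meet at least 6 scientists",
                "students visit at least 2 natural history museums"],
    "short_term_outcomes": ["skills for producing a museum exhibition",
                            "understanding of key science themes"],
    "long_term_outcomes": ["understanding of careers within disciplines",
                           "understanding of the college application process",
                           "inspiration to pursue a science career"],
}

# Print the model one category at a time, in the order a reader scans the table.
for category in ("inputs", "activities", "outputs",
                 "short_term_outcomes", "long_term_outcomes"):
    print(category.replace("_", " ").title())
    for item in evo_logic_model[category]:
        print(f"  - {item}")
```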

Logic models may look different. Another format includes: Goal, Rationale, Assumptions, Resources, Activities, Outputs, Short-term outcomes, Mid-term outcomes, Long-term outcomes.

Develop a logic model for your own program/project

Evaluation process (a cycle centered on the program):
1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

Questions: Think Goldilocks
- Specific, but not too detailed
- Important, but not too broad in scope

Key Questions: Part One
- How does EVO prepare students for college or high school?
- How are EVO students involved in developing an exhibit at the museum?
- Do students develop increased “science literacy,” as defined by EVO staff?

Key Questions: Part Two
- How (if at all) do students express more confidence about and interest in doing science?
- Are students more aware of careers in science?
- How (if at all) do students demonstrate increased knowledge of the college application process, and develop criteria for choosing a college that meets their needs?

What questions do you want to answer about your program?

Evaluation process (a cycle centered on the program):
1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

Data collection methods
- Observation
- Interviews/focus groups
- Surveys
- Document/artifact review

PERG Evaluation Matrix: Evolutions
Data collection activities: observation of EVO students; student focus groups; an interview with the project director; review of project documents and artifacts; a pre/post survey.
Evaluation questions, each matched to the data collection activities that address it:
- Student preparation for college/academic planning (all 5 activities)
- Student involvement in the museum exhibit (3 activities)
- Students' development of science literacy (4 activities)
- Student learning (all 5 activities)
- Students' interest in science/the environment (all 5 activities)
- Students' confidence in doing science (3 activities)
- Students' interest in and knowledge of science careers (4 activities)
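
As a small illustration of what the pre/post survey column could yield, the sketch below compares mean ratings on one hypothetical survey item before and after the program; the item wording and numbers are invented.

```python
# Hypothetical 1-5 ratings for "I am confident doing science," pre and post.
pre_ratings = [2, 3, 2, 4, 3, 2, 3]
post_ratings = [3, 4, 3, 4, 4, 3, 4]

mean_pre = sum(pre_ratings) / len(pre_ratings)
mean_post = sum(post_ratings) / len(post_ratings)

# A positive change suggests growth on this item; it is not proof of impact.
print(f"Mean pre: {mean_pre:.2f}  Mean post: {mean_post:.2f}  "
      f"Change: {mean_post - mean_pre:+.2f}")
```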

Technical considerations: Validity
Will the data answer the questions? Are we asking the right questions?

Triangulation
Is there adequate triangulation (use of multiple methods and/or data sources) to ensure validity?
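
One way to make this check concrete is to encode the evaluation matrix and flag any question that rests on fewer than two data sources. The question-to-method mapping below is hypothetical, since the matrix above records only how many methods address each question, not which ones.

```python
# Hypothetical mapping of evaluation questions to data collection methods.
matrix = {
    "college/academic planning": {"observation", "focus groups",
                                  "director interview", "document review",
                                  "pre/post survey"},
    "involvement in museum exhibit": {"observation", "focus groups",
                                      "document review"},
    "confidence in doing science": {"observation", "focus groups",
                                    "pre/post survey"},
}

# Flag questions with weak triangulation (fewer than two sources).
for question, methods in matrix.items():
    if len(methods) < 2:
        print(f"Weak triangulation: '{question}' relies on a single source.")
    else:
        print(f"OK: '{question}' draws on {len(methods)} sources.")
```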

Drafting your own matrix: What data will help you answer your questions?

Evaluation process (a cycle centered on the program):
1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

Collecting data
- Make sure your plan is doable given the time and resources available.
- Design instruments to focus your data collection, ensure consistency, and avoid bias.
- Be organized: take notes and develop a system for tracking and filing your data.

Collecting data
- Communicate clearly about what you are doing, why, and how the findings will be shared and used.
- Be mindful of human subjects protections. Does your organization have an institutional review board (IRB)?

The First Year: Site Visit
On-site data collection:
- Focus groups with students
- Interviews with the director and project staff
- Observation of the end-of-year event
- Parent interviews

Evaluation process (a cycle centered on the program):
1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

Analyzing data
- What stands out?
- What are the patterns?
- What are the similarities? What are the differences?
- Is more information needed?
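
For qualitative data, a first pass at "what stands out" is often just a tally of coded themes. The sketch below is hypothetical; the theme codes and excerpts are invented for illustration.

```python
from collections import Counter

# Hypothetical theme codes assigned to interview excerpts in a first coding pass.
coded_excerpts = ["confidence", "career awareness", "confidence",
                  "college knowledge", "confidence", "career awareness"]

# Count each theme to see which patterns stand out most.
for theme, count in Counter(coded_excerpts).most_common():
    print(f"{theme}: {count}")
```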

Reliability
Are the patterns in the data, or judgments about the data, consistent?
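
When two people code the same data, consistency can be checked numerically. The sketch below computes percent agreement and Cohen's kappa (a standard chance-corrected agreement measure) for two hypothetical coders; the codes are invented for illustration.

```python
from collections import Counter

# Hypothetical codes assigned to the same ten excerpts by two coders.
coder_a = ["confidence", "careers", "confidence", "college", "careers",
           "confidence", "college", "careers", "confidence", "college"]
coder_b = ["confidence", "careers", "careers", "college", "careers",
           "confidence", "college", "confidence", "confidence", "college"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Expected agreement by chance, from each coder's marginal code frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")
```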

Validity, again
Is the data helping you answer the questions? Is the data credible?

Evaluation process (a cycle centered on the program):
1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

Reporting
- Consider purpose and audience(s)
- Report relevant findings, questions, and recommendations
- Engage stakeholders in discussion
- Use findings to inform next steps

Results of the first-year evaluation
The impact of the evaluation on EVO: a more focused program, clearer objectives, and suggestions for sustainability.
Evidence of program success: retention, student engagement, and positive changes in students’ views of doing science and of scientists.

The Ongoing Evaluation: Shaping the Program
Implementation of evaluator suggestions, for example: informational interviewing, developing a smaller exhibit, and refining requirements for students.

EVO: 2006 to Today
Continued development and expansion of EVO:
- Growth of the program from approximately 40 to more than 80 students, and the introduction of internships and Sci Corps
- Different areas of science focus (environmental awareness, geoscience), depending on funding sources

Evaluation resources
- W.K. Kellogg Foundation Evaluation Handbook
- Kellogg Logic Model Development Guide
- Basic Guide to Program Evaluation

Evaluation resources
Program Evaluation & Research Group
Lesley University
29 Everett St.
Cambridge, MA