E101 Section 8 October 31, 2012 Policy and Program Evaluation.


Agenda
 Situating Ourselves in the Course (1 min)
 Logical Framework Problem Tree Analysis (10 min)
 Discussion of Weiss (40 min)
 Design an Evaluation (30 min)
 Housekeeping & Questions (5 min)

Course Overview
1. Introduction to Comparative and International Education
2. The Process of Policy Analysis
3. Education Policy Options

Course Overview
1. Introduction to Comparative and International Education
2. The Process of Policy Analysis
3. Education Policy Options
– Week 7: Assessing the Performance of an Education System
– Week 8: Policy and Program Evaluation
– Week 9: Curriculum, Standards, and Assessment

Defining the Solution into the “Problem”
“Your problem definition should not include an implicit solution introduced by semantic carelessness. Projected solutions must be evaluated empirically and not legitimated merely by definition. Therefore, keep the problem definition stripped down to a mere description, and leave open where you will look for solutions.” (Bardach, 2009, p. 7)

Example
“New schools are being built too slowly.”
vs.
“There are too many schoolchildren relative to the currently available classroom space.”

LogFrame Problem Analysis Tree
1. List all the problems that come to mind. Problems need to be carefully identified: they should be existing problems, not possible, imagined, or future ones. The problem is an existing negative situation; it is not the absence of a solution.
2. Identify a core problem (this may involve considerable trial and error before settling on one).
3. Determine which problems are “Causes” and which are “Effects.”
4. Arrange both Causes and Effects in a hierarchy, i.e., work out how the causes relate to each other: which leads to which?
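The cause hierarchy in step 4 can be sketched as a simple tree: the core problem is the root, direct causes are its children, and sub-causes branch further down (effects can be modeled the same way above the root). A minimal illustrative sketch; the problem statements below are hypothetical examples in the spirit of the classroom-space slide, not from the course materials:

```python
# Minimal sketch of a LogFrame problem tree. The core problem is the root
# node; each cause is a child, and causes can have deeper sub-causes.

class ProblemNode:
    def __init__(self, statement):
        self.statement = statement  # an existing negative situation, per step 1
        self.causes = []            # child nodes: problems that lead to this one

    def add_cause(self, statement):
        node = ProblemNode(statement)
        self.causes.append(node)
        return node

    def show(self, depth=0):
        """Return the tree as indented lines, core problem first."""
        lines = ["  " * depth + self.statement]
        for cause in self.causes:
            lines.extend(cause.show(depth + 1))
        return lines

# Hypothetical example following the slide's classroom-space framing:
core = ProblemNode("There are too many schoolchildren relative to available classroom space")
construction = core.add_cause("School construction has not kept pace with enrollment growth")
construction.add_cause("Construction budgets rely on outdated enrollment projections")
core.add_cause("Existing classrooms are used for only part of the day")

print("\n".join(core.show()))
```

Walking the tree top-down then reads as the cause hierarchy: each indented line is a cause of the line above it.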

Where we are
 Part II: The Process of Policy Analysis
– Last week we explored assessing the performance of an education system
– This week we take a look at Policy and Program Evaluation
 We read Evaluation (1998) by Carol Weiss (Ch. 1–5)

Goals  Define and discuss key terms  Review main points from the reading  Compare frameworks from Weiss and Reimers & McGinn

Key Terms
 Evaluation
 What is it?
 Why is it important in international education policy research and policy making?
 Summative Evaluation, Formative Evaluation, Process-Outcome
 Identify the focus and purpose of each type of evaluation
 Provide an example

Formative-Summative, Process-Outcome
Evaluation: “The systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of a policy or program.” (Weiss, p. 4)
 Formative Evaluation: study results feed back into program development; often used during the beginning stages of a program.
 Summative Evaluation: provides information about a program’s effectiveness; conducted after a program is completed; helps decision makers decide whether to continue or stop a program.
 Process Evaluation: examines what happens during the program, from the perspective of the participant.
 Outcome Evaluation: examines what happens to the participants after the program is finished.

Main Points from Weiss
 We will break into three groups for this portion of the discussion.
 As a group…
 Choose a spokesperson
 Identify three to five main points from the chapters to which your group is assigned (10 minutes)
 As a large group, discuss conclusions from the small-group discussions

Main Points from Weiss - Groups  Group 1: Chapters 1&2  Group 2: Chapters 2&3  Group 3: Chapters 4&5

Role of the Evaluator
Weiss:
 Empowerment Evaluation: community-conducted evaluation research
 Collaborative Evaluation: researcher as co-investigator; no wall between practitioner and evaluator
 Stakeholder Evaluation: engages different stakeholders; learns their concerns, assumptions, questions, data needs, and intentions/reasons for conducting the evaluation
Reimers & McGinn: Policy Dialogue as Organizational Learning
 The researcher involves the client and relevant stakeholders in the community, from planning the study and developing the instrument to analyzing the data and presenting the results of the study.
 Dialogue among stakeholder groups and researcher(s)
 Policy research as a wave
How are the frameworks similar or different? Which approach would you use, and why?

Design a Program Evaluation
 Imagine that NFTE wants to conduct an alumni study. Their goal with this study is to re-connect with their alumni in order to tell the story of the deep, long-term impact of their youth entrepreneurship programs.
 They hope that this survey will help inform the field on the impact of youth entrepreneurship programs.

NFTE Facts
 NFTE has approximately 200,000 alumni in the U.S. They instituted formal systems for gathering alumni information only four years ago (2008).
 Prior to that, they kept track of many of their program graduates on an ad hoc basis, largely via personal communications with founder Steve Mariotti and other NFTE staff members.

Alumni Database
 NFTE has basic data (name, address, and, in some cases, NFTE program information) for only about 10,000 alumni, plus piecemeal data on another 3,000 or so.
 They also have a set of alumni connected to NFTE through Facebook and LinkedIn, some of whom are likely duplicates of the 13,000.
 Their system contains much incomplete data, many duplicates to be cleaned, and much data that is likely obsolete, in the numerous cases where alumni have not updated their information since their initial registration.
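Cleaning a database like this typically starts with deduplication: normalize a few identifying fields into a matching key, then keep the most complete record per key. A minimal sketch of that idea; the field names and sample records here are hypothetical, not NFTE's actual schema:

```python
# Hypothetical sketch of alumni-record deduplication: build a crude matching
# key from normalized name + email, then keep the most complete duplicate.

def normalize(record):
    """Lowercase and strip the identifying fields to form a matching key."""
    name = record.get("name", "").strip().lower()
    email = record.get("email", "").strip().lower()
    return (name, email)

def deduplicate(records):
    best = {}
    for rec in records:
        key = normalize(rec)
        # Among duplicates, prefer the record with the most non-empty fields.
        filled = sum(1 for value in rec.values() if value)
        if key not in best or filled > best[key][0]:
            best[key] = (filled, rec)
    return [rec for _, rec in best.values()]

# Hypothetical sample rows: the two "Ada Lopez" entries differ only in
# casing/whitespace and should collapse into one record.
alumni = [
    {"name": "Ada Lopez", "email": "ada@example.com", "city": ""},
    {"name": "ada lopez ", "email": "ADA@example.com", "city": "Chicago"},
    {"name": "Ben Kim", "email": "ben@example.com", "city": "Boston"},
]

clean = deduplicate(alumni)
print(len(clean))  # 2: the two Ada Lopez rows collapse into one
```

A real cleanup would need fuzzier matching (nicknames, moved addresses, Facebook/LinkedIn handles), but the key-then-keep-best structure is the same.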

Challenges of Tracking Alumni
 It is very difficult to find alumni who completed NFTE prior to 2008, when NFTE set up the database.
 Moreover, their alumni population is relatively young and therefore naturally transient. Alumni are unlikely to prioritize updating their contact information with NFTE each time they relocate.
 Finally, NFTE has not devoted significant staffing resources to alumni tracking.
 THINK OF WAYS IN WHICH NFTE COULD POTENTIALLY USE SOCIAL MEDIA TO RECONNECT WITH ALUMNI.

Design an Evaluation
In teams, design an evaluation that will answer the following questions:
 Where are NFTE graduates now?
 What have they accomplished?
 How does their business ownership, employment, and education compare to that of their broad peer group?
Keep in mind that NFTE wants not only to answer these questions but also to inform the field’s general understanding of the impact of entrepreneurship education programs that target youths.

Design an Evaluation
 Familiarize yourself with NFTE (vision, mission statement, where they operate).
 Design a survey tool that will help NFTE measure impact.
 Please create a shared Google Doc for this exercise.
 Invite
 Think of existing datasets to which NFTE could compare the results of their study. They will use these comparisons to build a picture of their impact, into which they would integrate compelling individual stories.
 Develop a timeline for NFTE. The data must be collected and analyzed before April 2013.
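The comparison step often reduces to comparing a rate among survey respondents (e.g., business ownership) to the same rate in a reference dataset, for which a two-proportion z-test is one standard tool. A minimal sketch; every number below is a made-up placeholder, not NFTE or census data:

```python
# Hypothetical sketch: compare a business-ownership rate among surveyed
# alumni to a peer-group benchmark with a two-proportion z-test.
# All counts below are made-up placeholders.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: the two underlying proportions are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Placeholder figures: 120 of 800 surveyed alumni own a business,
# vs. 90 of 1,000 people in a comparable peer dataset.
z = two_proportion_z(120, 800, 90, 1000)
print(round(z, 2))
```

A |z| above roughly 1.96 would be significant at the 5% level, though with a self-selected alumni sample the bigger threat is response bias, not sampling error, so any such comparison should be hedged accordingly in the write-up.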