Making the Most of Multisite Evaluations ADD PLACE ADD DATE.

Presentation transcript:

Making the Most of Multisite Evaluations ADD PLACE ADD DATE

Note This material is based upon work supported by the National Science Foundation under Grant No. REC Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Today's Agenda
- Overview and introductions
- What? Our research grounding
- So what? Implications for practice
- Now what? Application discussion

Session Goals
- Review the basics of UFE and PE
- Distinguish between participation and involvement in multisite settings
- Discuss how to increase the impact of multisite evaluations
- Apply these ideas to evaluation examples
- Brainstorm solutions to multisite evaluation involvement and use challenges

THREE-STEP INTERVIEW Think about your own evaluation experiences...

Question
Think of a time when people truly used an evaluation that you were part of.
- Describe that evaluation.
- What distinguished it from other evaluations in which you have participated?

“BEYOND EVALUATION USE” Our NSF-funded research study

What This Research Was NOT…
Our study did not focus on the traditional notion of utilization-focused evaluation: "intended use by intended users."

What Our Research Studied
- What happens to project staff who take part in a large-scale, multisite program evaluation
- Secondary potential users at multiple sites who participate throughout the evaluation process
  - How their involvement potentially leads to use
  - "[Un]intended use by [un]intended users"

Definitions
- Program: a major national funding initiative
- Project: one of many smaller efforts funded under a single program
- Multisite: multiple program sites that participate in the conduct of cross-site evaluation activity
(Straw & Herrell, 2002)

"Beyond Evaluation Use" NSF Programs
Name of Program (Years of Evaluations)
- Local Systemic Change through Teacher Enhancement (LSC): 1995 – present
- Advanced Technological Education (ATE)
- Collaboratives for Excellence in Teacher Preparation (CETP)
- Building Evaluation Capacity of STEM Projects: Math Science Partnership Research Evaluation and Technical Assistance Project (MSP-RETA): 2002 – present

Methods
- Archival Review
- Project Leader and Evaluator Survey
- Interviews
- NSF PI Survey
- Journal Editor Inquiry
- Citation Analysis
National Science Foundation Grant #

Research Limitations
- Difficult to control for the variety of definitions in the field
- Memory issues for participants
- Lack of distinction between program and project in survey responses
- Sampling challenges and program variation

Research Strengths
- Unusual to receive funding for evaluation research
- Real-world program examples
- Different from the traditional utilization-focused evaluation focus
- Studied influence on the field and on projects themselves
- Use of varied and innovative methods

CONCEPTUAL GROUNDING What are the ideas this research studied? (What?)

Overarching Concepts
- Evaluation use/influence
- Involvement
  - Utilization-focused evaluation (UFE)
  - Participatory evaluation (PE)

Traditional Types of Evaluation Use (Type / Use for / Definition: the use of knowledge...)
- Instrumental (Action): ...for making decisions
- Conceptual or Enlightenment (Understanding): ...to better understand a program or policy
- Political, Persuasive, or Symbolic (Justification): ...to support a decision someone has already made or to persuade others to hold a specific opinion

Definitions in "Beyond Evaluation Use"
- Evaluation use: The purposeful application of evaluation processes, findings, or knowledge to produce an effect
- Influence ON evaluation: The capacity of an individual to produce effects on an evaluation by direct or indirect means
- Influence OF evaluation (from Kirkhart, 2000): The capacity or power of evaluation to produce effects on others by intangible or indirect means

What Is Involvement?
- Not "participation"
- Not "engagement"
- Instead, think about how UFE and PE overlap

Overlap between UFE and PE
Where UFE and PE overlap: key people take part throughout the evaluation process.

Utilization-focused Evaluation (UFE)
Evaluation done for and with specific, intended primary users for specific, intended uses
- Patton (2008), Utilization-Focused Evaluation, 4th Edition

The PERSONAL FACTOR in Evaluation "The presence of an identifiable individual or group of people who personally care about the evaluation and the findings it generates"

Key Collaboration Points in UFE
- Issues to examine (information primary intended users want/need)
- Methods to use (credibility in context)
- Analysis and interpretation of data
- Recommendations that will be useful

Overlap between UFE and PE
- UFE: Primary intended users are involved in all key evaluation decisions
- Overlap: Key people take part throughout the evaluation process

Participatory Evaluation (PE)
Range of definitions:
- Active participation throughout all phases of the evaluation process by those with a stake in the program (King, 1998)
- Broadening decision-making and problem-solving through systematic inquiry; reallocating power in the production of knowledge and promoting social change (Cousins & Whitmore, 1998)

Principles of PE
- Participants OWN the evaluation
- The evaluator facilitates; participants plan and conduct the study
- People learn evaluation logic and skills as part of the process
- ALL aspects of the evaluation are understandable and meaningful
- Internal self-accountability is valued
(Adapted from Patton, 1997)

Characteristics of PE
1. Control of the evaluation process ranges from evaluator to practitioners
2. Stakeholder selection for participation ranges from primary users to "all legitimate groups"
3. Depth of participation ranges from consultation to deep participation
(From Cousins & Whitmore, 1998)

Cousins & Whitmore Framework

Interactive Evaluation Quotient
A continuum of involvement in decision making and implementation, ranging from LOW (evaluator-directed, led by the evaluator) to HIGH (participant-directed, led by program staff, clients, and community), with collaborative evaluation in between; participatory evaluation spans this continuum.

Overlap between UFE and PE
- UFE: Primary intended users are involved in all key evaluation decisions
- PE: Participants help to plan and implement the evaluation
- Overlap: Key people take part throughout the evaluation process

MULTI-SITE EVALUATIONS What happens when there are many sites involved in one study?

Challenges of UFE/PE in Multisite Settings
- Projects vary: activities, goals, budgets, stakeholders
- Projects may be geographically diverse: distance, cost
- Programs each have multiple stakeholders, so the "project" becomes a key stakeholder
(Lawrenz & Huffman, 2003)

Prediction
How might UFE and PE play out in multisite evaluations (MSEs)?

The Focus of Our Research
- UFE: Primary intended users (PIUs) are involved in all key evaluation decisions
- PE: Participants help to plan and implement the evaluation design
- Overlap: Secondary potential users at multiple sites are involved throughout the evaluation process

WHAT DID WE FIND OUT? After five years... so what?

What Our Research Found
- Secondary potential users did sometimes feel involved in the program evaluation and did sometimes use results
- What fostered feelings of involvement:
  - Meetings of all types; face-to-face best
  - Planning for use
  - The mere act of providing or collecting data

What Fostered Use
- Perception of a high-quality evaluation
- Convenience, practicality, and alignment of evaluation materials (e.g., instruments)
- A feeling of membership in a community

Remember the three-step interview results?

Implications for Practice
1. Set reasonable expectations for project staff
   - Consider different levels of involvement (depth OR breadth, not necessarily both)
   - Have projects serve as advisors or consultants
   - Have detail work completed by others/outsiders
2. Address evaluation data concerns
   - Verify understanding of data definitions
   - Check accuracy (Does it make sense?)
   - Consider multiple analyses and interpretations

Implications for Practice (cont.)
3. Communicate, communicate, communicate
   - Personal contact matters
4. Interface regularly with the funder
   - Understand the various contexts
   - Garner support for the program evaluation
   - Obtain help to promote involvement and use
   - Represent the projects back to the funder

Implications for Practice (cont.)
5. Recognize life cycles of people, projects, and the program
   - Involve more than one person per project
   - Understand the politics of projects
6. Expect tensions and conflict
   - Between project and program evaluation
   - Among projects (competition)
   - About how best to use resources

Implications for Practice (cont.)
7. Work to build community among projects and between projects and the funder
   - Face-to-face interactions
   - Continuous communication
   - Asynchronous electronic communication
   - Be credible to project staff
     - Recognized expertise
     - "Guide on the side," not "sage on the stage"

APPLICATION PRACTICE Now what?

Application Activity Work in teams to discuss the assigned vignette. [Try the checklist.]

Vignette #1 Summary
Health Technician Training Program (HTTP)
- Training to increase the number of healthcare technicians
- Issue: Program-level evaluation not relevant to project-level evaluation

Vignette #2 Summary
Medical Communication Collaboration (MCC)
- Development of communications curricula for medical professional students
- Issue: Projects do not use program-created evaluation tools and analysis

Vignette #3 Summary
Professional Development for Districts (PDD)
- Funding for professional development projects in primary education
- Issue: Local evaluators were asked to provide program evaluation data one year after beginning the project-level evaluation, which took time away from the local evaluation

Vignette #4 Summary
Foundation for Fostering Urban Renewal (FFUR)
- Evaluation technical assistance and consultative services program launched by the grantor to provide direct technical assistance to any of its grantees
- Issue: Few grantees take advantage of the assistance and consultation

As you think about these ideas... Questions?

Summary
- Involvement in MSEs is different from participation in single-site evaluations
- Involvement does promote use
- There are several ways to foster participants' feelings of involvement
- Communication with participants and funders is critical

For Further Information
Online -
PowerPoint developers:
- Dr. Jean A. King
- Dr. Frances Lawrenz
- Dr. Stacie Toal
- Kelli Johnson
- Denise Roseland
- Gina Johnson