Measuring Evaluation Use and Influence Among Project Directors of State Gaining Early Awareness and Readiness for Undergraduate Programs Grants
Erin M. Burr, Ph.D., Oak Ridge Institute for Science and Education
Jennifer Ann Morrow, Ph.D., and Gary Skolits, Ed.D., The University of Tennessee, Knoxville

Overview
Purpose: develop an instrument to measure evaluation use, evaluation influence, and impacting factors.
Participants: current state project directors of Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) grants.
GEAR UP: a nationwide Department of Education college access grant program developed to prepare low-income students to meet the requirements for college enrollment and to succeed at the postsecondary level. Grants run for 6 years and serve schools in multiple counties.

Evaluation Use
Instrumental Use: "results are used in making decisions about program structure and function" (Clavijo, Fleming, Hoermann, Toal, & Johnson, 2005)
Conceptual Use: "something that is newly learned about a program, its participants, its operations, or outcomes through an evaluation" (Henry & Mark, 2003)
Symbolic Use: "involves drawing on evaluation evidence in attempts to convince others to support a political position, or to defend such a position from attack" (Leviton & Hughes, 1981); "use of evaluation findings to retrospectively support a decision made prior to the evaluation finding" (Henry & Rog, 1998)
Process Use: "…refers to and is indicated by individual changes in thinking and behavior, and program or organizational changes in procedures and culture, which occur among those involved in evaluation as a result of the learning that occurs during the evaluation process" (Patton, 1997)

Kirkhart's Integrated Theory of Influence
"The term influence (the capacity or power of persons or things to produce effects on others by intangible or indirect means) is broader than use, creating a framework with which to examine effects that are multidirectional, incremental, unintentional, and noninstrumental." (p. 7)
Kirkhart, K. E. (2000). Reconceptualizing evaluation use: An integrated theory of influence. In V. Caracelli & H. Preskill (Eds.), The expanding scope of evaluation use (New Directions for Evaluation, No. 88). San Francisco: Jossey-Bass.

Henry and Mark's Three-Level Model of Evaluation Influence
Levels of influence and their change mechanisms:
Individual: Attitude Change, Salience, Elaboration, Priming, Skill Acquisition, Behavioral
Interpersonal: Justification, Persuasion, Change Agent, Social Norms, Minority Opinion Influence
Collective: Agenda Setting, Policy-Oriented Learning, Policy Change, Diffusion
Henry, G. T., & Mark, M. M. (2003). Beyond use: Understanding evaluation's influence on attitudes and actions. American Journal of Evaluation, 24(3), 293–314.

Factors That Impact Evaluation Use
Implementation: Quality, Credibility, Relevance, Communication, Findings, Timeliness
Decision & policy setting: Information needs, Decision characteristics, Politics, Funding, Competing information, Personal characteristics, Commitment to evaluation
Cousins, J. B., & Leithwood, K. A. (1986). Current empirical research in evaluation utilization. Review of Educational Research, 56(3), 331–364.

Participants, Method, and Results
Participants: 17 current state GEAR UP project directors (44% response rate).
Survey completion: 10 online, 4 paper-and-pencil.
Three PDs used their program's evaluations in all four types of use, reported influence at all three levels, and identified multiple factors that impacted their use of the evaluations.

Levels of Influence & Change Mechanisms by Type of Evaluation Use
Each parenthetical lists the nonzero counts of PDs whose reported use reflected that change mechanism, across the four use-type columns (Instrumental, Conceptual, Symbolic, Process):
Individual: Attitude Change (5, 1), Salience (1, 1), Elaboration (1, 1), Priming (1, 1), Skill Acquisition (1), Behavioral (2, 1)
Interpersonal: Justification (1), Persuasion (1), Change Agent (1, 1, 1), Social Norms (1), Minority Opinion Influence (1)
Collective: Agenda Setting (1), Policy-Oriented Learning (2), Policy Change (1), Diffusion (none)
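A matrix like this one can be tallied directly from coded survey responses. Below is a minimal sketch using pandas, assuming responses have already been coded into long-format (level, mechanism, use type) endorsements; the records and column names are hypothetical illustrations, not the study's data.

```python
import pandas as pd

# Hypothetical coded endorsements: one row per project director per
# change mechanism they reported under a given type of evaluation use.
records = [
    ("Individual", "Attitude Change", "Instrumental"),
    ("Individual", "Attitude Change", "Conceptual"),
    ("Interpersonal", "Justification", "Symbolic"),
    ("Individual", "Skill Acquisition", "Process"),
]
df = pd.DataFrame(records, columns=["level", "mechanism", "use_type"])

# Cross-tabulate mechanisms (rows, grouped by level) against types of
# use (columns); each cell counts endorsements, zero where none occurred.
table = pd.crosstab([df["level"], df["mechanism"]], df["use_type"])
print(table)
```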

Survey Instrument
Evaluation Use (total = 31 items*)
- Instrumental: 7 items, α = .84
- Conceptual: 8 items, α = .86
- Symbolic: 8 items, α = .88
- Process: 8 items, α = .92
Evaluation Influence (total = 27 items)
- Individual: 16 items, α = .93
- Interpersonal: 7 items, α = .82
- Collective: 4 items, α = .78
Impacting Factors (total = 13 items)
- Implementation: 6 items, α = .84
- Decision and policy setting: 7 items, α = .89
Note. N = 17. *Four items (one in each category) asked about "other" types of use, so they could not be labeled in terms of influence. Scale: 0 = No extent, 1 = Some extent, 2 = A moderate extent, 3 = A great extent, 4 = A very great extent.
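For readers who want to produce reliability estimates like those above for their own administrations of the instrument, here is a minimal sketch of Cronbach's alpha, assuming item-level responses on the 0–4 scale described in the note; the function name and the response matrix are hypothetical, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of responses for one subscale."""
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from 5 participants to a 7-item subscale,
# on the 0 ("No extent") to 4 ("A very great extent") scale.
instrumental_items = np.array([
    [3, 4, 3, 2, 4, 3, 3],
    [1, 0, 1, 2, 1, 1, 0],
    [4, 4, 3, 4, 4, 3, 4],
    [2, 2, 1, 2, 3, 2, 2],
    [0, 1, 0, 1, 0, 1, 1],
])
print(round(cronbach_alpha(instrumental_items), 2))
```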

Limitations and Implications for Use
Limitations:
- Participation: the low number of respondents limited statistical analysis, and no PDs participated in focus groups, which limited interpretation of the findings
- Self-report instrument: responses are subject to social desirability
- Generalizability: limited to project directors of federal grants
Findings from this study can be used in GEAR UP to promote communication between grantees and evaluators about evaluation use and influence, raise awareness about the consequences of evaluations, and guide the design of grantee capacity-building workshops and training sessions on evaluation use.

Implications for Other Evaluators
Use the measure with project directors (PDs) of other programs to:
- Raise awareness about the types of use among PDs
- Track use and influence over time (e.g., across multiple-year grants)
- Support reflective practice: compare survey results to the intended uses specified by the client (PD) at the beginning of an evaluation
- Assess the impact of their work (if responses are validated) and identify areas for future improvement
- Start conversations with program directors and staff about why they did or did not choose to use their evaluation results

Directions for Future Research
Additional research is needed to validate this instrument, with:
- More participants
- Different federal grant programs
- The addition of focus groups, interviews, or observations
New items could be added to assess diffusion, the collective change mechanism the current items do not capture, or a separate measure could be developed to assess it.
Incorporate Kirkhart's Integrated Theory of Influence into measurement: items could be reworded or expanded by addressing the dimensions of time, source, and intention as they apply to each type of use.