The Development and Validation of the Evaluation Involvement Scale for Use in Multi-site Evaluations
Stacie A. Toal, University of Minnesota

Why Validate a Scale for Involvement?
1) Improve research on participatory evaluation by making it easier for researchers to replicate studies in various settings.
2) Identify factors critical to involvement so that evaluators can incorporate activities into multi-site evaluations that help participants feel involved.
3) Justify the investment of time and resources in participatory approaches by building the body of research demonstrating a positive relationship between involvement and evaluation use.

Theoretical Framework: Messick's Unitary Concept of Validity
Validity is "an integrated evaluative judgment of the degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores or other modes of assessment" (Messick, 1989, 1995a, 1995b).

The Beyond Evaluation Use Research Project
Funded by the National Science Foundation, this three-year research project is studying the use and influence of evaluations of four NSF-funded programs by examining the relationship between the extent of involvement of evaluation stakeholders and the long-term impact of the evaluations on project staff, on the science, technology, engineering, and mathematics (STEM) community, and on the evaluation community.

The Beyond Evaluation Use Research Team
B. Volkov, S. Toal, F. Lawrenz, L. Greenseid, K. Johnson, & J. King

Primary Data Sources
PI & Evaluator Online Survey (Aug 2006 – Jan 2007). Topic: involvement and use in four NSF multi-site evaluations. Sample size: 372 of 810 (46% response rate). A non-respondent study indicated no significant differences in levels of involvement between respondents and non-respondents.
Interviews (Mar 2007 – Apr 2007). Topic: follow-up to the survey with two questions related to involvement. Sample size: 12 respondents with various levels of involvement and use.
Literature Review (Fall 2005 – June 2007). Topic: participatory evaluation research studies. Scope: 27 empirical and theoretical articles directly related to participatory evaluation.

The Evaluation Involvement Scale
Question stem: To what extent were you involved in…
Response options: 1 = No; 2 = Yes, a little; 3 = Yes, some; 4 = Yes, extensively; N/A = This activity did not take place.
1. Discussions that focused the evaluation
2. Identifying evaluation planning team members
3. Developing the evaluation plan
4. Developing data collection instruments
5. Developing data collection processes
6. Collecting data
7. Reviewing collected data for accuracy and/or completeness
8. Analyzing data
9. Interpreting collected data
10. Writing evaluation reports
11. Presenting evaluation findings (e.g., to staff, to stakeholders, to an external audience)
The scale has two factors: Factor 1, Involvement in Planning, and Factor 2, Involvement in Implementation.

Validity Evidence, by Messick's Aspect of Validity
(Strength-of-evidence ratings, from strong to weak, accompanied each aspect on the original poster.)

Theoretical Soundness
1. Amount of published research
2. Data from think-alouds
3. Interview data

Internal Consistency
1. Item variance
2. Scale reliability (alpha = .94; see the sketch following this list)

Inclusive of Relevant Activities & Processes
1. Expert opinion
2. Scale items mentioned in interviews
3. Types of involvement (mentioned in literature or interviews) covered by the scale

Measures Actual Involvement
1. Survey response distributions
2. Project interviews

Statistical Factors Match Rational Structures
1. Exploratory factor analysis
2. Confirmatory factor analysis

Differentiates between Groups that Are Rationally or Theoretically Different
1. Significant difference between evaluators and non-evaluators in implementation, but not in planning.

Correlates as Expected with Other Variables
1. Significant, positive correlation with evaluation use.

Measures Involvement in Other Settings
1. Equally strong reliabilities for each program, but a low level of involvement in one program.

Consequential Validity
1. Discussion of possible biases related to multicultural validity and evaluation use.
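Several of these checks lend themselves to a short computational illustration. The following is a minimal sketch, not the study's code: it assumes responses coded 1-4 with N/A stored as missing, and the DataFrame, the assignment of items 6-11 to the implementation factor, and the use score are all illustrative placeholders.

```python
# Minimal sketch of three checks from the evidence list above: scale
# reliability, the evaluator/non-evaluator comparison, and the correlation
# with evaluation use. Data below are random placeholders, not study data.
import numpy as np
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = items.dropna()                       # listwise deletion of N/A
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # summed per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 5, size=(372, 11)),   # 372 respondents
                  columns=[f"item{i}" for i in range(1, 12)])

alpha = cronbach_alpha(df)                       # poster reports alpha = .94

# Known-groups evidence: evaluators vs. non-evaluators on implementation
# items (the item6-item11 grouping is an assumption, not from the poster).
is_evaluator = rng.integers(0, 2, size=len(df)).astype(bool)
implementation = df.loc[:, "item6":"item11"].mean(axis=1)
t, p = stats.ttest_ind(implementation[is_evaluator],
                       implementation[~is_evaluator])

# Convergent evidence: correlation of overall involvement with a use score
# (here a fabricated stand-in for the survey's evaluation-use measure).
use = df.mean(axis=1) + rng.normal(0, 0.5, size=len(df))
r, p_r = stats.pearsonr(df.mean(axis=1), use)

print(f"alpha={alpha:.2f}, t={t:.2f} (p={p:.3f}), r={r:.2f} (p={p_r:.3f})")
```

Listwise deletion of N/A responses is one defensible choice here; pairwise deletion or imputation would change the reliability estimate.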
Key Validity References
Brualdi, A. (1999). Traditional and modern concepts of validity (Report No. ED435714). Washington, D.C.: ERIC Clearinghouse on Assessment and Evaluation.
Kane, M. T. (1992). Quantitative methods in psychology: An argument-based approach to validity. Psychological Bulletin, 112(3).
Kane, M. T. (2001). Current concerns in validity theory. Journal of Educational Measurement, 38(4).
Messick, S. (1989). Meaning and values in test validation: The science and ethics of assessment. Educational Researcher, 18(2).
Messick, S. (1995a). Standards of validity and the validity of standards in performance assessment. Educational Measurement: Issues and Practice, 14(4), 5-8.
Messick, S. (1995b). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9).

Implications of this Research
Validated evaluation research tool
- 11-item survey on evaluation involvement in multi-site evaluations
- 2 factors: planning and implementation
Improved understanding of what involvement looks like in multi-site evaluations
- Data collection
- Attendance at meetings
Facilitates future research on stakeholder involvement
- Positively correlated with evaluation use
- Needs to be validated in different evaluation contexts (see the sketch below)
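Since the last implication calls for validation in other contexts, a natural first step is to re-check the two-factor structure on new data. The sketch below is a hypothetical starting point using scikit-learn's FactorAnalysis; the DataFrame and its random contents are stand-ins, not study data, and the original analyses were not necessarily run this way.

```python
# Minimal sketch: re-extract two rotated factors from new responses and
# inspect the loadings. Random placeholder data will not reproduce the
# planning/implementation structure that real responses should show.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
df = pd.DataFrame(rng.integers(1, 5, size=(300, 11)),
                  columns=[f"item{i}" for i in range(1, 12)])

# Two varimax-rotated factors; with real data, planning items should load
# mainly on one factor and implementation items on the other.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(df.dropna())
loadings = pd.DataFrame(fa.components_.T, index=df.columns,
                        columns=["factor1", "factor2"])
print(loadings.round(2))
```

A confirmatory factor analysis of the same two-factor model, as reported in the validity evidence above, would typically follow in a structural equation modeling package.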