Leveraging Competing and Complementary Roles for Success in R&D
Mid-continent Research for Education and Learning (McREL)
Sheila A. Arens, Helen Apthorp, Zoe Barley, LeAnn M. Gamache
© 2005
Questions to Ponder…
What are the bridges evaluators must cross in the R&D world?
What are the roles at play in the R&D continuum?
How can evaluators respond to different (sometimes conflicting) expectations about the validity of evidence?
Creating Coherence with the Language of an R&D Continuum
LeAnn M. Gamache, PhD, McREL
The Players in Program Evaluation
Sponsors
Program Developers
Implementers
Evaluators
Participants and Constituents
Researchers
Others
Purposes for a Continuum
Enable common language
Help to highlight project priorities
Reveal assumptions
Focus planning discussions within context of total endeavor
Overview of an R&D Continuum
Four Phases of the R&D Endeavor:
1. Need and Approach
2. Model and Instrumentation
3. Development and Pilot-Testing
4. Broad Dissemination and Implementation
Stages within the Four Phases: Need and Approach
Stages within the Four Phases: Model and Instrumentation
Stages within the Four Phases: Development and Pilot-Testing
Stages within the Four Phases: Dissemination and Implementation
Roles and Discussions at Critical Junctures within Stages
During Development
At Implementation
At Evaluation
The Context for Evaluation in an R&D World
Zoe A. Barley, McREL
Three Phases
Development
Implementation
Production
Four Roles
Conceptualizer
Implementer/Practitioner
Funder
Evaluator
Interactions: Roles and Phases
(X marks the phase in which each role's investment is highest)

                 Development   Implementation   Production
Conceptualizer        X              O              O
Implementer           O              X              O
Funder                O              O              X
Variables of Interest
Level of Investment (LOI)
Level of Astuteness (LOA)
Evidentiary Requirements (ER)
The Development Phase
The Conceptualizer: LOI high; LOA high; ER: Is it true to theory?
The Practitioner: Is it doable in the real world?
The Funder: Is it marketable/affordable?
The Implementation Phase
The Practitioner/Implementer: LOI high; LOA high; ER: Will it work in context?
The Conceptualizer: Can it stand the adaptations?
The Funder: What is the market niche?
The Production Phase
The Funder: LOI high; LOA high; ER: Will sales support it?
The Conceptualizer: Is it still theory-based?
The Implementer: Will it make a difference?
What happens when the Conceptualizer is in charge? The Reluctant Genius
What happens when the Implementer is in charge? The Passionate Reformer
What happens when the Funder is in charge? The Bottom Liner
Validity Concerns Reconciled? Client versus Evaluator Evidentiary Expectations
Sheila A. Arens, McREL
Overview
Concerns about the quality of evidence and claims underlie all social science.
Such concerns have been heightened by the increased interest in evidence-based inquiry and evidence-based practice…
Emerging Needs, Differing Perspectives
Increased pressure on practitioners to select and engage in only those practices that are evidence-based elevates considerations of what constitutes "evidence" and "evidence-based."
Increased pressure on R&D organizations to collect evidence for their products and services to satisfy practitioner requirements.
Varied Perspectives on Validity
Perspectives vary on how external readers approach and engage with evaluation documents, and responses vary within the evaluation community on how to deal with this appropriately.
House (1985): Decisions about what data to collect are intertwined with prospectively considering the rhetorical power of the statements one wishes to issue relative to the audience…
…Regardless of the veracity of the claim(s) being made, evaluators must attend to audience: if an evaluation fails to provide the audience with an acceptable explanation, or fails to enhance understanding of some phenomenon, its findings may not be considered adequate.
Thus persuasion plays a role in evaluative claims and in the perceived validity of inferences, and the extent to which the evaluator herself is able to craft a compelling rhetorical argument is partially a product of audience. Validity is therefore not merely about reaching "true" assertions.
Patton (2002): The goal of ensuring evaluative validity should not be to meet technical standards but rather to determine whether appropriate methods and measures have been used for the particular evaluation purpose(s) and for the intended users of the evaluation findings.
Lincoln (2003): Validity is not simply a matter of determining which data collection efforts lead to better information, "…but rather, which kinds of evidence will best address certain questions, and, at a foundational level, which kinds of literary-rhetorical devices are being employed, and which kinds of symbolic-interpretive processes are being brought to bear in the mounting of a persuasive argument?" (italics in original)
Cases
Several illustrative cases highlight differences in evidentiary expectations.
These differences emerged both among various stakeholders and between stakeholders and evaluators.
Up the Ladder
Context: A state department of education
The state was interested in documenting accountability for its funding of teacher professional development.
Participants in the experience expressed interest in "telling their stories" and resisted the state department's data collection efforts.
Ready, Set…Ready, Set…
Context: A proprietor of online professional development courses
While the proprietor was interested in collecting evidence of the product's success, timing issues (evaluability) precluded the collection of meaningful data.
In the rush to advance evidence that the program "works," the organization began to use student data inappropriately to make claims about program efficacy.
Hurry, Hurry!
Context: A textbook publisher interested in examining curricular materials with an eye on textbook adoption
While product development pressed to "rush to market," the organization continued to stress the need to collect "rigorous evidence."
All That Glitters
Context: A state department and a university interested in the outcomes of a systemic school reform model
The state department was interested in "bottom line accountability": student achievement.
Participants were interested in having their stories heard and in having individual schools' successes and obstacles documented through case studies.
Evaluator Responsibilities
What are the responsibilities of the evaluator regarding evidence?
How does the evaluator navigate between competing demands for evidence?
At what point does the evaluator need to intervene with client(s) to ensure that the claims being made are adequately supported?
Geniuses, Bottom Liners & Chameleons: Complementary and Varying Roles in Education R&D
Helen S. Apthorp, PhD, McREL
Three Stories across the Phases of Education R&D

Phase            Character (who is in charge)              Story
Development      The Reluctant Genius (conceptualizer)     Juggling Multiple New Roles
Implementation   The Passionate Reformer (implementer)     Cycling Back into the Future
Production       The Bottom Liner (funder)                 Crossing into the Future
The Passionate Reformer: What happens when the implementer is in charge?
"We know it works." Evidence is not necessary.
The Reluctant Genius: What happens when the conceptualizer is in charge?
The intervention is often ill-defined. Moves ahead; can't wait for feedback.
The Bottom Liner: What happens when the funder is in charge?
Marketability is the priority. Being savvy reigns. Agreements and obligations become real.
Cycling Back and into the Future
Find the bridges between:
What clients want to know and what they ought to know
Study design, method, and audience
Juggling Multiple New Roles
How not to be fickle:
Reject the chameleon
Use professional authority
Crossing Boundaries into the Future
Serve the needs of a broad base of stakeholders to protect against bias
Anticipate informational needs
Preserve credibility while remaining flexible
Contact Information
Sheila A. Arens, McREL
Zoe Barley, McREL
LeAnn M. Gamache, McREL
Helen S. Apthorp, McREL