How do you know your product “works”? And what does it mean for a product to “work”?
SEG Measurement (www.SEGMeasurement.com)

Purposes and Uses of Evidence
 Common uses of evaluation evidence:
   In-process product design and modification
   Gathering “proof” that the product works
   Understanding how teachers and students use the product
   Complying with grant or foundation funding requirements
   Supporting product adoption decisions during piloting
 Understand your purpose for collecting information at the outset.
 Design your evidence collection so it is consistent with those goals.

Sources of Evidence and Ways to Collect It
 Sources of evidence:
   Students
   Teachers
   Administrators
   Parents
   Experts or other third parties
 Ways to collect evidence:
   Pre/post testing (cognitive and affective)
   Anecdotes
   Surveys
   Interviews
   Observation
   Focus groups
   Product usage data: how much the product is used, and how it is used (see the sketch below)
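For the product usage data bullet, a minimal pandas sketch of how usage logs might be summarized; the log schema and column names are hypothetical, not a real product's data model.

```python
# Illustrative sketch: summarizing product usage data, one of the
# evidence sources listed above. The schema here is hypothetical.
import pandas as pd

usage = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3],
    "feature":    ["lesson", "quiz", "lesson", "lesson", "quiz"],
    "minutes":    [25, 10, 40, 15, 8],
})

# How much the product is used: total minutes per student.
print(usage.groupby("student_id")["minutes"].sum())

# How it is used: time spent in each part of the product.
print(usage.groupby("feature")["minutes"].sum())
```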

Are you looking for “proof” that use of your product is the “cause” of improved learning outcomes?
 “Proof” and “causality”: the scientific model (the “medical model”)
 If we want to know whether a drug works, we can’t simply ask a group of people whether they feel better.
 Random assignment to treatment and control groups, or matched-groups designs
 Pre- and post-testing
 Sound statistical models to adjust for initial differences and “interfering” (confounding) variables; a sketch of this kind of analysis follows below.
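A minimal sketch of the design these bullets describe: random assignment, pre/post testing, and a regression model (ANCOVA) that adjusts post-test scores for initial differences on the pre-test. All data and effect sizes are simulated for illustration; this is not SEG Measurement's actual analysis.

```python
# Simulated randomized pre/post study with an ANCOVA analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 200  # 100 students per group

# Randomly assign students to treatment (uses the product) or control.
treatment = rng.permutation(np.repeat([0, 1], n // 2))
pretest = rng.normal(70, 10, n)
# Simulate a 3-point product effect on the post-test (assumption).
posttest = 15 + 0.8 * pretest + 3 * treatment + rng.normal(0, 5, n)

df = pd.DataFrame({"treatment": treatment,
                   "pretest": pretest,
                   "posttest": posttest})

# ANCOVA: the treatment coefficient estimates the product's effect
# after controlling for where students started on the pre-test.
model = smf.ols("posttest ~ pretest + treatment", data=df).fit()
print(model.summary().tables[1])
```

Regressing the post-test on both the pre-test and group membership is what lets the design "eliminate" initial differences: the treatment coefficient compares students who started at the same level.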

Challenges
 Getting results quickly: the desire for rapid-cycle, iterative improvement
 Costs (one good adoption win pays the bill)
 Lack of research literacy (we offer pro bono assistance)

How to prove your product is effective
 Treatment and control groups
 Sufficient sample size
 Statistical power (a rough power calculation is sketched below)
 Generalizability
 Pre/post assessment of desired outcomes (external, reliable, and valid)
 Fidelity of implementation
 Qualitative and quantitative data
 Conducted by a third party
 Peer reviewed
 Multiple studies
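For the sample size and power bullets, a minimal sketch of a power calculation using statsmodels: how many students per group are needed to detect a given effect with 80% power? The effect size, alpha, and power target here are illustrative assumptions, not values from the presentation.

```python
# Rough sample-size calculation for a two-group comparison.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.3,          # assumed standardized effect (Cohen's d)
    alpha=0.05,               # significance level
    power=0.80,               # desired statistical power
    alternative="two-sided",
)
print(f"Students needed per group: {n_per_group:.0f}")  # ~176 under these assumptions
```

Smaller expected effects or noisier outcome measures drive the required sample size up quickly, which is why underpowered pilots often fail to "prove" anything either way.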