Unit 8

 Program improvement or appraisal
 Assessing the value of a program
 Measuring the efficacy of particular components of a program
 Meeting accountability requirements
 Data used for decisions about whether to maintain a program, advance it, introduce comparable programs elsewhere, allocate resources among rival programs, or accept or reject a program approach or hypothesis.

 Head Start or Title I – US Dept. of Education
 Math curriculum in one district
 Four Questions of Program Evaluation (Posavac & Carey, 1997)
◦ Needs: Is an agency or organization meeting the needs of the people it serves?
◦ Process: How is a program being implemented (is it going as planned)?
◦ Outcome: Has a program been effective in meeting its stated goals?
◦ Efficiency: Is a program cost-efficient relative to alternative programs?

 Evaluators use many of the same qualitative and quantitative methodologies used by researchers in other fields.
 The primary purpose of evaluation is to provide information for decision-making about particular programs, not to advance more wide-ranging knowledge or theory.
◦ Evaluation is more client-focused than traditional research, in that evaluators work closely with program staff to create and carry out an evaluation plan that attends to the particular needs of their program.

 Evaluation serves to aid in a program's development, execution, and improvement by examining its processes and/or results.
 Assessment measures an individual's or a group's performance by gauging their skill level on a variable of interest (e.g., reading comprehension, math, or social skills).

 An experiment is any study in which a treatment is introduced.
◦ A new method of teaching, a different behavioral intervention
 A non-experimental study does not introduce a treatment.
◦ Comparing opinions from natural groups

 Any study in which a treatment is introduced is an experiment.
 Control: researchers investigate the effect of various factors one at a time in an experiment.
 An experiment has at least one independent variable and at least one dependent variable.
 A true experiment involves random assignment of participants to treatment groups.

 An intervention or a treatment is implemented.
 True experiments have a control group.
◦ The two groups are treated identically except for the independent variable of interest.
 In true experiments, confounding variables are well controlled by the experimenter.
◦ Random assignment (see the sketch below)
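The following is a minimal illustrative sketch, not part of the original slides, of how random assignment to two groups might be carried out in Python; the participant IDs and the even 50/50 split are assumptions made for the example.

    import random

    # Hypothetical participant IDs; in practice these would come from recruitment records.
    participants = [f"P{i:02d}" for i in range(1, 21)]

    random.shuffle(participants)                 # randomize the order of participants
    midpoint = len(participants) // 2
    treatment_group = participants[:midpoint]    # receives the intervention
    control_group = participants[midpoint:]      # treated identically, minus the intervention

    print("Treatment group:", treatment_group)
    print("Control group:  ", control_group)

Because chance alone determines group membership, confounding participant characteristics are expected to be distributed roughly evenly across the two groups.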

 Experimental Group: the group receiving the treatment
 Control Group: the group not receiving the treatment
◦ Represents the expected results for the experimental group if no treatment were given
◦ Represents the population before treatment, or in the absence of treatment
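As a rough illustration of how the control group supplies the expected no-treatment results, the sketch below (assuming Python with SciPy installed and entirely made-up post-test scores) compares the two groups with an independent-samples t-test:

    from scipy import stats

    # Made-up post-test scores, for illustration only.
    treatment_scores = [78, 85, 90, 74, 88, 92, 81, 79]
    control_scores = [70, 72, 68, 75, 71, 69, 74, 73]

    # The control group's scores stand in for the results expected
    # had the experimental group received no treatment.
    t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

A small p-value would suggest that the difference between the groups is larger than chance variation alone would explain.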

 Concerns about the usefulness of results
◦ School board presidents and government and business leaders are hesitant to allow “poking around”.
 Access to participants
◦ “Wait list”
◦ Random assignment

 Confounds: potential alternative causes for a research finding
 Researchers must rule out these alternative explanations.
 Eight confound categories (“threats to internal validity”):
◦ history
◦ maturation
◦ testing
◦ instrumentation
◦ regression
◦ subject attrition (mortality)
◦ selection
◦ interactions with selection

 When there is no comparison group in the study, the following threats to internal validity must be considered:
◦ history, maturation, testing, instrumentation, regression, subject mortality, selection
 When a comparison group is added, the following threats to internal validity must be considered:
◦ selection, interactions with selection

 Because of contamination, expectancy effects, and novelty effects, researchers may have difficulty concluding whether a treatment was effective.

 Contamination: occurs when there is communication about the experiment between groups of participants.
 Three possible outcomes of contamination:
◦ Resentment: some participants’ performance may worsen because they resent being in a less desirable condition.
◦ Rivalry: participants in a less desirable condition may boost their performance so they don’t look bad.
◦ Diffusion of treatments: control participants learn about a treatment and apply it to themselves.

 Expectancy Effects: the researcher unintentionally influences the results of an experiment.
◦ Researchers can make systematic errors in their interpretation of participants’ performance based on their expectations.
◦ Researchers can make errors in recording data based on their expectations for participants’ performance.

 Novelty Effects: changes in people’s behavior simply because an innovation (e.g., a treatment) produces excitement, energy, and enthusiasm.
◦ Hawthorne effect: performance changes when people know “significant others” (e.g., researchers, company bosses) are interested in them or care about their living or work conditions.

 GOVERNMENT WARNING:
◦ (1) According to the Surgeon General, women should not drink alcoholic beverages during pregnancy because of the risk of birth defects.
◦ (2) Consumption of alcoholic beverages impairs your ability to drive a car or operate machinery, and may cause health problems.
 The U.S. National Institute on Alcohol Abuse and Alcoholism funded the Alcohol Research Group to conduct a series of cross-sectional surveys in the United States and Ontario, Canada (Greenfield et al., 1999).