A Short Guide to Action Research, 4th Edition
Andrew P. Johnson, Ph.D.
Minnesota State University, Mankato
www.OPDT-Johnson.com
Chapter 9: Evaluating, Describing, and Proposing Research
EVALUATING RESEARCH
Research shows that …
Scientists have found that …
Studies indicate that …
A recent report confirms that …
These words alone are meaningless!
Buyer Beware
1. Saying so does not make it so.
2. Research is different from doing a study.
3. There is no such thing as the perfect research study.
4. You cannot make research say anything you want.
5. Using research is like sighting in a target rifle.
6. Show me the data.
National Research Council's (NRC) Principles of Scientific Research in Education
1. Pose significant questions that can be investigated empirically.
2. Link research to relevant theory.
3. Use methods that permit direct investigation of the question.
4. Provide a coherent and explicit chain of reasoning.
5. Replicate and generalize across studies.
6. Disclose research to encourage professional scrutiny and critique.

National Research Council. (2002). Scientific research in education. Committee on Scientific Principles for Education Research. R. J. Shavelson & L. Towne (Eds.). Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.
EVALUATING QUANTITATIVE RESEARCH
Independent and Dependent Variables
Variable = an item, characteristic, attribute, or factor that can be measured or observed.
Independent variable = the thing that the researcher isolates or tries to make independent of all other forces.
Dependent variable = what happens as a result of the independent variable; it is what is measured.
Confounding variables = independent variables that are unknown or have not been controlled and which affect the dependent variable.
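A minimal sketch (in Python, not from the book) of these definitions, using a made-up tutoring experiment: tutoring (yes/no) is the independent variable, the resulting reading score is the dependent variable, and each student's prior ability acts as a confounding variable if it is not controlled.

```python
import random

random.seed(1)

def simulate_student(tutored: bool) -> float:
    """Return one student's reading score (the dependent variable)."""
    prior_ability = random.gauss(50, 10)   # confounding variable (not controlled here)
    tutoring_effect = 5 if tutored else 0  # effect of the independent variable
    noise = random.gauss(0, 3)             # everything else
    return prior_ability + tutoring_effect + noise

treatment = [simulate_student(True) for _ in range(30)]   # tutored group
control = [simulate_student(False) for _ in range(30)]    # untutored group

def mean(scores):
    return sum(scores) / len(scores)

print(f"tutored mean score:   {mean(treatment):.1f}")
print(f"untutored mean score: {mean(control):.1f}")
```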
Common Confounding Variables
Dropout or experimental mortality = subjects drop out while the experiment is in progress.
Concurrent events = simultaneously occurring events outside the experiment affect the results.
Unequal samples = groups are not similar in terms of size, demographics, ability, age, or other variables.
Maturation = results are not due to any treatment or independent variable but to maturation.
Sampling or selection bias = the types of subjects or participants included in or excluded from a study affect the results.
Hawthorne effect = participants in a study behave differently because they know they are in a study.
Validity of the measure = the data collected do not measure what they are intended to measure.
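A minimal sketch (hypothetical, not from the book) of how one of these confounds, sampling or selection bias, can manufacture an apparent effect: the "treatment" below does nothing at all, yet because the treatment group is drawn from stronger students, the group means still differ.

```python
import random

random.seed(2)

def test_score(ability_mean: float) -> float:
    """Score depends only on prior ability plus noise; there is no treatment effect."""
    return random.gauss(ability_mean, 5)

# Selection bias: volunteers for the program tend to be stronger students.
treatment_group = [test_score(ability_mean=60) for _ in range(30)]
control_group = [test_score(ability_mean=50) for _ in range(30)]

def mean(scores):
    return sum(scores) / len(scores)

difference = mean(treatment_group) - mean(control_group)
print(f"apparent 'treatment effect': {difference:.1f} points, caused entirely by who was sampled")
```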
EVALUATING QUALITATIVE RESEARCH
1. Purposefulness. The purpose, questions, or area of study is clearly defined.
2. Assumptions and biases. Biases or assumptions are stated up front.
3. Rigor. A systematic plan is used to collect and analyze data.
4. Open-mindedness. The researcher did not start with the answer.
5. Completeness. All relevant aspects of the environment are described in a way that enables understanding.
6. Coherence. The description makes sense.
7. Persuasiveness. Sound logic is used to analyze and interpret the data.
8. Consensus. It is set in a sound theoretical context.
9. Usefulness. The findings enable you to better understand and more accurately interpret similar people, places, events, or experiences.
DESCRIBING RESEARCH
1. Question or purpose of the study
2. Subjects or participants
3. Treatment, criteria, or conditions
4. Measures or instruments
5. Results
6. Conclusions or findings
7. Limitations
ACTION RESEARCH PROPOSAL
1. Title or heading
2. Topics
3. Question(s)
4. Annotated bibliography
5. Methodology