Chapter 22: Evaluating a Research Report (Gay, Mills, and Airasian)

1 Chapter 22: Evaluating a Research Report
Educational Research (Gay, Mills, and Airasian)

2 Topics Discussed in this Chapter
- Gathering information
- General evaluation criteria
- Design-specific evaluation criteria:
  - Qualitative research in general
  - Observational research
  - Historical research
  - Survey (questionnaire and interview)
  - Correlational (relationship and prediction)
  - Causal-comparative
  - Experimental

3 Gathering Information
- Necessity of knowing what was done. Examples:
  - What was the problem?
  - Who were the subjects?
  - What research design was used?
  - What were the results and conclusions?
  - What are the implications of the research?
- Basic formats to collect information for quantitative and qualitative research

4 Gathering Information - Quantitative
- Introduction
  - Problem: provide a general statement of the problem that includes the variables and the relationships between them; state the importance of the study
  - Review of the literature: list the major issues identified in the review
  - Hypothesis: state the specific hypothesis or hypotheses being investigated
Objectives 1.1, 1.2, & 1.3

5 Gathering Information - Quantitative
- Method
  - Participants
    - Identify the population and sample
    - Describe the sampling and/or assignment procedures
    - Identify the size of the total sample and of each group, if applicable
    - Describe the general characteristics of the subjects
Objective 1.4

6 Gathering Information - Quantitative
- Method (continued)
  - Instruments
    - List the specific instruments used in the study
    - Describe the evidence of validity provided for each instrument
    - Describe the reliability evidence cited for each instrument (see the sketch below)
    - Describe the information needed to interpret the scores for each instrument
Objective 1.5
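
Reliability evidence for an instrument is often reported as an internal-consistency coefficient such as Cronbach's alpha. The sketch below is not from the textbook; it is a minimal Python/NumPy illustration of how alpha could be computed from a hypothetical respondents-by-items score matrix (the function name and data are invented for the example).

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal-consistency reliability for a respondents-by-items score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                               # number of items
    item_variances = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-item attitude scale answered by 6 respondents (1-5 Likert scores).
scores = np.array([
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 4, 5, 5, 4],
    [3, 3, 3, 4, 3],
    [1, 2, 1, 2, 2],
    [4, 4, 5, 4, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```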

7 Gathering Information - Quantitative
- Method (continued)
  - Design and procedures
    - Identify the specific type of research design
    - Identify any threats to internal validity
    - Identify any threats to external validity
- Results
  - Identify the specific analyses being used, for example:
    - A comparison between the mean scores for a control group and an experimental group
    - A correlation between students’ math attitudes and achievement
    - A survey of parental attitudes toward an extended school year
Objective 1.6

8 Gathering Information - Quantitative
- Method (continued)
- Results (continued)
  - Identify any descriptive statistics used and summarize the results
  - Identify the specific statistical test of significance, report the test statistic itself, and report its level of significance (see the sketch below); for example:
    - The experimental group means were significantly higher (t = 5.68, p = .023) than those for the control group
    - There was a significant (t = 14.91, p = .001) positive relationship between students’ attitudes and achievement
Objective 1.7
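
The t and p values above are the slide's example report sentences. As a minimal sketch, and assuming SciPy is available, the code below shows how a reviewer might obtain that kind of test statistic and significance level from raw scores; the score arrays and variable names are hypothetical, not data from the textbook.

```python
import numpy as np
from scipy import stats

# Hypothetical achievement scores for an experimental and a control group.
experimental = np.array([82, 88, 91, 79, 85, 90, 87])
control = np.array([74, 80, 78, 71, 77, 75, 79])

# Independent-samples t test comparing the two group means.
t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"group comparison: t = {t_stat:.2f}, p = {p_value:.3f}")

# Hypothetical attitude scores for the same seven experimental-group students.
attitude = np.array([3.1, 4.0, 4.5, 2.8, 3.6, 4.2, 3.9])

# Pearson correlation between attitude and achievement, with its significance test.
r, p_corr = stats.pearsonr(attitude, experimental)
print(f"attitude-achievement relationship: r = {r:.2f}, p = {p_corr:.3f}")
```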

9 Gathering Information - Quantitative
- Discussion
  - Identify the specific conclusions of the researchers
  - Discuss the implications described by the researchers
Objective 1.8

10 Gathering Information – Qualitative
- Introduction
  - Research topic
    - Provide a statement of the general issue, topic, or question being investigated
    - Describe any reformulation of the topic on the basis of the ongoing interactive nature of the collection, analysis, and synthesis of data
    - Discuss the importance of the topic
Objective 1.1

11 Gathering Information – Qualitative
- Introduction (continued)
  - Review of the literature
    - Describe the nature of the review of the literature
    - List the major issues identified in the review of the literature
Objective 1.2

12 Gathering Information – Qualitative
- Method
  - Site and participant selection
    - Describe the strategies used to gain entry to the site
    - Describe the site
    - Identify the participant(s) and list the sampling strategies used to select them
    - Describe the characteristics of the participant(s)
Objective 1.4

13 Gathering Information – Qualitative
- Method (continued)
  - Data collection and analysis
    - Describe the researcher’s role in the study
    - Report the data collection strategies used
    - Identify any instruments or protocols used by the researchers
    - Identify any threats to the quality of the data (i.e., observer bias and observer effect)
    - Describe the strategies used to enhance validity and reduce bias in data collection
    - Describe the strategies used to classify and interpret data
Objectives 1.5 & 1.7

14 Gathering Information – Qualitative
- Method (continued)
  - Research approach and procedures
    - Identify the research approach
    - Briefly describe the procedures used
    - Identify any ethical issues related to the study
- Results
  - Report the findings
  - Describe the researcher’s interpretation of the findings
Objectives 1.6 & 1.7

15 Gathering Information – Qualitative
- Discussion
  - Report the researcher’s conclusions
  - State the relationship between the conclusions and the initial problem
Objective 1.8

16 Focus of General Evaluation Criteria
See the evaluation criteria in the text and on the web site.
- Introduction
  - Problem
  - Review of the related literature
  - Hypotheses
- Methods
  - Participants
  - Instruments
  - Research design and procedures
Objectives 1.1 – 1.8

17 Focus of General Evaluation Criteria
- Results
- Discussion
- Abstract or summary
Objectives 1.1 – 1.8

18 Type-Specific Evaluation Criteria
- Descriptive research
  - Questionnaire studies
    - Are questionnaire validation procedures described?
    - Was the questionnaire pretested? Are pilot study procedures and results described?
    - Are directions to questionnaire respondents clear?
    - Does each item in the questionnaire relate to one of the objectives of the study?
    - Does each questionnaire item deal with a single concept?
    - When necessary, is a point of reference given for questionnaire scales?
Objective 2.1

19 Type-Specific Evaluation Criteria
- Descriptive research (continued)
  - Questionnaire studies (continued)
    - Are leading questions avoided in the questionnaire?
    - Are there sufficient alternatives for each questionnaire item?
    - Does the cover letter explain the purpose and importance of the study and give the potential respondent a good reason to co-operate?
    - If appropriate, is confidentiality or anonymity assured in the cover letter?
Objective 2.1

20 Type-Specific Evaluation Criteria
- Descriptive research (continued)
  - Questionnaire studies (continued)
    - What is the percentage of returns, and how does this affect the study results?
    - Are follow-up activities to increase returns described?
    - If the response rate was low, was any attempt made to determine any major differences between respondents and non-respondents? (see the sketch below)
    - Are data analyzed in groups or clusters rather than as a series of many single-variable analyses?
Objective 2.1
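
As a minimal sketch of the return-rate checks above (not a procedure from the textbook), the code below computes a return percentage and uses a chi-square test to compare hypothetical respondent and non-respondent counts on one background characteristic; every number here is invented for illustration.

```python
from scipy.stats import chi2_contingency

# Hypothetical questionnaire mailing: 400 sent, 248 returned.
questionnaires_sent = 400
questionnaires_returned = 248
print(f"return rate: {questionnaires_returned / questionnaires_sent:.0%}")

# Hypothetical counts of respondents and non-respondents by school level,
# used to check whether non-response is concentrated in one subgroup.
#            elementary  secondary
table = [[140, 108],    # respondents
         [ 60,  92]]    # non-respondents
chi2, p, dof, expected = chi2_contingency(table)
print(f"respondent vs. non-respondent comparison: chi-square = {chi2:.2f}, p = {p:.3f}")
```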

21 Type-Specific Evaluation Criteria
- Correlational research
  - Relationships
    - Were variables carefully selected?
    - Is the rationale for variable selection described?
    - Are conclusions and recommendations based on values of correlation coefficients corrected for attenuation or restriction in range? (see the sketch below)
    - Do the conclusions avoid suggesting causal relationships between variables?
Objective 2.2
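
The correction for attenuation mentioned above divides the observed correlation by the square root of the product of the two measures' reliabilities (correction for restriction in range uses a separate formula not shown here). The sketch below is a minimal illustration with hypothetical values, not code from the textbook.

```python
import math

def correct_for_attenuation(r_xy: float, reliability_x: float, reliability_y: float) -> float:
    """Classical correction for attenuation: estimated correlation between true
    scores, given the observed correlation and each measure's reliability."""
    return r_xy / math.sqrt(reliability_x * reliability_y)

# Hypothetical values: observed r = .45, score reliabilities .80 and .70.
print(f"corrected r = {correct_for_attenuation(0.45, 0.80, 0.70):.2f}")  # about .60
```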

22 Type-Specific Evaluation Criteria
- Correlational research
  - Prediction
    - Is a rationale given for the selection of predictor variables?
    - Is the criterion variable well defined?
    - Was the resulting prediction equation validated with at least one other group? (see the sketch below)
Objective 2.2
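
Validating a prediction equation with another group means deriving the equation on one sample and then checking how well it predicts the criterion in an independent sample. The NumPy sketch below illustrates the idea with hypothetical aptitude and GPA scores; it is one straightforward implementation, not the textbook's procedure.

```python
import numpy as np

# Hypothetical predictor (aptitude score) and criterion (first-year GPA)
# for the group used to derive the prediction equation.
x_derivation = np.array([45, 52, 58, 60, 67, 71, 75, 80])
y_derivation = np.array([2.1, 2.4, 2.6, 2.9, 3.0, 3.3, 3.4, 3.7])

# Derive the least-squares prediction equation y = a + b*x.
b, a = np.polyfit(x_derivation, y_derivation, deg=1)
print(f"prediction equation: GPA = {a:.2f} + {b:.3f} * aptitude")

# Apply the same equation to a second, independent group and compare
# predicted with actual criterion scores (cross-validation).
x_validation = np.array([48, 55, 63, 70, 78])
y_validation = np.array([2.2, 2.5, 2.8, 3.1, 3.5])
y_predicted = a + b * x_validation
r_cross_validation = np.corrcoef(y_predicted, y_validation)[0, 1]
print(f"cross-validation r = {r_cross_validation:.2f}")
```

A noticeable drop from the derivation-group correlation to the cross-validation correlation would suggest the equation capitalizes on chance characteristics of the first sample.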

23 Type-Specific Evaluation Criteria
- Causal-comparative research
  - Are the characteristics or experiences that differentiate the groups clearly defined or described?
  - Are critical extraneous variables identified?
  - Were any control procedures applied to equate the groups on extraneous variables?
  - Are causal relationships that were identified discussed with due caution?
  - Are plausible alternative hypotheses discussed?
Objective 2.3

24 Type-Specific Evaluation Criteria
- Experimental research
  - Was an appropriate experimental design selected?
  - Is a rationale given for the design selected?
  - Are sources of invalidity associated with the design identified and discussed?
  - Is the method of group formation described?
  - Was the experimental group formed in the same way as the control group?
  - Were groups randomly formed and the use of existing groups avoided? (see the sketch below)
Objective 2.4
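
Random formation of groups, as asked about above, can be as simple as shuffling the full participant list and splitting it in half. The sketch below is a minimal illustration with hypothetical participant IDs.

```python
import random

# Twenty hypothetical participant IDs.
participants = [f"S{i:02d}" for i in range(1, 21)]

random.seed(42)               # fixed seed so the assignment can be reproduced
random.shuffle(participants)  # randomize the order before splitting

half = len(participants) // 2
experimental_group = participants[:half]
control_group = participants[half:]
print("experimental:", experimental_group)
print("control:     ", control_group)
```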

25 Type-Specific Evaluation Criteria
- Experimental research (continued)
  - Were treatments randomly assigned to groups?
  - Were critical extraneous variables identified?
  - Were any control procedures applied to equate groups on extraneous variables?
  - Were possible reactive arrangements controlled?
  - Were tables clear and pertinent to the research results?
  - Were the results generalized to the appropriate group?
Objective 2.4

26 Type-Specific Evaluation Criteria
- Single-subject research
  - Are the data time constrained?
  - Is a baseline established prior to moving into the intervention phase?
  - Is the length of the treatment sufficient to represent the behavior within the phase?
  - Is the design appropriate to the question being asked?
Objective 2.5

27 Type-Specific Evaluation Criteria
- Single-subject research (continued)
  - If a multiple-baseline design is used, are conditions met to move across baselines?
  - If a withdrawal design is used, are limitations to this design addressed?
  - Does the researcher manipulate only one variable at a time?
  - Is the study replicable?
Objective 2.5

28 Type-Specific Evaluation Criteria
- Interview studies
  - Were the interview procedures pretested?
  - Are pilot study procedures and results described?
  - Does each item in the interview guide relate to a specific objective of the study?
  - When necessary, is a point of reference given in the guide for interview items?
  - Are leading questions avoided in the interview guide?
  - Are the language and complexity of the questions appropriate for the participants?
Objectives 2.6 – 2.9

29 Type-Specific Evaluation Criteria
- Interview studies (continued)
  - Does the interview guide indicate the type and amount of prompting and probing that was permitted?
  - Are the qualifications and special training of the interviewers described?
  - Is the method used to record responses described?
  - Did the researcher use the most reliable, unbiased method of recording responses?
  - Did the researcher specify how the responses to semi-structured and unstructured items were quantified?
Objectives 2.6 – 2.9

30 Type-Specific Evaluation Criteria
- Narrative research
  - Did the researcher provide a rationale for the use of narrative research?
  - Is there a rationale for the selection of individuals to study the chosen phenomenon?
  - Did the researcher describe data collection methods, with particular attention to interviewing?
  - Did the researcher describe appropriate strategies for analysis and interpretation?
Objective 2.6

31 Type-Specific Evaluation Criteria
- Ethnographic research
  - Did the written account (i.e., the ethnography) capture the social, cultural, and economic themes that emerged from the study?
  - Did the researcher spend a “full cycle” in the field studying the phenomenon?
- Mixed methods research
  - Does the study use at least one quantitative and one qualitative research method?
  - Does the study include a rationale for using a mixed methods research design?
Objectives 2.7 & 2.8

32 Type-Specific Evaluation Criteria
- Mixed methods research (continued)
  - Does the study include a classification of the type of mixed methods research design?
  - Was the study feasible given the amount of data to be collected and concomitant issues of resources, time, and expertise?
  - Does the study include both quantitative and qualitative research questions?
  - Does the study clearly identify qualitative and quantitative data collection techniques?
  - Does the study use appropriate data analysis techniques for the type of mixed methods design?
Objective 2.8

33 Type-Specific Evaluation Criteria
- Action research
  - Did the teacher’s area of focus involve teaching and learning and focus on the teacher’s own practice?
  - Did the teacher state questions that were answerable given his or her expertise, time, and resources?
  - Was the area of focus within the teacher’s locus of control?
  - Was the area of focus something about which the teacher was passionate?
Objective 2.9

34 Type-Specific Evaluation Criteria
- Action research (continued)
  - Was the area of focus something the researcher wanted to change or improve?
  - Did the teacher provide an action plan detailing the impact of the research findings on practice?
Objective 2.9

35 Validity and Reliability
- Threats to internal validity in qualitative studies
  - Did the researcher effectively deal with problems of history and maturation by documenting historical changes over time?
  - Did the researcher effectively deal with problems of mortality by using a large enough sample?
  - Was the researcher in the field long enough to effectively minimize observer effects?
  - Did the researcher take the time to become familiar and comfortable with participants?
Objective 2.10

36 Validity and Reliability
- Threats to internal validity in qualitative studies (continued)
  - Were the interview questions pretested?
  - Were efforts made to ensure intra-observer agreement by training interview teams in coding procedures? (see the sketch below)
  - Were efforts made to cross-check results by conducting interviews with multiple groups?
  - Did the researcher interview key informants to verify field observations?
  - Were participants demographically screened to ensure that they were representative of the larger population?
Objective 2.10
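
Coder agreement of the kind asked about above is often documented with Cohen's kappa, a chance-corrected agreement statistic. The sketch below uses scikit-learn's cohen_kappa_score on hypothetical codes from two coders; it illustrates the statistic, not the textbook's specific training procedure. Values near 1 indicate strong agreement beyond chance.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned to the same ten interview excerpts by two
# trained coders (e.g., positive / negative / neutral attitude codes).
coder_a = ["pos", "neg", "neu", "pos", "pos", "neg", "neu", "pos", "neg", "neu"]
coder_b = ["pos", "neg", "pos", "pos", "pos", "neg", "neu", "neu", "neg", "neu"]

kappa = cohen_kappa_score(coder_a, coder_b)  # chance-corrected agreement
print(f"Cohen's kappa = {kappa:.2f}")
```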

37 Validity and Reliability
- Threats to internal validity in qualitative studies (continued)
  - Was the data collected using different media to facilitate cross-validation?
  - Were participants allowed to evaluate the researcher’s results before publication?
  - Is sufficient data presented to support findings and conclusions?
  - Were dependent and independent variables repeatedly tested to validate results?
Objective 2.10

38 Validity and Reliability
- Threats to external validity in qualitative studies
  - Were construct effects addressed adequately?
  - Were both new and adapted instruments pretested to ensure they were appropriate for the study?
  - Did the researcher fully describe participants’ relevant characteristics?
  - Does the report address researcher interaction effects by documenting the researcher’s activities?
  - Were all observations and interviews conducted in a variety of fully described settings with multiple trained observers?
Objective 2.10

39 Validity and Reliability
- Threats to reliability in qualitative studies
  - Is the researcher’s relationship with the group and setting fully described?
  - Is all field documentation comprehensive, fully cross-referenced and annotated, and rigorously detailed?
  - Were observations and interviews documented using multiple means?
  - Is the interviewer’s training documented?
Objective 2.10

40 Validity and Reliability
- Threats to reliability in qualitative studies (continued)
  - Are the construction, planning, and testing of all instruments documented?
  - Are key informants fully described, including information on the groups they represent and their community status?
  - Are sampling techniques fully documented as being sufficient for the study?
Objective 2.10

