Effects of Training on the Accuracy of Direct Behavior Ratings
Mine Dincer Schlientz & T. Chris Riley-Tillman, Ph.D.
East Carolina University, Greenville, North Carolina

INTRODUCTION

Recent public laws such as the Individuals with Disabilities Education Improvement Act (IDEIA, 2004) and the No Child Left Behind Act (NCLB, 2002) aim to establish a climate of accountability for educators. Teachers and administrators are now mandated to collect and report data on all students prior to making educational decisions, so educators need empirically based and efficient data collection methods. To date, there has been considerable research on academic monitoring methods such as Curriculum-Based Measurement and high-stakes testing; unfortunately, social behavior monitoring techniques have not received the same attention in the research literature. The current study aims to gather empirical support for a behavior rating tool, the Direct Behavior Rating (DBR), that teachers can use to rate social or academic student behaviors in the classroom. DBR is a time-efficient observational system in which target behaviors are operationally defined and ratings are collected on a daily or weekly basis to be shared with parents and administrators. DBR is a less complicated method than Systematic Direct Observation (SDO) and may require less training. Some older research concluded that training increased the accuracy of behavior ratings (Madle, Neisworth, & Kurtz, 1980), whereas a more recent study found that short training sessions did not result in a significant difference (Chafouleas, McDougal, Riley-Tillman, Panahon, & Hilt, 2005). The purpose of this study was to investigate whether training raters in the use of DBRs increases the accuracy of their ratings. Training educators is a time-consuming, financially costly endeavor; prior to investing considerable time and funds in teacher training, we need to be fully informed about the effects of training and weigh its potential costs and benefits.

MATERIALS & METHODS

The study utilized a between-subjects design and was set in a controlled environment. Fifty-nine undergraduate students from a large Southeastern university were recruited as participants and received course credit for their participation. The training condition included 26 participants who received 30 minutes of training on DBR methods; practice video clips were used and immediate feedback was provided. The control condition included 33 participants who received only a 5-minute brief-familiarization session on DBR methods; mock practice video clips were used and no feedback was provided. Both groups of participants then viewed the same 12 video clips and rated the same two target behaviors using the DBR method.

RESULTS

Two independent samples t-tests assuming unequal variances were applied to the DBR data obtained from the trained (n = 26) and untrained control (n = 33) groups of observers for the two target behaviors: actively manipulating legos (an on-task behavior) and visually distracted (an off-task behavior). The null hypothesis that the population means of the trained and untrained groups are equal [H0: µ1 = µ2] was rejected for the actively manipulating legos (on-task) behavior, because the t-test yielded a significant t-value [t(54) = 1.67, p < .05]. The null hypothesis that the population means of the trained and untrained groups are equal [H0: µ1 = µ2] was also rejected for the visually distracted (off-task) behavior, because the t-test yielded a significant t-value [t(58) = 1.67, p < .05]. The trained group's mean for the actively manipulating legos (on-task) behavior (M trained = 16.06) was significantly different from the untrained group's mean (M untrained = 29.35). The group means for the visually distracted behavior were also significantly different (M trained = , M untrained = 25.63). Coding reliability was assessed by having an independent judge code approximately 10% of the protocols collected in each of the three sessions of the study; interrater reliability, as assessed by Pearson correlation, was .99.
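The group comparison and the reliability check described above can be reproduced in a few lines. The sketch below is illustrative only: it uses randomly generated placeholder ratings whose means and sample sizes merely echo those reported on the poster, and it is not the authors' analysis code.

```python
# Minimal sketch of the reported analyses: Welch's (unequal-variances) t-test
# comparing trained vs. untrained DBR ratings, and a Pearson correlation for
# interrater reliability. All arrays are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical DBR ratings (0-100 mm scale) for one target behavior.
trained = rng.normal(loc=16, scale=10, size=26)    # n = 26 formally trained
untrained = rng.normal(loc=29, scale=12, size=33)  # n = 33 briefly familiarized

# Independent-samples t-test assuming unequal variances (Welch's t-test).
t_stat, p_value = stats.ttest_ind(trained, untrained, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")

# Interrater reliability: Pearson correlation between the primary coder and an
# independent judge on roughly 10% of the protocols (placeholder values).
primary_coder = rng.normal(loc=20, scale=10, size=12)
second_coder = primary_coder + rng.normal(scale=1.0, size=12)
r, _ = stats.pearsonr(primary_coder, second_coder)
print(f"Interrater reliability r = {r:.2f}")
```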
The mean DBR ratings for the brief-familiarization and formal-training groups were compared to the SDO data, yielding the following effect sizes: for the visually distracted behavior, the effect size of formal training versus brief familiarization was very large (d = ); for the actively manipulating legos behavior, the effect size of formal training versus brief familiarization was large (d = .89).
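The poster does not spell out how these effect sizes were computed. One common approach, sketched below under the assumption that each observer's accuracy is summarized as the absolute discrepancy between their DBR rating and the SDO criterion value, is Cohen's d with a pooled standard deviation; the data and the discrepancy-score summary are hypothetical, not the study's.

```python
# Hedged sketch of a Cohen's d computation for the training vs. familiarization
# comparison. Assumption (not stated on the poster): accuracy is the absolute
# DBR-SDO discrepancy per observer, and d is the standardized mean difference.
import numpy as np

def cohens_d(group_a: np.ndarray, group_b: np.ndarray) -> float:
    """Standardized mean difference using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = (
        ((n_a - 1) * group_a.var(ddof=1) + (n_b - 1) * group_b.var(ddof=1))
        / (n_a + n_b - 2)
    )
    return (group_a.mean() - group_b.mean()) / np.sqrt(pooled_var)

# Hypothetical absolute DBR-SDO discrepancies (mm) for each observer.
rng = np.random.default_rng(1)
familiarized_error = rng.normal(loc=15, scale=6, size=33)  # brief familiarization
trained_error = rng.normal(loc=9, scale=6, size=26)        # 30-minute training
print(f"d = {cohens_d(familiarized_error, trained_error):.2f}")
```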
SUMMARY AND CONCLUSIONS

Results suggest that observers who are formally trained for 30 minutes in the use of DBRs rate more accurately than observers who are only familiarized for 5 minutes. The current study supports the premise that a 30-minute group training of teachers may be a worthwhile endeavor for increasing the accuracy of DBRs, and may therefore contribute to empirically supported, well-informed educational decision making for students. Future research should recruit public school teachers as observers in order to explore the effects of training on rating accuracy in natural classroom settings.

Figure 1. Mean DBR ratings for the visually distracted behavior completed by the 26 formally trained participants. Each participant's mean rating is a line on a continuous scale (0-100 mm) as a function of the 4 students in the video clips, labeled 1 through 4.

Figure 2. Mean DBR ratings for the visually distracted behavior completed by the 33 briefly familiarized participants. Each participant's mean rating is a line on a continuous scale (0-100 mm) as a function of the 4 students in the video clips, labeled 1 through 4.

Figure 3. Mean DBR ratings for the actively manipulating legos behavior completed by the 26 formally trained participants. Each participant's mean rating is a line on a continuous scale (0-100 mm) as a function of the 4 students in the video clips, labeled 1 through 4.

Figure 4. Mean DBR ratings for the actively manipulating legos behavior completed by the 33 briefly familiarized participants. Each participant's mean rating is a line on a continuous scale (0-100 mm) as a function of the 4 students in the video clips, labeled 1 through 4.

BIBLIOGRAPHY

Madle, R. A., Neisworth, J. T., & Kurtz, P. D. (1980). Biasing of hyperkinetic behavior ratings by diagnostic reports: Effects of observer training and assessment method. Journal of Learning Disabilities, 13.

Chafouleas, S. M., McDougal, J. L., Riley-Tillman, T. C., Panahon, C. J., & Hilt, A. M. (2005). What do daily behavior report cards (DBRCs) measure? An initial comparison of DBRCs with direct observation for off-task behavior. Psychology in the Schools, 42.

Project VIABLE: Direct Behavior Ratings (DBR). Principal Investigators: Chafouleas, S. M., Riley-Tillman, T. C., Christ, T. J., & Sugai, G. Preparation of this poster was supported by a grant from the Institute of Education Sciences (IES), U.S. Department of Education (R324B060014).