
1 DOES THE USE OF DATA ANALYSIS TEAMING FOR STUDENT ACHIEVEMENT AND LEVEL OF STUDENT WORK IMPROVE STUDENT PERFORMANCE IN READING?
Christina M. Marco-Fies
Indiana University of Pennsylvania
Dissertation Defense
March 11, 2013

2 Dissertation Defense
• Rationale for study
• Literature review
• School district data collection timeline
• Study design
• Research questions and variables
• Procedures
• Statistical analyses and results
• Conclusions
• Committee discussion

3 Rationale for Study
• A Nation at Risk (NCEE, 1983)
• PISA Results (Lemke et al., 2001)
• U.S. Dept. of Ed. Office of Special Education Programs (2005)
• NCLB (2001)
• ARRA/Race to the Top (U.S. Dept. of Ed., 2009)

4 Literature Review
• Response to Intervention (Batsche et al., 2006)
• Assessment
• Data Analysis Teaming
• Fidelity
• Reading Research (NRP, 2000)
• Classroom Walkthroughs (Teachscape, 2010)

5 School District Data Collection Timeline
Date | Activity
September 2006 | School District begins collecting DIBELS data
January 2008 | School District begins training for and initiation of data analysis teaming for DIBELS data
Fall 2009 | School District trains select staff with the Teachscape Classroom Walkthrough system
November 2009 | School District staff begin to collect Classroom Walkthrough data
January 2010 | School District data analysis teaming occurs for both DIBELS and Classroom Walkthrough data

6 Study Design
Collecting DIBELS: Sex, Pre-ORF, Post-ORF
Data Teaming for DIBELS: Sex, Pre-ORF, Post-ORF
Data Teaming for DIBELS and Walkthrough: Sex, Pre-ORF, Post-ORF

7 Research Questions and Variables
Question 1: Does collecting DIBELS data increase the percentage of students reaching benchmark in reading compared to a national sample of students? Does student performance differ depending on sex?

Latent Variable | Observed Variable | Instrument or Source | Validity/Reliability
Reading Data Teaming Strategy | No Initiation of DAT | School Records | Excellent
Pre- and Post-Reading Performance | ORF Score, Winter to Spring '07 | DIBELS ORF | Very Good

8 Research Questions and Variables
Question 2: Does using data analysis teaming to discuss DIBELS data improve student performance in reading beyond levels that were attained when data were collected and not analyzed? Does student performance differ depending on sex?

Latent Variable | Observed Variable | Instrument or Source | Validity/Reliability
Sex | Male/Female | School Records | Excellent
Reading Data Teaming Strategy | DIBELS DAT | School Records | Excellent
Pre- and Post-Reading Performance | ORF Score, Winter to Spring '07 & '09 | DIBELS ORF | Very Good

9 Research Questions and Variables
Question 3: Does analyzing DIBELS data and walkthrough data for data analysis teaming improve student reading performance beyond no data analysis teaming or data analysis teaming for DIBELS data only? Does student performance differ depending on sex?

Latent Variable | Observed Variable | Instrument or Source | Validity/Reliability
Sex | Male/Female | School Records | Excellent
Reading Data Teaming Strategy | DIBELS and Walkthrough DAT | School Records | Excellent
Pre- and Post-Reading Performance | ORF Score, Winter to Spring '07, '09, and '10 | DIBELS ORF | Very Good

10 Procedures
1. Archival data gathered from school district
2. Review data teaming logs
3. Analyze data
   • Winter '07 and Spring '07 DIBELS
   • Winter to Spring '07 and Winter to Spring '09 DIBELS
   • Winter to Spring '07, Winter to Spring '09, and Winter to Spring '10 DIBELS

11 Statistical Analysis and Results
• Sample
  - 174 elementary school students
  - 1st through 4th grade
  - Demographics
• Complications
  - Lack of data log availability
  - Lack of national norms
  - Lack of sample for all grades
  - Lack of demographic information

12 Statistical Analysis and Results
Question 1: Does collecting DIBELS data increase the percentage of students reaching benchmark in reading compared to a national sample of students? Does student performance differ depending on sex?
Hypothesis: Collecting DIBELS data will not increase the percentage of benchmark students.
Statistic: One-Sample t-Test

13 Statistical Analysis and Results
One-Sample t-Test for DIBELS Oral Reading Fluency Winter to Spring 2007 Improvement and DIBELS 2001-2002 Norms

Descriptive Statistics
Sample | Measure | n | Mean | SD
Study Sample | Improvement | 174 | 17.31 | 10.78
Study Sample | Winter 2007 ORF | 174 | 39.86 | 30.47
Study Sample | Spring 2007 ORF | 174 | 57.16 | 30.37
DIBELS Norms | Improvement | | 23.76 |
DIBELS Norms | Winter 2002 ORF | 37,410 | 36.89 | 33.14
DIBELS Norms | Spring 2002 ORF | 37,017 | 60.65 | 37.99

One-Sample t-Test
Effect | Mean Difference | t | df | Sig.
Improvement | -6.45 | -7.89 | 173 | <.001
Winter | 2.96 | 1.28 | 173 | >.05
Spring | -3.49 | -1.52 | 173 | >.05

Note. Mean numbers are expressed in words correct per minute (wcpm).
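
For readers who want the mechanics, here is a minimal sketch of this kind of one-sample t-test in Python with SciPy. The improvement scores below are simulated stand-ins for the 174 archival DIBELS records; only the norm gain of 23.76 wcpm and the summary statistics come from the table above.

```python
# One-sample t-test: does the sample's winter-to-spring ORF gain differ from
# the DIBELS national norm gain? Data are simulated stand-ins, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
improvement = rng.normal(loc=17.31, scale=10.78, size=174)  # simulated gains (wcpm)

NATIONAL_NORM_GAIN = 23.76  # DIBELS 2001-2002 norm improvement, wcpm

t_stat, p_value = stats.ttest_1samp(improvement, popmean=NATIONAL_NORM_GAIN)
print(f"mean difference = {improvement.mean() - NATIONAL_NORM_GAIN:.2f}")
print(f"t({len(improvement) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```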

14 Statistical Analysis and Results
• Hypothesis was not supported
• Students in the study did not show as much improvement as the national sample
• Possible reasons
  - Demographic differences
  - Instruction received

15 Statistical Analysis and Results
Question 2: Does using data analysis teaming to discuss DIBELS data improve student performance in reading beyond levels that were attained when data were collected and not analyzed? Does student performance differ depending on sex?
Hypothesis: Using DIBELS data for DAT will improve student performance in reading.
Statistic: ANOVA-RM

16 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Winter to Spring 2007 and 2009

Descriptive Statistics
Time Point | Group | n | Mean | SD
Winter 2007 | All | 174 | 39.9 | 30.5
Winter 2007 | Male | 88 | 39.1 | 28.4
Winter 2007 | Female | 86 | 40.7 | 32.6
Spring 2007 | All | 174 | 57.2 | 30.4
Spring 2007 | Male | 88 | 57.2 | 27.8
Spring 2007 | Female | 86 | 57.2 | 33.0
Winter 2009 | All | 174 | 101.3 | 33.6
Winter 2009 | Male | 88 | 102.0 | 31.4
Winter 2009 | Female | 86 | 100.6 | 35.9
Spring 2009 | All | 174 | 118.1 | 32.6
Spring 2009 | Male | 88 | 118.5 | 32.2
Spring 2009 | Female | 86 | 117.7 | 33.1

17 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Winter to Spring 2007 and 2009

Analysis of Variance – Repeated Measures
Effect | MS | F | df | p | Partial Eta Squared
Time | 234,096.8 | 1,369.7 | 3 | <.001 | .888
Sex | 3.5 | .0 | 1 | >.05 | .000
Time*Sex | 75.0 | .4 | 3 | >.05 | .003
Error | 170.9 | | 516 | |

Post Hoc Comparison of Means
        | W 2007 | Sp 2007 | W 2009 | Sp 2009
W 2007  | -      | 17.3*   | 61.5*  | 78.2*
Sp 2007 |        | -       | 44.2*  | 60.9*
W 2009  |        |         | -      | 16.8*

Note. Mean numbers are expressed in words correct per minute (wcpm).
* Significant at the .001 level.
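
To make the analysis concrete, the sketch below runs a repeated-measures ANOVA with statsmodels on simulated long-format data; the column names (student, sex, time, orf) are illustrative, not taken from the study. AnovaRM covers the within-subjects Time effect reported above; the between-subjects Sex factor and the Time*Sex interaction would need a mixed-design ANOVA (for example, pingouin.mixed_anova).

```python
# Repeated-measures ANOVA sketch on simulated ORF scores at four time points.
# Group means mirror the descriptive statistics above; everything else is made up.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
times = ["W2007", "Sp2007", "W2009", "Sp2009"]
means = {"W2007": 39.9, "Sp2007": 57.2, "W2009": 101.3, "Sp2009": 118.1}

rows = []
for student in range(174):
    sex = "M" if student < 88 else "F"
    for t in times:
        rows.append({"student": student, "sex": sex, "time": t,
                     "orf": rng.normal(means[t], 31.0)})
data = pd.DataFrame(rows)

# Within-subjects (Time) effect; prints F, df, and p for the repeated factor.
res = AnovaRM(data, depvar="orf", subject="student", within=["time"]).fit()
print(res)
```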

18 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Improvement Scores 2007 and 2009

Descriptive Statistics
Year | Group | n | Mean | SD
2006-2007 | All | 174 | 17.3 | 10.8
2006-2007 | Male | 88 | 18.1 | 11.1
2006-2007 | Female | 86 | 16.5 | 10.4
2008-2009 | All | 174 | 16.8 | 13.2
2008-2009 | Male | 88 | 16.5 | 12.1
2008-2009 | Female | 86 | 17.1 | 14.3

Analysis of Variance – Repeated Measures
Effect | MS | F | df | p | Partial Eta Squared
Year | 24.2 | .2 | 1 | >.05 | .001
Sex | 22.2 | .1 | 1 | >.05 | .001
Year*Sex | 109.2 | .8 | 1 | >.05 | .005
Error | 135.6 | | 172 | |

Note. Mean numbers are expressed in words correct per minute (wcpm).

19 Statistical Analysis and Results
DIBELS ORF Winter and Spring Percentage of Students at Benchmark Levels, 2007 and 2009

Group | AR Winter | AR Spring | SR Winter | SR Spring | LR Winter | LR Spring
First Grade (2007) | 3% | 4% | 25% | 28% | 72% | 68%
  Male | 3% | 3% | 22% | 26% | 75% | 70%
  Female | 3% | 5% | 28% | 29% | 69% | 66%
Third Grade (2009) | 14% | 10% | 26% | 29% | 60% | 61%
  Male | 13% | 10% | 27% | 28% | 60% | 61%
  Female | 16% | 10% | 24% | 29% | 59% | 60%

Friedman Test: Overall p < .05; Male p < .05; Female p = .001
Wilcoxon Test: Females p > .05

Note. AR = at risk; SR = some risk; LR = low risk.
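
The nonparametric tests named on this slide can be illustrated with SciPy. The sketch below uses simulated risk-level codes (0 = at risk, 1 = some risk, 2 = low risk) in place of the archival data, and pairs the Friedman test with one plausible Wilcoxon comparison of winter-to-spring change; the study's exact pairings are not spelled out on the slide.

```python
# Friedman and Wilcoxon signed-rank sketches on simulated DIBELS risk-level
# codes (0 = at risk, 1 = some risk, 2 = low risk); not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 174
w2007 = rng.choice([0, 1, 2], size=n, p=[0.03, 0.25, 0.72])
s2007 = rng.choice([0, 1, 2], size=n, p=[0.04, 0.28, 0.68])
w2009 = rng.choice([0, 1, 2], size=n, p=[0.14, 0.26, 0.60])
s2009 = rng.choice([0, 1, 2], size=n, p=[0.10, 0.29, 0.61])

# Friedman test: do risk-level distributions differ across the four points?
chi2, p_friedman = stats.friedmanchisquare(w2007, s2007, w2009, s2009)
print(f"Friedman chi2 = {chi2:.2f}, p = {p_friedman:.3f}")

# Wilcoxon signed-rank test comparing winter-to-spring change in the two years
change_2007 = s2007 - w2007
change_2009 = s2009 - w2009
w_stat, p_wilcoxon = stats.wilcoxon(change_2007, change_2009)
print(f"Wilcoxon W = {w_stat:.1f}, p = {p_wilcoxon:.3f}")
```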

20 Statistical Analysis and Results
• Hypothesis was not supported
  - Showed growth in reading over time
  - Did not show significant improvement after DAT for DIBELS began
  - Differences in benchmark levels
  - No differences in improvement in risk levels
• Possible reasons
  - DAT is not effective
  - Fidelity of strategies

21 Statistical Analysis and Results
Question 3: Does analyzing DIBELS data and walkthrough data for data analysis teaming improve student reading performance beyond no data analysis teaming or data analysis teaming for DIBELS data only? Does student performance differ depending on sex?
Hypothesis: DIBELS and walkthrough DAT will add to the improvement of student performance.
Statistic: ANOVA-RM

22 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Winter to Spring 2007 and 2010

Descriptive Statistics
Time Point | Group | n | Mean | SD
Winter 2007 | All | 174 | 39.9 | 30.5
Winter 2007 | Male | 88 | 39.1 | 28.4
Winter 2007 | Female | 86 | 40.7 | 32.6
Spring 2007 | All | 174 | 57.2 | 30.4
Spring 2007 | Male | 88 | 57.2 | 27.8
Spring 2007 | Female | 86 | 57.2 | 33.0
Winter 2010 | All | 174 | 108.4 | 26.2
Winter 2010 | Male | 88 | 108.6 | 23.9
Winter 2010 | Female | 86 | 108.1 | 28.6
Spring 2010 | All | 174 | 131.9 | 33.8
Spring 2010 | Male | 88 | 131.8 | 31.9
Spring 2010 | Female | 86 | 132.0 | 35.8

23 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Winter to Spring 2007 and 2010

Analysis of Variance – Repeated Measures
Effect | MS | F | df | p | Partial Eta Squared
Time | 322,291.2 | 1,378.4 | 3 | <.001 | .889
Sex | 19.5 | .0 | 1 | >.05 | .000
Time*Sex | 36.4 | .2 | 3 | >.05 | .001
Error | 233.8 | | 516 | |

Post Hoc Comparison of Means
        | W 2007 | Sp 2007 | W 2010 | Sp 2010
W 2007  | -      | 17.3*   | 68.5*  | 92.1*
Sp 2007 |        | -       | 51.2*  | 74.8*
W 2010  |        |         | -      | 23.6*

Note. Mean numbers are expressed in words correct per minute (wcpm).
* Significant at the .001 level.

24 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Improvement Scores 2007 and 2010

Descriptive Statistics
Year | Group | n | Mean | SD
2006-2007 | All | 174 | 17.3 | 10.8
2006-2007 | Male | 88 | 18.1 | 11.1
2006-2007 | Female | 86 | 16.5 | 10.4
2009-2010 | All | 174 | 23.6 | 18.2
2009-2010 | Male | 88 | 23.2 | 18.8
2009-2010 | Female | 86 | 23.9 | 17.7

Analysis of Variance – Repeated Measures
Effect | MS | F | df | p | Partial Eta Squared
Year | 3,403.3 | 15.6 | 1 | <.001 | .083
Sex | 17.6 | .1 | 1 | >.05 | .000
Year*Sex | 120.2 | .6 | 1 | >.05 | .003
Error | 217.6 | | 172 | |

Note. Mean numbers are expressed in words correct per minute (wcpm).
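
As a sanity check on how the columns in these tables relate, the short calculation below recomputes the F ratio and partial eta squared for the Year effect from the reported mean squares and degrees of freedom, using the standard formulas F = MS_effect / MS_error and partial eta squared = SS_effect / (SS_effect + SS_error).

```python
# Recompute F and partial eta squared for the Year effect (2007 vs. 2010
# improvement scores) from the mean squares and df reported on this slide.
ms_year, df_year = 3403.3, 1
ms_error, df_error = 217.6, 172

f_ratio = ms_year / ms_error                     # ~15.6, as reported
ss_year = ms_year * df_year
ss_error = ms_error * df_error
partial_eta_sq = ss_year / (ss_year + ss_error)  # ~.083, as reported

print(f"F(1, {df_error}) = {f_ratio:.1f}, partial eta squared = {partial_eta_sq:.3f}")
```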

25 Statistical Analysis and Results
DIBELS ORF Winter and Spring Percentage of Students at Benchmark Levels, 2007 and 2010

Group | AR Winter | AR Spring | SR Winter | SR Spring | LR Winter | LR Spring
First Grade (2007) | 3% | 4% | 25% | 28% | 72% | 68%
  Male | 3% | 3% | 22% | 26% | 75% | 70%
  Female | 3% | 5% | 28% | 29% | 69% | 66%
Fourth Grade (2010) | 12% | 12% | 27% | 28% | 61% | 60%
  Male | 12.5% | 13% | 25% | 28% | 62.5% | 59%
  Female | 12% | 12% | 29% | 27% | 59% | 62%

Friedman Test: Overall p < .05; Male p = .001; Female p < .05
Wilcoxon Test: Males p > .05

26 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Winter to Spring 2009 and 2010

Descriptive Statistics
Time Point | Group | n | Mean | SD
Winter 2009 | All | 174 | 101.3 | 33.6
Winter 2009 | Male | 88 | 102.0 | 31.4
Winter 2009 | Female | 86 | 100.6 | 35.9
Spring 2009 | All | 174 | 118.1 | 32.6
Spring 2009 | Male | 88 | 118.5 | 32.2
Spring 2009 | Female | 86 | 117.7 | 33.1
Winter 2010 | All | 174 | 108.4 | 26.2
Winter 2010 | Male | 88 | 108.6 | 23.9
Winter 2010 | Female | 86 | 108.1 | 28.6
Spring 2010 | All | 174 | 131.9 | 33.8
Spring 2010 | Male | 88 | 131.8 | 31.9
Spring 2010 | Female | 86 | 132.0 | 35.8

27 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Winter to Spring 2009 and 2010

Analysis of Variance – Repeated Measures
Effect | MS | F | df | p | Partial Eta Squared
Time | 30,543.3 | 203.4 | 3 | <.001 | .542
Sex | 67.4 | .0 | 1 | >.05 | .000
Time*Sex | 19.7 | .1 | 3 | >.05 | .001
Error | 150.1 | | 516 | |

Post Hoc Comparison of Means
        | W 2009 | Sp 2009 | W 2010 | Sp 2010
W 2009  | -      | 16.8*   | 7.0*   | 30.6*
Sp 2009 |        | -       | 9.7*   | 13.8*
W 2010  |        |         | -      | 23.6*

Note. Mean numbers are expressed in words correct per minute (wcpm).
* Significant at the .001 level.

28 Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Improvement Scores 2009 and 2010

Descriptive Statistics
Year | Group | n | Mean | SD
2008-2009 | All | 174 | 16.8 | 13.2
2008-2009 | Male | 88 | 16.5 | 12.1
2008-2009 | Female | 86 | 17.1 | 14.3
2009-2010 | All | 174 | 23.6 | 18.2
2009-2010 | Male | 88 | 23.2 | 18.8
2009-2010 | Female | 86 | 23.9 | 17.7

Analysis of Variance – Repeated Measures
Effect | MS | F | df | p | Partial Eta Squared
Year | 4,001.4 | 13.8 | 1 | <.001 | .074
Sex | 39.1 | .1 | 1 | >.05 | .001
Year*Sex | .3 | .0 | 1 | >.05 | .000
Error | 290.2 | | 172 | |

Note. Mean numbers are expressed in words correct per minute (wcpm).

29 Statistical Analysis and Results
DIBELS ORF Winter and Spring Percentage of Students at Benchmark Levels, 2009 and 2010

Group | AR Winter | AR Spring | SR Winter | SR Spring | LR Winter | LR Spring
Third Grade (2009) | 14% | 10% | 26% | 29% | 60% | 61%
  Male | 13% | 10% | 27% | 28% | 60% | 61%
  Female | 16% | 10% | 24% | 29% | 59% | 60%
Fourth Grade (2010) | 12% | 12% | 27% | 28% | 61% | 60%
  Male | 12.5% | 13% | 25% | 28% | 62.5% | 59%
  Female | 12% | 12% | 29% | 27% | 59% | 62%

Friedman Test: Overall p > .05; Male p > .05; Female p > .05
Wilcoxon Test: Overall p > .05; Males p > .05; Females p > .05

30 Statistical Analysis and Results
• Hypothesis was not supported
  - Showed growth in reading over time
  - Significant improvement after DAT for DIBELS and Walkthrough began
  - Differences in benchmark levels
  - No differences in improvement in risk levels
• Possible reasons
  - Walkthrough DAT is not effective
  - Fidelity of Walkthrough DAT

31 Statistical Analysis and Results
• Fidelity of Data Analysis Teaming for DIBELS
  - First Grade (2006-2007): no data teaming
  - Third Grade (2008-2009): 99% fidelity, 5 data logs
  - Fourth Grade (2009-2010): 100% fidelity, 2 data logs
• Fidelity of Data Analysis Teaming for Walkthroughs
  - Fourth Grade (2009-2010): no data logs found

32 Conclusions
• Limitations
  - Data not independent
  - Fidelity of DAT
  - History/treatment interaction
  - Convenience sample
  - Student differences
• Implications for Practice

33 Conclusions
Future Research Directions
• Fidelity of Process
• Assess Fidelity
• Component Effectiveness
• Strategies Selected
• Strategy Fidelity
• Time Implementing
• DAT for Other Areas
• Walkthroughs and Achievement
• Walkthrough Models
• Student Variables
• Replication Studies

34

35 Committee Discussion

36 References
Batsche, G., Elliott, J., Graden, J., Grimes, J., Kovaleski, J., Prasse, D., ... Tilly, W. D. (2006). Response to intervention: Policy considerations and implementation. Alexandria, VA: National Association of State Directors of Special Education.

Lemke, M., Calsyn, C., Lippman, L., Jocelyn, L., Kastberg, D., Liu, Y., ... Bairu, G. (2001). Highlights from the 2000 Program for International Student Assessment. Washington, DC: National Center for Education Statistics.

National Commission on Excellence in Education (NCEE). (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office.

National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific literature on reading and its implications for reading instruction. Bethesda, MD: National Institute of Child Health and Human Development.

37 References
No Child Left Behind Act of 2001, PL 107-110, 115 Stat. 1425, 20 U.S.C. §§ 6301 et seq.

Teachscape. (2010). Classroom walkthrough. Retrieved August 1, 2010, from http://www.teachscape.com/html/ts/nps/classroom_walkthrough.html

United States Department of Education. (2009). Race to the Top program executive summary. Retrieved November 7, 2010, from http://www2.ed.gov/programs/racetothetop/executive-summary.pdf

United States Department of Education Office of Special Education Programs. (2005). Reading rockets: Toolkit for school psychologists. Washington, DC: Greater Washington Educational Telecommunications Association, Inc.

