Three-Year Effects of a Benchmark Assessment System on Student Achievement. Presented at the CCSSO National Conference on Student Assessment, June 29, 2012.

1 Three-Year Effects of a Benchmark Assessment System on Student Achievement. Presented at the CCSSO National Conference on Student Assessment, June 29, 2012. Shana Shaw, Ph.D.; Jeffrey C. Wayman, Ph.D.; Cindy Bochna, Ph.D.; Joe O'Reilly, Ph.D.

2 Today's Talk. Part I: District Context (Cindy). Part II: Research Study (Shana).

3 District Context

4 Mesa Public Schools (MPS), Mesa, Arizona. Suburban with an inner-city core, serving two Native American communities. 65,000 students; 65% free & reduced lunch rate; 50% Anglo, 40% Latino, 5% American Indian. Generally slightly above state averages on academic indicators.

5 Acuity Implementation in MPS. Goal 1: Implement an online formative assessment system, with an initial focus on giving predictive tests and using results to predict state assessment performance. Goal 2: Help teachers incorporate Acuity in ongoing instruction and use it as a tool for PLCs: creating custom formative tests, using instructional resources, Acuity Unwired.

6 Acuity Timeline in MPS

7 MPS Acuity Training Model

8 Acuity Training Characteristics. The district hired an experienced Acuity trainer to help with implementation. 1. Training by school request: principals set up 1-, 2-, and 4-hour sessions for their staff; if possible, training is scheduled during the school day with sub coverage. 2. Training by teacher request: 4-hour workshops are held through professional development.

9 Acuity Training Characteristics. Initial training: navigating the system, accessing reports. Subsequent trainings: using reports to understand student achievement; assigning instructional resources based upon results from predictive tests. Advanced uses: teachers learn to design custom tests for classroom use as well as triangulate data with other district data sources.

10 Acuity Training Characteristics. Communication: a continually updated website with relevant information and a news feed.

11 Teacher Attitudes Toward Acuity

12 Survey Method. Online surveys (2009–2012): all teachers were surveyed and asked to complete the Survey of Educator Data Use online (Wayman et al., 2009). Incentives were provided for school-level response rates (e.g., one case of paper for 50%). Teachers were also asked to complete an electronic survey via the Acuity newsletter (2011); that survey asked about Acuity likes, dislikes, and additional training desires, as well as Acuity items from the Survey of Educator Data Use.
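
The school-level incentive described above comes down to a simple response-rate aggregation. Here is a minimal sketch in Python/pandas, assuming a hypothetical survey roster export with `school`, `teacher_id`, and `responded` columns (these names are illustrative, not the district's actual file layout):

```python
import pandas as pd

# Hypothetical roster: one row per teacher asked to complete the
# Survey of Educator Data Use, with a 0/1 flag for completion.
roster = pd.read_csv("survey_roster.csv")  # columns: school, teacher_id, responded

# School-level response rate = completed surveys / teachers surveyed.
rates = roster.groupby("school")["responded"].mean()

# Flag schools that hit the 50% threshold tied to the paper incentive.
earned_incentive = rates[rates >= 0.50].sort_values(ascending=False)
print(earned_incentive)
```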

13 Teachers Are Comfortable Using Acuity
Survey Item                          Agreement
Acuity is easy to use.                  67%
I know how to use Acuity.               89%
I don't have to wait for data.          66%
The data is accurate.                   77%
Acuity is dependable.                   71%
The data is always up to date.          83%
Source: 2012 Data Use Survey

14 Teachers Felt Acuity Helped Their Teaching
Data Use Survey Item: Acuity makes my work easier.
              2009   2010   2011   2012
Agree          59%    58%    70%    69%
Disagree       41%    42%    30%    31%

15 Teachers Felt Acuity Helped Improve Instruction
Data Use Survey Item: Acuity helps me improve my instruction.
              2009   2010   2011   2012
Agree          68%    75%    84%    79%
Disagree       32%    25%    16%    22%

16 Attitudes Toward Acuity by Teaching Experience Category
                        0-5 Yrs   6-10 Yrs   11-15 Yrs   16+ Yrs
Easy to Use               74%       69%        66%         --
Improves Instruction      80%       76%        78%         79%
Makes Work Easier         78%       72%        64%         67%
Source: 2012 Data Use Survey

17 District Context Wrap-Up. We have a system that teachers feel they know how to use, that is accurate and timely, that helps them in their work, and that improves instruction. How well did that transfer into academic achievement?

18 Research Study

19 National Context: Benchmark Assessment Research. Studies generally show small, positive effects on achievement, but results are inconsistent and vary by content or student sub-groups. Higher achievement gains occur when initiatives are sustained and when data are used to select effective interventions. Sources: Carlson et al., 2011; Henderson et al., 2007; May & Robinson, 2007; Quint et al., 2008; Slavin et al., 2011.

20 Current Study. Study context: Year 3 of Acuity implementation (2011), with small-group training at schools and increased district support. Variables: student state test scores (AIMS) linked to teachers' instructional use of Acuity (measured by use logs), plus characteristics of students, teachers, and schools.
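
Turning raw Acuity use logs into the teacher-level predictors described above involves rolling the logs up to one record per teacher per year and joining them to the student-teacher link file. The following pandas sketch uses assumed file and column names (`acuity_use_log.csv`, `student_teacher_links.csv`, and the columns shown are illustrative placeholders, not the district's actual extracts):

```python
import pandas as pd

# Raw use logs: one row per logged Acuity action, with teacher id,
# timestamp, and the function used (report view, custom test, resource, ...).
logs = pd.read_csv("acuity_use_log.csv", parse_dates=["timestamp"])
logs["school_year"] = logs["timestamp"].dt.year  # simplified; a real coding would follow the school calendar

# Instructional use per teacher per year: total actions and distinct weeks with any use.
use = (logs
       .assign(week=logs["timestamp"].dt.isocalendar().week)
       .groupby(["teacher_id", "school_year"])
       .agg(actions=("function", "size"), weeks_used=("week", "nunique"))
       .reset_index())

# Link 2011 and 2010 teacher use to each student, alongside AIMS scores
# and student/teacher/school characteristics in the link file.
links = pd.read_csv("student_teacher_links.csv")  # student_id, teacher_id_2010, teacher_id_2011, aims_2009, aims_2011, ...
analytic = (links
            .merge(use[use.school_year == 2011], left_on="teacher_id_2011",
                   right_on="teacher_id", how="left")
            .merge(use[use.school_year == 2010], left_on="teacher_id_2010",
                   right_on="teacher_id", how="left", suffixes=("_2011", "_2010")))
```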

21 Current Study. Purposes: explore what Acuity use looks like after three years; assess the cumulative impact of Acuity use on achievement.

22 Analytic Method. Cross-classified HLMs: explore associations between Acuity use (2010 and 2011) and 2011 student achievement (grades 4-8), disaggregated by content (reading/math) and school level (elementary/JHS). Three HLM outcomes of interest: (1) standardized regression coefficients, (2) statistical significance, (3) relative impact of Acuity use on achievement.

23 Cross-Classified HLM Example

Student (Level 1):
$$
\text{2011 AIMS Reading}_{i(j_1 j_2)k} = \pi_{0(j_1 j_2)k}
+ \pi_{1(j_1 j_2)k}(\text{gender})_{i(j_1 j_2)k}
+ \pi_{2(j_1 j_2)k}(\text{free lunch status})_{i(j_1 j_2)k}
+ \pi_{3(j_1 j_2)k}(\text{ethnicity})_{i(j_1 j_2)k}
+ \pi_{4(j_1 j_2)k}(\text{2009 AIMS Reading})_{i(j_1 j_2)k}
+ e_{i(j_1 j_2)k}
$$

Teachers (Level 2):
$$
\pi_{0(j_1 j_2)k} = \beta_{00k}
+ \beta_{01k}(\text{2011 teacher experience})_{(j_1 j_2)k}
+ \beta_{02k}(\text{2010 teacher experience})_{(j_1 j_2)k}
+ \beta_{03k}(\text{2011 teacher Acuity use})_{(j_1 j_2)k}
+ \beta_{04k}(\text{2010 teacher Acuity use})_{(j_1 j_2)k}
+ u_{j_1 k} + u_{j_2 k}
$$

Source: Raudenbush & Bryk, 2002
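
For readers who want to approximate this kind of specification outside dedicated HLM software, here is a minimal sketch in Python with statsmodels, treating the 2011 and 2010 teacher classifications as crossed random intercepts (variance components) within schools. All column names are illustrative assumptions, and the fit is an approximation of, not a substitute for, the cross-classified model above:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_file.csv")  # hypothetical analytic file of linked student, teacher, and school data

# Random intercept for schools (the grouping factor) plus variance components
# for the 2011 and 2010 teacher classifications, assuming each teacher
# belongs to a single school.
vc = {"teacher_2011": "0 + C(teacher_id_2011)",
      "teacher_2010": "0 + C(teacher_id_2010)"}

model = smf.mixedlm(
    "aims_reading_2011 ~ gender + free_lunch + C(ethnicity) + aims_reading_2009"
    " + experience_2011 + experience_2010 + acuity_use_2011 + acuity_use_2010",
    data=df, groups="school_id", vc_formula=vc)
result = model.fit()
print(result.summary())  # fixed effects for Acuity use plus the teacher and school variance components
```

Standardizing the continuous predictors and the outcome (z-scoring) before fitting would put the coefficients on the same standardized scale used in the results that follow.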

24 Research Study Results

25 What Does Acuity Use Look Like After 3 Years? After large increases from 2009 to 2010, Acuity use plateaued in 2011. Teachers averaged about 10 weeks and 150 instructional uses of Acuity, and the same Acuity functions were used both years. HLM results (2011): teaching experience and school level were associated with teachers' Acuity use.

26 What Did Teachers' Instructional Use of Acuity Look Like in 2011? Acuity use spiked after predictive assessments were given in August (Form A), October (Form B), and January (Form C).

27 Teachers' Acuity Use & Elementary Achievement. Current teachers' Acuity use (2011): significantly related to reading (β = .05, p < 0.01); marginally related to math (β = .03, p = 0.06). Previous teachers' Acuity use (2010): not significantly related to achievement. Notes: (1) student background factors were significant; (2) teaching experience was not significant.

28 Teachers' Acuity Use & Junior High Achievement. Neither current (2011) nor previous (2010) teachers' Acuity use was significantly associated with junior high achievement in 2011. Notes: (1) student background factors were significant; (2) teaching experience was not significant.

29 Impact of Teachers' Acuity Use Relative to Other Factors. Magnitude of Acuity use impact: an extra 6 weeks of Acuity use (2011) was associated with a 1- to 2-point increase in elementary reading and a 1-point increase in elementary math. Acuity use vs. student factors: Acuity use (2011) had a weaker relationship with elementary achievement than student factors (e.g., free/reduced lunch [FERPL] status). Acuity use vs. teacher experience: Acuity use had stronger associations than experience.
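
As a rough illustration of how a standardized coefficient around .05 can translate into the 1- to 2-point change quoted above, the conversion is delta_y = beta * (delta_x / SD_x) * SD_y. The standard deviations below are illustrative placeholders only; they are not reported in the presentation:

```python
# Back-of-the-envelope conversion of a standardized coefficient to AIMS points:
#   delta_y = beta_std * (delta_x / sd_x) * sd_y
beta_std = 0.05      # reported standardized coefficient, elementary reading (2011)
delta_weeks = 6      # the "extra 6 weeks" of Acuity use on the slide
sd_weeks = 10.0      # PLACEHOLDER: assumed SD of weeks of Acuity use, not a reported value
sd_aims = 40.0       # PLACEHOLDER: assumed SD of AIMS reading scale scores, not a reported value

delta_points = beta_std * (delta_weeks / sd_weeks) * sd_aims
print(f"Implied change: about {delta_points:.1f} scale-score points")  # ~1.2 points with these placeholders
```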

30 What Do These Results Mean? Possibility #1: Acuity use doesn't have much of an impact on achievement. However, there were small, positive associations with elementary achievement in both 2010 and 2011, and teachers report that the system is useful and that it helps their instruction. We don't think the data support this conclusion.

31 What Do These Results Mean? Possibility #2: more time is needed to significantly impact achievement in MPS; three years may not be enough to build a critical mass of teacher skill in using Acuity and the subsequent achievement benefits. Possibility #3: teachers' Acuity use needs to move to the next level; Acuity use hasn't changed much, with virtually the same type of use as 2010 and virtually the same associations with achievement.

32 Recommendations. To take Acuity's impact to the next level, we recommended that MPS focus on these areas: explore how and when teachers' notions about data use are compatible with Acuity; highlight new areas of compatibility between Acuity resources and teachers' acknowledged data needs. Source: Cho & Wayman (2012).

33 Contact & Follow-Up Information If you have questions about the research conducted for this presentation, please contact Jeff or Shana. For more information on this and similar studies conducted by these researchers, please go to:

34 References
Carlson, D., Borman, G. D., & Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33(3), 378–398.
Cho, V., & Wayman, J. C. (2012, April). Districts' efforts for data use and computer systems. Paper presented at the 2012 Annual Meeting of the American Educational Research Association, Vancouver, Canada.
Henderson, S., Petrosino, A., Guckenburg, S., & Hamilton, S. (2007). Measuring how benchmark assessments affect student achievement (Issues & Answers Report, REL 2007 No. 039). Washington, DC: U.S. Department of Education, Institute of Education Sciences.
May, H., & Robinson, M. A. (2007). A randomized evaluation of Ohio's Personalized Assessment Reporting System (PARS). Madison, WI: Consortium for Policy Research in Education.
Quint, J., Sepanik, S., & Smith, J. (2008). Using student data to improve teaching and learning: Findings from an evaluation of the Formative Assessments of Students' Thinking in Reading (FAST-R) program in Boston elementary schools. New York: MDRC.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage.
Slavin, R. E., Holmes, G., Madden, N. A., Chamberlain, A., Cheung, A., & Borman, G. D. (2010). Effects of a data-driven district-level reform model (working paper). Baltimore, MD: Center for Data-Driven Reform, Johns Hopkins University.
Wayman, J. C., Cho, V., & Shaw, S. (2009). Survey of Educator Data Use. Unpublished document.
