MiData Innovation in Data Collection and Analysis March, 2014


1 MiData Innovation in Data Collection and Analysis March, 2014
For internal communication at this point: schedule time with TAU and PLU; work on adjusting this content for communication with the pilot (Eaton).
March, 2014
SPDG Evaluators' PLC

2 Acknowledgements
Ottawa Area ISD Enterprise Systems Team within the Technology Department
MiBLSi Team
Eaton RESA and District Staff

3 Agenda
MiBLSi's History of Data Collection and Use within an MTSS Framework
School-Level Report
Next Steps and Application of Usability Testing Cycles

4 Why does MiBLSi ask schools, districts and ISDs to collect specific data?
To give teams access to data for effective problem-solving and planning
To inform how we can best support our partnering ISDs, districts, and schools as they implement MTSS
To report progress to MiBLSi's funding agencies and stakeholders

5 Where We’ve Been Paper-pencil copies of data
PBIS Evaluation and DIBELS Data System Excel spreadsheets Information on website about schools with data submitted SPSS file and multiple Excel spreadsheets Filemaker Pro MiBLSi Database

6 Why open up MiData to our project partners?
Benefits for:
Data collection & accuracy
Access to data
Reporting & data analysis
At the school, district, ISD, and project levels

7 MiBLSi Database Development: Information, Hardware, and Server Tool Requirements
Web-based SQL Server 2012 database
Uses the .NET platform (version 4.5)
MVC website development framework
SQL Server Reporting Services
IIS 7 hosting service
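
These pieces fit together in a conventional way: the MVC site queries the SQL Server database and renders the results, while SQL Server Reporting Services handles the formatted reports themselves. As a minimal sketch of what one controller action could look like (not MiData's actual code; the dbo.SchoolScores table and "MiData" connection string are invented for illustration):

```csharp
// Hypothetical sketch of an ASP.NET MVC controller backed by SQL Server.
// The dbo.SchoolScores table and "MiData" connection string are invented
// for illustration; this is not MiData's actual code.
using System.Collections.Generic;
using System.Configuration;
using System.Data.SqlClient;
using System.Web.Mvc;

public class SchoolReportController : Controller
{
    // GET /SchoolReport/Index?schoolId=42
    public ActionResult Index(int schoolId)
    {
        var scores = new List<KeyValuePair<string, decimal>>();
        string connStr =
            ConfigurationManager.ConnectionStrings["MiData"].ConnectionString;

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT MeasureName, Score FROM dbo.SchoolScores WHERE SchoolId = @id",
            conn))
        {
            cmd.Parameters.AddWithValue("@id", schoolId);
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    scores.Add(new KeyValuePair<string, decimal>(
                        reader.GetString(0), reader.GetDecimal(1)));
            }
        }
        return View(scores); // rendered by a Razor view of the same name
    }
}
```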

8 Gradual Addition of Features
School-level reports, data entry, and access
District- and ISD-level reports, data entry
Dashboards
Manualization and how-to guides

9 Usability Testing: Big Picture
Fall 2013: Internal testing; external pilot
Spring 2014: Make necessary revisions; second external pilot (we are here)
Fall 2014: Make necessary revisions, develop user supports (manuals, how-to videos); launch MiData access and data entry on a large scale

10 Fit and Overlap
Given MiBLSi's focus on systems-level supports, MiData is designed primarily to support ISD-, district-, and project-level data analysis and decision making. The reports from MiData will be designed to support the continuous school improvement process. MiData is not intended to house every type of data; other systems already exist that provide very powerful ways to analyze data aligned with the specific features of each measure/data set.

11 What Data?
Data that will be housed:
School-, district-, or ISD-level scores from measures specifically related to MTSS implementation and student outcomes
Data that will not be housed:
School process rubrics
Individual student scores/referrals
All data collected by schools

12 MiBLSi Database

13 The School-Level MiData Report has the following features
Organized to fit the flow of our data review trainings
Populates with data and organizes it depending on whether threshold cut scores were met (see the sketch after this list)
Provides narrative explanations of why each analysis is important and how to interpret scores
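
As a toy illustration of that cut-score routing (a minimal sketch, not MiData's actual logic; the measure names and cut scores below are invented, apart from the 45% BoQ score reused from the example later in this deck):

```csharp
// Toy sketch of cut-score routing: each measure lands in a "met" or
// "needs support" section of the report. Names and cut scores are invented.
using System;
using System.Collections.Generic;

class CutScoreRouting
{
    static void Main()
    {
        // Measure name -> (school's score, threshold cut score)
        var measures = new Dictionary<string, (double Score, double Cut)>
        {
            ["Benchmarks of Quality"] = (45.0, 70.0),
            ["Tier 1 reading composite (% at benchmark)"] = (62.0, 80.0),
        };

        foreach (var m in measures)
        {
            // Below-cut measures are routed to the "needs support" section.
            string section = m.Value.Score >= m.Value.Cut ? "met" : "needs support";
            Console.WriteLine($"{m.Key}: {m.Value.Score} -> {section}");
        }
    }
}
```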

14 Schoolwide Analysis Report

15

16

17 Schoolwide Analysis Report Development
Started with a Word document template (with a ton of notes/comments)
Compared desired fields with data that already existed within the database; import features/specs
Designed queries to pull data from the database (see the sketch after this list)
Formatted the data in the report
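
For the query step, a hedged sketch: MiData's schema is not shown in this presentation, so the dbo.SchoolSkillScores table here is invented. Pulling one school's skill-area scores for the report might look like this:

```csharp
// Hypothetical query step for the report. The dbo.SchoolSkillScores table
// is invented; MiData's actual schema is not public.
using System.Data;
using System.Data.SqlClient;

static class ReportQueries
{
    // Pull one school's skill-area scores (school-level aggregates only,
    // consistent with the "what data will be housed" slide).
    public static DataTable SkillAreaScores(string connStr, int schoolId)
    {
        const string sql = @"
            SELECT SkillArea, PercentAtBenchmark
            FROM dbo.SchoolSkillScores
            WHERE SchoolId = @id
            ORDER BY SkillArea;";

        var table = new DataTable();
        using (var conn = new SqlConnection(connStr))
        using (var adapter = new SqlDataAdapter(sql, conn))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@id", schoolId);
            adapter.Fill(table); // Fill opens and closes the connection itself
        }
        return table;
    }
}
```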

18 Report Components
Is there a problem/gap with academic (reading) outcomes?
  Tier 1, 2, 3 School-wide Reading Outcomes
  Phonemic Awareness, Alphabetic Principle, Fluency & Comprehension
  Outcomes for Subgroups by Race/Ethnicity
Summary of precise problem/gap
Working hypotheses: Why do we have a problem/gap & what could we do to address it?
  Implementation Fidelity of a School-wide Reading Model
  School-wide Student Behavior Outcomes
  Implementation Fidelity of School-wide Positive Behavioral Interventions and Supports (Tier 1, 2, 3)

19 Pre-Correct
To get the most out of the information in this report, we need to READ the text before looking at the data. Throughout this section we will prompt you to READ the text because of the value of the information it provides. So, basically, we would like you to READ.

20 An Example: 1a-c. Tier 1 School-wide Reading Outcomes based on Composite (Tier 1, 2, 3)

21 1d-f. Report Analysis by Skill Area

22 1g. Reading Outcomes by Race/Ethnicity

23 2. Summary of the Precise Problem/Gap (✔ = needs support)

24 Working Hypotheses: What factors might be causing our gap?

25 Factor: 4a. Implementation of a School-wide Reading Model

26 Factor: 4b. School-wide Student Behavior Outcomes

27 . . . Referrals by Race/Ethnicity

28 . . . SRSS, Referrals by Problem Behavior, Location & Motivation

29 Factor: Implementation of PBIS

30 Example Hypothesis, with Data That Confirm It
SWIS average referrals per day per month are higher than the national median in 4 of the past 5 months.
A Benchmarks of Quality (BoQ) score of 45% indicates that SW-PBIS is not being implemented with fidelity.
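
As a toy version of that decision rule (the monthly numbers below are placeholders, not real SWIS data, and the 70% BoQ fidelity criterion is an assumption, not stated in this deck):

```csharp
// Toy check of the example hypothesis. Monthly numbers are placeholders,
// not real SWIS data; the 70% BoQ fidelity criterion is an assumption.
using System;
using System.Linq;

class HypothesisCheck
{
    static void Main()
    {
        // (school's avg referrals/day/month, national median) for 5 months
        var months = new (double School, double Median)[]
        {
            (1.4, 1.1), (1.6, 1.1), (1.0, 1.2), (1.5, 1.1), (1.3, 1.1)
        };
        int above = months.Count(m => m.School > m.Median);

        double boq = 45.0;          // school's Benchmarks of Quality score (%)
        const double boqCut = 70.0; // assumed fidelity criterion

        bool confirmed = above >= 4 && boq < boqCut;
        Console.WriteLine($"Months above national median: {above} of 5; BoQ: {boq}%");
        Console.WriteLine(confirmed
            ? "Data are consistent with the hypothesis."
            : "Data do not confirm the hypothesis.");
    }
}
```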

31 Current Plans for Data Entry
Coaches will enter the majority of data into MiData three times per school year, during the coaching meetings prior to data review trainings. Our goal is to make this as easy as possible through multiple rounds of usability testing. Even without yet knowing exactly how to do it, 61% of participants at the winter data review said they could enter their own data by spring.

32 Getting ahead of a Challenge for District and ISD Data Review
Just as schools struggle to gather and effectively analyze large amounts of data, so will districts and ISDs, at 2, 10, 20, or 100 times the volume. A single point of data entry into MiData at the school level means that the district, the ISD, and MiBLSi all have immediate access to that same data.

33 Usability Testing: Data Entry
Designed manual data entry fields
Anna entered some data; documented time, strengths, needs
Ron made adjustments
Anna entered more data and documented time, strengths, needs
School teams entered some data; documented time and needs
Anna will try the process with a fresh school
Revisions and process documentation/manualization
Anna will guide a fresh school to enter their own data; document time, strengths, needs
"Final" revisions

34 Even Better!
Reports for exploration . . . and reports to answer precise questions in a specific sequence

35

36

37 Challenges
Buy-in and communication
Time and resources for training and ongoing support
The need for both consistency and flexibility, so that we can effectively engage in data-driven decision making at multiple levels of the cascade
Fit with other critical processes and systems, without unnecessary duplication of effort

38 Usability Testing: Big Picture (revisited)
Fall 2013: Internal testing; external pilot
Spring 2014: Make necessary revisions; second external pilot (we are here)
Fall 2014: Make necessary revisions, develop user supports (manuals, how-to videos); launch MiData access and data entry on a large scale

39 Thank you!
Anna Harms, MiBLSi Evaluation & Research Coordinator
Ron Oskam, OAISD Application Developer
Greg Shepard, OAISD Enterprise Systems Manager

