
1 LISD Data Camp June and August, 2011 LISD TECH Center

2 Welcome Necessary Forms –SB-CEUs –SHU Credit

3 Session 1 Essential Questions The Many Meanings of Multiple Measures –Why use multiple measures of data? –In which ways will we use multiple measures in our school improvement process? –What data sources will we use to make decisions about student achievement?

4 Session 1 Outcomes Identify multiple measures of data Align data with their school improvement goals Develop an action plan for engaging staff in analyzing multiple measures

5 School Improvement Process

6 Talking Points for the Purpose of Implementing a Data Warehouse in Lenawee Schools
We must focus on data to increase student achievement
We must use multiple sources of data
We must utilize an inquiry approach to data analysis
We need a data warehouse for our 21st century schools

7 Norms for Our Work Participate actively Actively listen Seek application Press for clarification Honor time agreements and confidentiality Keep ‘side bars’ to a minimum and on topic Take care of adult learning needs

8 FERPA/HIPAA Pre-Test To be considered an “education record,” information must be maintained in the student’s cumulative or permanent folder. False: any record that contains a student’s name is an education record, not just those kept in the cumulative or permanent folder.

9 FERPA/HIPAA Pre-Test FERPA grants parents the right to have a copy of any education record. True

10 FERPA/HIPAA Pre-Test You are in charge of a staff meeting to study student achievement on school improvement goals. As part of your meeting, you show the entire staff a report of student scores on a common local assessment, with student names visible, and you hand out paper copies of the report. It is a violation of FERPA to display the results of the assessment to the entire staff. The exception would be a group of teachers working on specific student strategies, as they are a specific population that has a “legitimate educational interest” in the information.
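The scenario above turns on whether the whole staff sees identifiable scores. As a purely illustrative sketch (not from the slides; the field names and salt are hypothetical), here is one way a report could be de-identified before it goes to the full staff, reserving named reports for teachers with a legitimate educational interest:

```python
# Hypothetical sketch: replace student names with stable pseudonyms so an
# assessment report can be shared beyond the teachers who have a
# "legitimate educational interest." Field names and the salt are
# illustrative only, not part of any district system.
import hashlib

def deidentify(rows, salt="district-secret"):
    """Return a copy of the report with names replaced by pseudonyms."""
    out = []
    for row in rows:
        tag = hashlib.sha256((salt + row["name"]).encode()).hexdigest()[:8]
        out.append({"student": f"S-{tag}", "score": row["score"]})
    return out

report = [{"name": "Pat Doe", "score": 82}, {"name": "Lee Roe", "score": 67}]
print(deidentify(report))  # scores are intact, names are not shown
```

Because the pseudonyms are stable, the same student gets the same code across reports, so staff can still discuss growth over time without seeing names.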

11 Data Roles What roles will each member of your team play in today’s work? –Identify roles –Describe responsibilities –Hold each other accountable

12 The Many Meanings of “Multiple Measures” Susan Brookhart, Educational Leadership, Vol. 67, No. 3 (ASCD, November 2009), pp. 6-12

13 Would you choose a house using one measure alone?

14 Guiding Principle for Multiple Measures Know your purpose! –What do you need to know? –Why do you need to know it?

15 Purposes of Assessments
assessment for learning
–formative (monitors student progress during instruction)
–placement (given before instruction to gather information on where to start)
–diagnostic (helps find the underlying causes of learning problems)
–interim (monitors student proficiency on learning targets)
assessment of learning
–summative (the final task at the end of a unit, a course, or a semester)
Sources: Stiggins, Richard J., Arter, Judith A., Chappuis, Jan, and Chappuis, Stephen. Classroom Assessment for Student Learning. Portland, OR: Assessment Training Institute, Inc., 2004. Bravmann, S. L. “P-I Focus: One test doesn’t fit all.” Seattle Post-Intelligencer, May 2, 2004. Marshall, K. (2006). “Interim Assessments: Keys to Successful Implementation.” New York: New Leaders for New Schools.

16 Why use multiple measures for decisions in education? Construct validity –The degree to which a score can convey meaningful information about the attribute it measures Decision validity –The degree to which several relevant types of information can inform decision-making

17 Multiple Measures Measures of different constructs Different measures of the same construct Multiple opportunities to pass the same test

18 Using Multiple Measures for Educational Decisions Conjunctive Approach (All measures count) Compensatory Approach (High performance on one measure can compensate for lower performance on another measure) Complementary Approach (High performance on any measure counts)

19 Examples NCLB accountability is conjunctive (i.e., both the aggregate and all subgroups must reach the threshold to make AYP) Most classroom grading policies are compensatory (i.e., an average or percentage) Getting a driver’s license is complementary (i.e., passing any one attempt at a requirement counts, whenever you take it)
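To make the three approaches concrete, here is a minimal sketch (the scores, weights, and cutoffs are invented for illustration) of how each rule combines the same set of measures into a single decision:

```python
# Minimal sketch of the three decision rules from slide 18.
# All numbers below are hypothetical.

def conjunctive(scores, cutoffs):
    # All measures count: every score must clear its own cutoff (as in AYP).
    return all(s >= c for s, c in zip(scores, cutoffs))

def compensatory(scores, weights, cutoff):
    # A strong score can offset a weak one, as in a weighted grade average.
    mean = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
    return mean >= cutoff

def complementary(scores, cutoff):
    # High performance on any one measure counts.
    return any(s >= cutoff for s in scores)

scores = [78, 92, 60]
print(conjunctive(scores, [70, 70, 70]))    # False: the 60 misses its cutoff
print(compensatory(scores, [1, 1, 1], 70))  # True: the average (76.7) clears 70
print(complementary(scores, 90))            # True: the 92 clears the bar
```

Note that the same three scores pass or fail depending entirely on the combination rule, which is why the guiding principle on slide 14 (know your purpose) matters before choosing an approach.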

20 Using Multiple Measures for Educational Decisions Conjunctive Approach (All measures count) Compensatory Approach (High performance on one measure can compensate for lower performance on another measure) Complementary Approach (High performance on any measure counts) Measures of different constructs Different measures of the same construct Multiple opportunities to pass the same test

21 Examples MEAP measures different constructs in mathematics (e.g., measurement, numbers and operations, geometry, algebra, probability) Retelling, constructed responses, and cloze tasks are different measures of the same construct (comprehension) Some students utilize multiple opportunities to take the ACT (e.g., for scholarships or NCAA eligibility)

22 Using Multiple Measures for Educational Decisions
Measures of different constructs
–Conjunctive (all measures count): School accreditation ratings based upon student achievement meeting identified targets in Reading, Math, Science, and Social Studies
–Compensatory (high performance on one measure can compensate for lower performance on another): An outside agency identifies the “best schools” by computing an index of weighted scores
–Complementary (high performance on any measure counts): AYP “Safe Harbor,” reached when the percentage of students scoring below proficiency decreases by ten percentage points from the previous year
Different measures of the same construct
–Conjunctive: Students must pass a reading comprehension test on two stories at the same reading level before being allowed to read stories at the next higher level
–Compensatory: Teachers determine standards-based grades in a course using scores on multiple assessments measuring the same GLCE or HSCE
–Complementary: Teachers allow student choice of assessment tasks to demonstrate understanding of the unit’s learning targets
Multiple opportunities to pass the same test
–Conjunctive: Students meeting all requirements graduate after passing an exit exam, no matter how many attempts it takes
–Compensatory: Teachers allow students to retake a unit test to demonstrate mastery of the unit’s outcomes
–Complementary: Students must pass one mathematics test in order to graduate; they can choose the state test or an end-of-course exam in either Algebra I or Geometry

23 Suggestions for Using Multiple Measures for Decision Making Classroom assessments linked to the same construct to determine mastery Granting credit for graduation requirements Teacher evaluations

24 Questions? Stan Masters Coordinator of Instructional Data Services Lenawee Intermediate School District 4107 N. Adrian Highway Adrian, Michigan 49221 Phone: 517-265-1606 Email: stan.masters@lisd.us Skype ID: stan.masters Data Warehouse webpage: www.lisd.us/links/data

25 LISD Data Camp June and August, 2011 LISD TECH Center

26 Session 2 Essential Questions DataDirector Functions –What are the important aspects of various DataDirector functions? –What decisions must be made in sharing DataDirector products with others? –How will we build our 2011-2012 assessment calendar?

27 Session 2 Outcomes Describe the functions of DataDirector tabs Identify the permissions in sharing a DataDirector product Create an assessment calendar for the 2011-2012 school year

28 School Improvement Process

29 Norms for Our Work Participate actively Actively listen Seek application Press for clarification Honor time agreements and confidentiality Keep ‘side bars’ to a minimum and on topic Take care of adult learning needs

30 Data Roles What roles will each member of your team play in today’s work? –Identify roles –Describe responsibilities –Hold each other accountable

31 How do you develop a monitoring plan? Identify specific learning indicators Create data collection templates Schedule an assessment calendar for collaborative collection and analysis Source: “Developing a Monitoring Plan.” Maryland Department of Education. Accessed May 25, 2010 from http://mdk12.org/data/progress/developing.html Video Source: Reeves, D. (2009). “Planning for the New Year.” Accessed May 25, 2010 from http://www.leadandlearn.com/webinars

32 Assessment Calendars

33 Time Elements of an Assessment Calendar
When will we administer the assessment?
When will we collect the data?
When will we disaggregate the data?
When will we analyze the data?
When will we reflect upon the data?
When will we make recommendations?
When will we make decisions about the recommendations?
When will we provide written documentation of the decisions?
When will we share the data with other stakeholders?
Source: White, S. H. (2005). Beyond the Numbers: Making Data Work for Teachers and School Leaders. Englewood, CO: Lead and Learn Press.

34 Components of an Assessment Calendar
Norm-referenced tests
State assessments
Criterion-referenced tests
Writing assessments
End-of-course assessments
Common assessments
Performance assessments
Unit tests
Other
Source: White, S. H. (2005). Beyond the Numbers: Making Data Work for Teachers and School Leaders. Englewood, CO: Lead and Learn Press.
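One way to operationalize White's two lists is a simple record per assessment that carries both its component type and a date for each time element. This is a hypothetical sketch; the field names and dates are illustrative, not from the slides:

```python
# Hypothetical sketch of one assessment-calendar entry combining the
# components (slide 34) with the time elements (slide 33).
from dataclasses import dataclass
from datetime import date

@dataclass
class CalendarEntry:
    assessment: str              # e.g., "Grade 5 common math assessment"
    component: str               # e.g., "common assessment", "state assessment"
    administer: date             # when we give the assessment
    collect: date                # when we collect the data
    disaggregate: date           # when we disaggregate the data
    analyze_and_reflect: date    # when we analyze and reflect upon the data
    recommend_and_decide: date   # when we make and act on recommendations
    document_and_share: date     # when we document and share with stakeholders

entry = CalendarEntry(
    assessment="Grade 5 common math assessment",
    component="common assessment",
    administer=date(2011, 10, 3),
    collect=date(2011, 10, 7),
    disaggregate=date(2011, 10, 10),
    analyze_and_reflect=date(2011, 10, 14),
    recommend_and_decide=date(2011, 10, 21),
    document_and_share=date(2011, 10, 28),
)
print(f"{entry.assessment}: administered {entry.administer}")
```

Building the 2011-2012 calendar in Session 2 then amounts to filling in one such entry per assessment and sorting the entries by date.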

35 Questions? Stan Masters Coordinator of Instructional Data Services Lenawee Intermediate School District 4107 N. Adrian Highway Adrian, Michigan 49221 Phone: 517-265-1606 Email: stan.masters@lisd.us Skype ID: stan.masters Data Warehouse webpage: www.lisd.us/links/data

