
1 Data Interpretation Workshop 2007

2 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 2 Purposes for the Day
–Bring context and meaning to the math and reading assessment project results;
–Initiate reflection and discussion among school staff members related to the math and reading assessment results;
–Encourage school personnel to judiciously review and utilize different comparators when judging math and reading assessment results;
–Model processes that can be used at the school- and division-level for building understanding of the data among school staff and the broader community; and,
–Provide an opportunity to discuss and plan around the data.

3 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 3 Agenda
–Understanding data—sources, categories & uses
–Saskatchewan Assessment for Learning Program
–Working with the AFL Reports: Standards and Cut Scores; Predicting; Sharing; Analysis of Strengths & Areas for Growth; Creating & Testing Hypotheses
–Action Planning

4 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 4 32° What might the above piece of data mean? While 32° is data, the meanings you provided were interpretation. All data is meaningless until interpreted. Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

5 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 5 A Data-Rich Environment Wellman & Lipton (2004) state: Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable. Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

6 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 6 International Data Sources
–Programme for International Student Assessment (PISA)

7 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 7 National Data Sources
–Pan-Canadian Assessment Program (PCAP)
–Canadian Test of Basic Skills (CTBS)
–Canadian Achievement Tests (CAT3)

8 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 8 Provincial Data Sources
–Assessment for Learning (AFL): Opportunity to Learn Measures; Performance Measures
–Departmental examinations

9 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 9 Division Data Sources
–Division-level rubrics
–Division benchmark assessments

10 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 10 Local Data Sources
–Cumulative folders
–Teacher-designed evaluations
–Portfolios
–Routine assessment data

11 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 11 Nature of Assessment Data From Saskatchewan Learning, Understanding the numbers. [Diagram: a continuum running from Definitive to Indicative across the levels Individual, Classroom, School, Division, Provincial, National, International; data shift from Student Evaluations at the individual end to System Evaluations at the international end.]

12 Depth and Specificity of Knowledge From Saskatchewan Learning. (2006). Understanding the numbers. [Diagram: assessments arranged along a continuum from Individual through Classroom, School, Division, Provincial and National to International. Toward the individual end they give in-depth knowledge of specific students; toward the international end they give in-depth knowledge of systems but little knowledge of specific students.]

13 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 13 Assessment for Learning is a Snapshot Results from a large-scale assessment are a snapshot of student performance.
–The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
–The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level.
–Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2007)

14 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 14 Using a Variety of Data Sources Thinking about the data sources available, their nature and the depth of knowledge they provide, how might the information in each impact the decisions you make? –What can you do with this data? –What is its impact on classrooms? Please refer to the “Using a Variety of Data Sources” template in your handout package as a guide for your discussion.

15 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 15 Local Level Sources of Data While international, national and provincial sources of data can provide direction for school initiatives, the data collected at the local level is what provides the most detailed information regarding the students in classrooms.

16 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 16 Four Major Categories of Data: Demographics
Local Data
–Descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc.
–Other data can be disaggregated by demographic variables, as sketched below.
Assessment for Learning
–Opportunity-to-Learn Data: home support for learning, for reading, and for learning math.
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.
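To make disaggregation concrete, here is a minimal Python sketch (not part of the workshop materials) that averages one score field by two demographic variables. Every record, field name and value below is invented for illustration.

```python
from collections import defaultdict

# Hypothetical records: neither the field names nor the values
# come from the AFL reports.
records = [
    {"student": "A", "grade": 5, "gender": "F", "reading_pct": 72},
    {"student": "B", "grade": 5, "gender": "M", "reading_pct": 64},
    {"student": "C", "grade": 8, "gender": "F", "reading_pct": 81},
    {"student": "D", "grade": 8, "gender": "M", "reading_pct": 58},
]

def mean_by(rows, key, value):
    """Average a numeric field separately for each value of a demographic key."""
    groups = defaultdict(list)
    for r in rows:
        groups[r[key]].append(r[value])
    return {k: sum(v) / len(v) for k, v in groups.items()}

# The same scores, disaggregated two different ways.
print(mean_by(records, "gender", "reading_pct"))  # {'F': 76.5, 'M': 61.0}
print(mean_by(records, "grade", "reading_pct"))   # {5: 68.0, 8: 69.5}
```

The point of the sketch is that one data set can answer several different questions depending on which demographic variable it is sliced by.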

17 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 17 Four Major Categories of Data: Student Learning
Local Data
–Describes outcomes in terms of standardized test results, grade averages, etc.
Assessment for Learning Data
–Opportunity-to-Learn Data: know and use reading strategies
–Student performance outcomes: Math 5, 8, 11; Reading 4, 7, 10
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

18 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 18 Four Major Categories of Data: Perceptions
Local Data
–Provides information regarding what students, parents, staff and community think about school programs and processes.
–This data is important because people act in congruence with what they believe.
Assessment for Learning
–Opportunity-to-Learn Data: preparation for and commitment to learn; persistence
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

19 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 19 Four Major Categories of Data: School Processes
Local Data
–What the system and teachers are doing to get the results they are getting.
–Includes programs, assessments, instructional strategies and classroom practices.
Assessment for Learning
–Opportunity-to-Learn Data: instruction and learning; availability and use of resources; approaches to problem solving (Math)
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

20 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 20 What Data are Useful & Available? Think about the goals/priorities set within your school and/or school division – they might be achievement, growth, behavioural, etc. Using the supplied template, begin to catalogue the data you already have and the data you need in order to better address the goals that have been set. An example follows on the next slide.

21 Goal: Students will experience greater success in all subject areas as we focus on reading skills.
For each category, what data do you have to answer your questions, and what other data do you need to gather?
–Demographics. Have: enrollment by subject. Need: number of teachers teaching reading skills.
–Perceptions. Have: student profiles; School Community Council feedback. Need: student perception of their success.
–Student Learning. Have: CAT3; AFL. Need: student use of reading skills; reading skills explicitly taught in each subject.
–School Processes. Have: current instruction in reading skills. Need: impact of a focused reading skills program.
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

22 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 22 Principles of the Saskatchewan AFL Program
1. Cooperation and Shared Responsibility
2. Equity and Fairness
3. Comprehensiveness
4. Continuous Improvement that Promotes Quality and Excellence
5. Teacher Professionalism
6. Authenticity and Validity
7. Honesty and Openness

23 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 23 Principles of the Saskatchewan AFL Program Read and Connect
–Everyone at the table reads the first numbered statement. After reading, one person at the table offers an insight or connection they are making to that statement.
–Repeat the process with all seven statements. At the end, discuss the key ideas and concepts within the principles.

24 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 24 Comparators: Types of Referencing
–Criterion-referenced: comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. (Tables 8.8 & 8.12)
–Standards-referenced: comparing how students performed relative to a set of professionally or socially constructed standards. (Figure 8.2c, Table 8.3, Figure 8.4a, Figure 8.6a, and others.)
From: Saskatchewan Learning. (2006). Understanding the numbers.

25 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 25 Comparators: Types of Referencing
–Experience- or self-referenced: comparing how students perform relative to the assessment data gathered by teachers during the school year (e.g., comparing these results to current school data, or to the standards set by the panel).
–Norm-referenced: comparing how students in a school performed relative to the performance of students in the division, region or project (e.g., tables comparing the school, division and province).

26 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 26 Comparators: Types of Referencing Longitudinal-referenced: Comparing how students perform relative to earlier years’ performance of students. (Table 8.1, Figure 8.3 and others.)
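The same result can read quite differently depending on the comparator. The short Python sketch below makes that concrete; every number in it is invented for illustration and none comes from an actual AFL report.

```python
# Illustrative sketch only: one invented school result (62% of students
# met the adequate standard in Reading) viewed through each comparator.
school_2007 = 62.0          # this year's school result
adequate_cut = 50.0         # criterion/standards-referenced: the panel's standard
division_2007 = 58.0        # norm-referenced: the division
province_2007 = 65.0        # norm-referenced: the province
classroom_data = 60.0       # experience/self-referenced: teachers' own data
school_2005 = 55.0          # longitudinal-referenced: an earlier cohort

print(f"vs standard:       {school_2007 - adequate_cut:+.1f} points")
print(f"vs division:       {school_2007 - division_2007:+.1f} points")
print(f"vs province:       {school_2007 - province_2007:+.1f} points")
print(f"vs classroom data: {school_2007 - classroom_data:+.1f} points")
print(f"vs 2005 cohort:    {school_2007 - school_2005:+.1f} points")
```

With these invented numbers, the school is above its standard, its division and its own history, yet below the province: the judgment depends on the comparator chosen.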

27 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 27 What Data are Collected and Reported for Reading? Reading Comprehension Skills: 60-item multiple-choice test (organized by reading strategies). The six categorized reading strategies are:
–Using Cueing Systems
–Connecting to Prior Knowledge
–Making Inferences, Predictions, and Drawing Conclusions
–Noting Key & Finding Supporting Ideas
–Summarizing, Recalling, Synthesizing and Organizing Information
–Recognizing Author's Message & Craft

28 What Data are Collected and Reported for Reading?
–Explicit Comprehension: a subset of the 60-item multiple-choice test involving responses to ideas or information stated directly in the text. The answers are "right there" in the text.
–Implicit Comprehension: a subset of the 60-item multiple-choice test requiring the reader to apply background knowledge to interpret or infer ideas or information in the text. Interpreting vocabulary or visuals and making predictions are forms of inference.
–Critical Comprehension: a subset of the 60-item multiple-choice test involving responses to ideas and information that require inferences and critical analysis. Looking at author's purpose and point of view, distinguishing facts from opinions and recognizing persuasive techniques are all components.
–The number of questions in each subset varies according to grade level.
–Reader Response: written-response question(s) assessing students' ability to make meaning from text by making connections to personal knowledge or experience (extending and applying new understandings).

29 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 29 Opportunity to Learn Measures for Reading
Data was collected from students in three areas:
–Preparation and commitment to learn
–Knowledge and use of reading strategies
–Home support for reading
Data was collected from teachers on two classroom-related elements:
–Availability and use of resources
–Instruction and learning

30 What Data are Collected and Reported for Math? Math Content Skills: 40-item multiple-choice test (organized by mathematical strands and linked to curriculum objectives). The strands vary by grade level:
–Grade 5: Whole Numbers; Fractions; Geometry; Measurement; Data Management
–Grade 8: Numbers & Operations; Ratio & Proportion; Geometry & Measurement; Algebra
–Grade 11: Irrational Numbers; Consumer Math & Problems; Polynomials & Rationals; Quadratic Functions & Equations; Angles, Polygons and Circles

31 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 31 What Data are Collected and Reported for Math?
–Applications & Problem Solving
–Concepts, Procedures & Relationships
–Challenges (performance on challenges is reported on a 5-level scale)
–Calculator Skills
–Computation Skills
–Estimation Skills

32 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 32 Opportunity-to-Learn Measures in Math
Data was collected from students in four areas:
–Preparation and commitment to learn
–Persistence when experiencing difficulty
–Home support for learning in general
–Home support for learning math
Data was collected from teachers on three classroom-related measures:
–Availability and use of resources
–Instruction and learning
–Approaches to problem solving

33 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 33 Standards To help make meaningful longitudinal comparisons in future years, three main processes will be implemented.
1. Assessment items will be developed for each assessment cycle using a consistent table of specifications.
2. The assessment items will undergo three rounds of field-testing, one of which is intended to inform the comparability of the two assessments.
3. Standards will be set for each assessment so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

34 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 34 Opportunity-to-Learn and Performance Standards In order to establish Opportunity-to-Learn and Performance standards for the 2007 Reading Assessment, three panels were convened (one for each assessed grade), consisting of teachers and post-secondary academics, including Education faculty. The panelists studied each assessment item from the 2007 assessment in significant detail and established cut scores for each of the assessment components.

35 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 35 Cut Scores On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels: –Threshold of adequacy –Threshold of proficiency Reader response and Math challenge scores are presented on a five-level scale (1-low to 5-high).
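As a rough illustration of how the two thresholds work, the Python sketch below classifies a percent-correct score against them. The cut-score values here are invented placeholders; the real values vary by component and grade and appear on page 4 of each detailed report.

```python
# Minimal sketch of applying the two reported thresholds.
# The default cut scores below are invented for illustration.
def classify(percent_correct, adequate_cut=50.0, proficient_cut=70.0):
    """Place a percent-correct score against the adequacy and proficiency thresholds."""
    if percent_correct >= proficient_cut:
        return "proficient"
    if percent_correct >= adequate_cut:
        return "adequate"
    return "below adequate"

for score in (45, 55, 80):
    print(f"{score}% correct -> {classify(score)}")
```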

36 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 36

37 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 37

38 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 38 Locating Cut Scores in the Report Turn to pg. 4 in the detailed report for your grade level.
–Opportunity-to-Learn Elements: Excellent Standard and Sufficient Standard, each reported as a score out of 5 (1-low to 5-high).
–Performance Components: Proficient Standard and Adequate Standard, each reported as the % correct required to reach the standard.
You will need to refer to these scores during the following prediction activity.

39 Predicting
–AFL Math: on the chart for your grade level, predict how many students achieved the ADEQUATE Standard on the 2007 AFL in Math for each strand: Math Content Skills; Integrated Applications; Estimation Skills; Calculator Skills; Computation Skills. (On the reverse are the Math Content Skills.)
–OTL – Reading & Math: on the charts, predict what percentage of students had SUFFICIENT Opportunity to Learn in the following areas. Reading: student preparation and commitment to learn; student knowledge & use of reading strategies; home support for reading. Math: student preparation and commitment to learn; student persistence; home support for learning; home support for learning math.
–AFL Reading: on the charts, predict what percentage of students met the ADEQUATE Standard in the following strands: Reading Comprehension; Explicit Comprehension; Implicit Comprehension; Critical Comprehension; Reader Response. (On the reverse are the Reading Comprehension Skills.)

40 [Chart: AFL Math – Percentage of Students who met the Adequate Standard set by Saskatchewan Educators]

41 Shade in your prediction on the supplied prediction chart. [Chart: AFL Math – Percentage of Students who met the Adequate Standard set by Saskatchewan Educators] Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

42 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 42 Predicting Based on your predictions, create a set of hypotheses for some or all of them. As you create each prediction, identify the underlying assumptions.
–Prediction: 'X' will contain the highest scores. Assumption: we created a common assessment for 'X' in 2005.
–Prediction: students will report higher on geometry because we moved that unit earlier in the year. Assumption: there are fewer classroom interruptions earlier in the year and students have more time to learn the material.
Write each prediction and its accompanying assumption on the cards provided. Please write legibly.

43 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 43 Sharing Gather the cards together at your table and discuss the predictions and assumptions. –Do these statements ring true for everyone at your table? School? Division? –Considering all of the predictions, are there any themes or patterns emerging? Why might this be?

44

45 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 45 Comparisons The completed bar graphs are in the Summary Report on page 3. –What are you noticing about the data? –What surprised you? What other data would you like to see to better inform the results you’ve seen so far? Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.
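One way to picture this comparison step: compute the gap between each predicted and actual percentage and flag the large gaps for discussion. In the Python sketch below, only the strand names come from the deck; every percentage and the 5-point flag threshold are invented for illustration.

```python
# Sketch of the prediction-versus-results comparison. All numbers invented.
predicted = {"Explicit": 70, "Implicit": 60, "Critical": 55, "Reader Response": 65}
actual    = {"Explicit": 74, "Implicit": 52, "Critical": 57, "Reader Response": 66}

for strand, p in predicted.items():
    gap = actual[strand] - p
    # A gap of 5 points or more suggests the underlying assumption needs revisiting.
    note = "  <- large gap: revisit the underlying assumption" if abs(gap) >= 5 else ""
    print(f"{strand}: predicted {p}%, actual {actual[strand]}%, gap {gap:+d}%{note}")
```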

46 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 46 Using the Prediction & Comparison Process What are the benefits of approaching data in this manner?

47 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 47 Examining the Report Take a few minutes to look through the entire AFL report. Use the chart below to guide your thinking and conversation: for both the Performance Data and the OTL Data, note what pops out, the strengths and areas of improvement, and the questions you have.

48 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 48 Designing Interventions Assumptions must be examined because our interventions will be based on them. We must strive to correctly identify the causal factors. Don’t fall in love with any theory until you have other data. Use a strength-based approach to interventions.

49 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 49 Team Action Plan/Fishbone
Team Action Plan
–What are some areas of strength indicated within your data?
–What are some areas for improvement indicated within your data?
–Please consider all aspects of both reports, including the Opportunity to Learn Measures.
Fishbone
–At your table, analyze one strength and consider all contributing factors that led to that strength.
–Consider one area of improvement and transfer those elements from your area of strength (as applicable) that could contribute to improvement in this area.

50 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 50 Focusing on Improvement Thinking of the area of weakness you identified earlier, create a hypothesis that applies some of the positive elements from the area of strength to that area for improvement. For example:
–Similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.

51 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 51 Different Lenses What other types of data might be required to gain a clearer picture of how specific groups of students are doing? How might this data shed a different light on what has been discovered so far? To do this, Bernhardt’s triangulation process will be used.

52 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 52 Applying Data Collection and Triangulation Using the hypotheses created from the initial examination of the data, complete a more detailed analysis thinking about the four categories of data available – demographics, perceptions, student learning and school processes. You will be furnished with a data intersections template.
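As a rough sketch of how these intersections can be tracked, the snippet below tags each question with the data categories it crosses, echoing the worked examples on the slides that follow. The set-based representation is an illustration of the idea only, not Bernhardt's own notation.

```python
# Sketch: each refined question is tagged with the data categories it crosses.
D, P, SL, SP = "Demographics", "Perceptions", "Student Learning", "School Processes"

questions = {
    "Which reading strategies are teachers using?": {SP},
    "Which reading strategies do students favour?": {SP, P},
    "Which strategies produce the greatest comprehension gains?": {SP, P, SL},
    "Do those gains differ by grade level?": {SP, P, SL, D},
}

# Each added category narrows the question and points to new data sources.
for q, cats in questions.items():
    print(f"{q}  [{' x '.join(sorted(cats))}]")
```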

53 Hypothesis: Similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.
–Question: What types of strategies are used in Reading? (Place the question in the questions column.)
–What data is available or needed to answer the question? What data collection tools will be required?
–Intersections: colour in and name the intersection on the diagram (D = Demographics, P = Perceptions, SL = Student Learning, SP = School Processes).
Adapted from: Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

54 Hypothesis: Similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.
–Question: What types of strategies can be used in Reading? Which are teachers using? (Intersection: School Processes – SP. Identify the intersection within the question.)
–Available Data: processes from the curriculum guide; processes outlined in the AFL documents.
–Data Needed: survey teachers.
Adapted from: Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

55 Hypothesis: Similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.
–Question: What Reading strategies do students seem to favour? (Intersection: School Processes by Perceptions.)
–Available Data: student reading journal responses; student questionnaire data from AFL.
–Data Needed: student opinions gathered via questionnaire.
–What other intersections and data sources might enrich the analysis?
Adapted from: Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

56 Hypothesis: Similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.
–Question: Which Reading strategies from the curriculum produce the greatest gains in comprehension for students? Which of those were favoured by students? (Intersection: School Processes by Perceptions by Student Learning.)
–Available Data: student reading journal responses.
–Data Needed: pre- and post-test data for all reading strategies.
–What other intersections and data sources might enrich the analysis?
Adapted from: Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

57 Hypothesis: Similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.
–Question: Which Reading strategies from the curriculum produce the greatest gains in comprehension for students? Which of those were favoured? Are certain strategies more powerful at different grade levels? (Intersection: School Processes by Perceptions by Student Learning by Demographics.)
–Available Data: student reading journal responses by gender.
–Data Needed: pre- and post-test data for favoured reading strategies by grade level.
–What other intersections and data sources might enrich the analysis?
Adapted from: Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

58 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 58 Refining Questions
–Take each hypothesis and write it at the top of a worksheet.
–Write questions about the hypothesis in the questions column.
–In the left column, use the diagram to identify the intersections or triangulations implied within the question. What other intersections would increase the specificity of this question? If necessary, rewrite the question to reflect the new intersections or triangulations. Including an "over time" element in a question expands the data that might be accessed to answer it.
–For each question, identify the existing data source available or the tool that will be required to collect it.
–Create new questions using a variety of the intersections identified earlier.

59 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 59 Refining and Finalizing Questions Complete the process for each question created. When finished, evaluate the quality of the questions, then decide which should go forward and which should be abandoned. On the sheet provided, write the hypothesis and the refined questions that your group has decided to keep. These questions will be used to gather data as more in-depth goal statements are created, and will guide the work in your schools to improve student results.

60 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 60 Advancing Assessment Literacy Modules 17 modules designed to facilitate conversations and work with data for the improvement of instruction. www.spdu.ca
–Publications
–Advancing Assessment Literacy Modules
–Download a PDF of a PowerPoint and an accompanying lesson plan for use by education professionals in schools.

61

62 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 62 Reflection Individually, complete the following stems: –Key concepts I have learned... –Implications for me... –Questions I have...

63 Advancing Assessment Literacy Modules: Data Interpretation Workshop (February 2008) 63 Assessment: Linking Purposes to Outcomes
–Bring context and meaning to the math and reading assessment project results;
–Initiate reflection and discussion among school staff members related to the math and reading assessment results;
–Encourage school personnel to judiciously review and utilize different comparators when judging math and reading assessment results;
–Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,
–Provide an opportunity to discuss and plan around the data.

