
Data Interpretation I Workshop 2008 Writing Assessment for Learning.


1 Data Interpretation I Workshop 2008 Writing Assessment for Learning

2 Purposes for the Day
–Bring context and meaning to the writing assessment project results;
–Initiate reflection and discussion among school staff members related to the writing assessment results;
–Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;
–Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,
–Provide an opportunity to discuss and plan around the data.

3 Agenda
Understanding data: sources, categories and uses
Provincial Writing Assessment
–Conceptual Framework
–Comparators
–Student Performance Data
–Opportunity to Learn Data
–Standards and Cut Scores
Predicting Categories of Data
Action Planning
–Linking Data, Goals and Intervention
Closure

4 Synectics
Please complete the following statement: “Data use in schools is like... because...”
–Data use in schools is like molasses because it is slow and gets slower as it gets colder.
–Data use in schools is like molasses because it is sticky and can make a big mess!

5 A Data-Rich Environment
Wellman & Lipton (2004) state: “Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable.”
Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

6 International Data Sources
Programme for International Student Assessment (PISA)

7 National Data Sources
Pan-Canadian Assessment Program (PCAP)
Canadian Test of Basic Skills (CTBS)
Canadian Achievement Tests, Third Edition (CAT3)

8 Provincial Data Sources
Assessment for Learning (AFL)
–Opportunity to Learn Measures
–Performance Measures
Departmentals

9 Division Data Sources
Division-level rubrics
Division benchmark assessments

10 Local Data Sources
Cumulative folders
Teacher-designed evaluations
Portfolios
Routine assessment data

11 Nature of Assessment Data
From Understanding the numbers. Saskatchewan Learning
A continuum from Definitive to Indicative:
Individual – Classroom – School – Division – Provincial – National – International
Student evaluations (definitive) at one end; system evaluations (indicative) at the other.

12 Depth and Specificity of Knowledge
From Saskatchewan Learning. (2006). Understanding the numbers.
Individual – Classroom – School – Division – Provincial – National – International assessments
Moving along this continuum, assessments shift from in-depth knowledge of specific students toward little knowledge of specific students but in-depth knowledge of systems.

13 Using a Variety of Data Sources
Thinking about the data sources available, their nature and the depth of knowledge they provide, how might the information in each impact the decisions you make?
–What can you do with this data?
–What is its impact on classrooms?

14 Using a Variety of Data Sources
Data Sources | Uses | Impact on Classroom
Provincial AFL | |
Departmental | |

15 Using a Variety of Data Sources
Data Sources | Uses | Impact on Classroom
Provincial AFL | AFL data can be used as a snapshot of achievement at the school, school division and provincial level to inform planning at each level |
Departmental | |

16 Using a Variety of Data Sources
Data Sources | Uses | Impact on Classroom
Provincial AFL | AFL data can be used as a snapshot of achievement at the school, school division and provincial level to inform planning at each level | Long-term impact on planning as teachers work to capitalize on areas of strength and address areas for improvement
Departmental | |
Please refer to the “Using a Variety of Data Sources” template on p. 3 in your handout package as a guide for your discussion.

17 Assessment for Learning is a Snapshot
Results from a large-scale assessment are a snapshot of student performance.
–The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
–The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level.
Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2008)

18 Provincial Writing Assessment: Conceptual Framework – pp. 4–5
Colourful Thoughts
–As you read through the information on the Provincial Writing Assessment, use highlighters or sticky notes to track your thinking:
Wow! I agree with this.
Hmm! I wonder...
Yikes!
Adapted from Harvey, S. & Goudvis, A. (2007). Strategies that work.

19 Comparators: Types of Referencing – p. 6
Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in rubrics, this may be an area for further investigation and for planning intervention to improve student writing. (Detailed rubrics, OTL rubrics and test items can be found at www.education.gov.sk.ca)
Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention. (Figures 2b, 3c, 4a, 6b, 7b and 8b.)

20 Comparators: Types of Referencing
Experience- or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (E.g., comparing these results to current school data, or to the standards set by the panel.)
Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (E.g., tables comparing the school, division and province.)

21 Comparators: Types of Referencing
Longitudinal-referenced: Comparing how students perform relative to earlier years’ performance of students. Viewed across several years, assessment results and other evidence can identify trends and improvements. (This data will not appear until the next administration of this assessment.)

22 Opportunity-to-Learn Elements as Reported by Students
Propensity to Learn
–Using resources to explore models, generate ideas and assist the writing process
–Motivation, attitude and confidence
–Participation, perseverance and completion
–Reflection
Knowledge and Use of Before, During and After Writing Strategies
Home Support for Writing and Learning
–Encouragement and interaction
–Access to resources and assistance

23 Opportunity-to-Learn Elements as Reported by Teachers
Availability and Use of Resources
–Teacher as key resource (teacher as writer; use of curriculum; educational qualifications; professional development)
–Time
–Student resources
Classroom Instruction and Learning
–Planning focuses on outcomes
–Expectations and criteria are clearly outlined
–Variety of assessment techniques
–Writing strategies explicitly taught and emphasized
–Adaptation

24 Student Performance Outcome Results
Demonstration of the writing process
–Pre-writing
–Drafting
–Revision
Quality of writing product
–Message and content (focus; understanding and support; genre)
–Organization and coherence (introduction, conclusion, coherence)
–Language use (language and word choices; syntax and mechanics)

25 Standards
To help make meaningful longitudinal comparisons in future years, three main processes will be implemented.
1. Assessment items will be developed for each assessment cycle using a consistent table of specifications.
2. The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments.
3. Standards will be set for each assessment, so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

26 Opportunity-to-Learn and Performance Standards
In order to establish Opportunity-to-Learn and Performance standards for the 2008 Writing Assessment, three panels were convened (one for each assessed grade), consisting of teachers from a variety of settings and post-secondary academics including Education faculty. The panelists studied each genre from the 2008 assessment in significant detail and established expectations for writing process, narrative products and expository products, as well as opportunity to learn.

27 Thresholds of Adequacy and Proficiency
Beginning | Developing | Adequate | Proficient | Insightful

28 Thresholds of Adequacy and Proficiency
Threshold of Adequacy: 1.87 (Adequate)
Threshold of Proficiency: 3.92 (Proficient & Beyond)
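Applying the two thresholds above is a simple banding calculation once a mean rubric score is in hand. A minimal sketch (the function name, the 1–5 scale assumption, and the label for the below-adequacy band are ours, not from the assessment documentation):

```python
# Cut scores as reported for the 2008 Writing Assessment.
THRESHOLD_OF_ADEQUACY = 1.87
THRESHOLD_OF_PROFICIENCY = 3.92

def classify(mean_score):
    """Place a mean rubric score (assumed 1-5 scale) into a band.

    Scores at or above 3.92 are "Proficient & Beyond"; scores at or
    above 1.87 are "Adequate"; anything lower falls in the
    Beginning/Developing range (label assumed).
    """
    if mean_score >= THRESHOLD_OF_PROFICIENCY:
        return "Proficient & Beyond"
    if mean_score >= THRESHOLD_OF_ADEQUACY:
        return "Adequate"
    return "Beginning/Developing"
```

For example, a class mean of 2.4 would land in the Adequate band, since it clears 1.87 but not 3.92.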

29 Cut Scores
On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels:
Opportunity-to-Learn Elements (5-level scale): Excellent Standard; Sufficient Standard
Performance Component (Process – 3-level scale; Product – 6-level scale): Proficient Standard; Adequate Standard

33 Predicting: Card Stack and Shuffle
Individually: As you refer to the cut scores on page 4, create a stack of cards with some of your predictions about student outcomes in Narrative and Expository writing – consider each separately.
–Writing Process (prewriting, drafting, revising)
–Writing Product (message, organization and language choices)
E.g., I predict 85% of our Gr. 8s will meet the adequate standard or higher in Propensity to Learn and, of those, 20% will be proficient or higher, because our students are very comfortable with writer’s workshop processes, which we have emphasized for the last three years.
E.g., I predict 90% of our Gr. 5s will score adequate or higher on demonstration of writing process in narrative writing because of our whole-school emphasis on writing, especially with respect to narrative writing.

34 Predicting: Card Stack and Shuffle
As you complete each card, place it in the center of the table. As a group, shuffle the cards. In turn, each group member picks a card to read aloud to the table group. The group engages in dialogue or discussion about the items.
Guiding questions:
–With what parts of this prediction do you agree? Why?
–With what parts of this prediction do you disagree? Why?
–To what extent is this prediction generalizable to all the classrooms in your school?

35 Predictions
Considering all of the predictions, are there any themes or patterns emerging upon which you can all agree?
–Why might this be?

36 Comparisons
The completed tables are on page 7.
–What are you noticing about the data?
–What surprised you? Which of your predictions were confirmed? Which of your predictions were not confirmed?
Consider your assumptions as you discuss the results.
Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

37 Examining the Report
Take a few minutes to look through the entire AFL report. Use the chart below to guide your thinking and conversation. For both the Performance Data and the OTL Data, note:
–What pops out?
–Strengths, areas of improvement
–Questions???

38 Please return at 12:40

39 Local Level Sources of Data
While international, national and provincial sources of data can provide direction for school initiatives, the data collected at the local level provide the most detailed information regarding the students in classrooms.

40 Four Major Categories of Data: Demographics – p. 7
Local Data
–Descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc.
–Can disaggregate other data by demographic variables.
AFL – Opportunity-to-Learn Data
–Family/home support for student writing (encouragement and interaction; access to resources)
Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.
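Disaggregating other data by a demographic variable, as the slide suggests, amounts to grouping score records by that variable and summarizing each group. A minimal sketch using only the standard library (the record shape and field names are assumptions for illustration):

```python
from collections import defaultdict

def disaggregate(records, by):
    """Average a 'score' field across groups defined by one demographic field.

    records: list of dicts such as {"grade": 5, "gender": "F", "score": 3.2}
    by: the demographic field to group on, e.g. "grade" or "gender"
    Returns a dict mapping each group value to its mean score.
    """
    groups = defaultdict(list)
    for record in records:
        groups[record[by]].append(record["score"])
    return {group: sum(scores) / len(scores) for group, scores in groups.items()}
```

Calling disaggregate(records, "grade") yields one mean score per grade level; the same call with "gender" or an attendance band would surface gaps between those groups.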

41 Four Major Categories of Data: Student Learning
Local Data
–Describes outcomes in terms of standardized test results, grade averages, etc.
AFL
–Readiness-Related Opportunity-to-Learn Data (using resources to explore writing; student knowledge and use of writing strategies – before, during, after)
–Student performance outcomes: Writing 5, 8, 11 – Narrative and Expository (writing process; writing product)
Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

42 Four Major Categories of Data: Perceptions
Local Data
–Provides information regarding what students, parents, staff and community think about school programs and processes.
–This data is important because people act in congruence with what they believe.
AFL
–Readiness-Related Opportunity-to-Learn Data: commitment to learn (using resources; motivation & attitude; confidence; participation; perseverance & completion; reflection); knowledge and use of writing strategies
Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

43 Four Major Categories of Data: School Processes
Local Data
–What the system and teachers are doing to get the results they are getting.
–Includes programs, assessments, instructional strategies and classroom practices.
AFL – Classroom-Related Opportunity-to-Learn Data
–Instruction and learning (planning and reflection; expectations and assessment; focus on writing strategies; adaptations)
–Availability and use of resources (teacher; time; resources for students and teachers)
Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

44 What Data are Useful and Available? – p. 8
Think about the goals/priorities set within your school or school division regarding student writing. Using the supplied template, begin to catalogue the data you already have and the data you need in order to better address the goals that have been set. An example follows on the next slide.

45 Example
Goal: Students will consciously use writing strategies for all genres.
Questions: What data do you have to answer these questions? What other data do you need to gather?
–Demographics: Grade levels teaching writing strategies; number of teachers teaching writing skills
–Perceptions: Student journals regarding writing habits; parent feedback; student perception of their success
–Student Learning: Division writing benchmarks; AFL for Gr. 5, 8, 11; student use of writing skills; common writing assessments at all grades
–School Processes: Current instructional practice in teaching writing; writing skills explicitly taught in each subject
Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

46 Designing Interventions
–Assumptions must be examined because our interventions will be based on them.
–We must strive to correctly identify the causal factors.
–Don’t fall in love with any theory until you have other data.
–Use a strength-based approach to interventions.

47 Team Action Plan
Please turn to page 9 in your handout package.
–What are some areas of strength indicated within your data?
–What are some areas for improvement indicated within your data?
Please consider all aspects of the report, including the Opportunity to Learn Measures.

48 Fishbone Analysis: Strengths – p. 10
At your table, analyze one strength and consider all contributing factors that led to that strength. Example – strength: Writing Process; contributing factors:
–All classrooms using Writers’ Workshop
–PLC read Strategies that Work
–Majority of PD focused on writing
–Teachers explicitly teaching pre-writing strategy in all subjects

49 Fishbone Analysis: Area for Improvement – p. 11
Identify one area for improvement. What elements from your area of strength could contribute to improvement in this area?
–E.g., we did well in the process of writing because all teachers are explicitly teaching pre-writing across the curriculum with every writing activity.
–So, we need to explicitly teach how to write introductions, conclusions, and transitions in writing in all subject areas.

50 Setting a Goal – p. 12
Based on your previous discussions regarding strengths and areas for improvement, write a goal statement your team will work on over the coming year.
–E.g., for the 2010 AFL in Writing, all students will score at level 4 and above with respect to their use of before, during and after writing strategies.
Write your goal on the provided bubble map. This is a template – add more bubbles if you need them! You do not have to fill in all the bubbles. Brainstorm possible strategies for meeting that goal. You may need to use different strategies at different grade levels.

51 Research Instructional Strategies – p. 13
Once you have completed brainstorming strategies, you will want to conduct some research on the effectiveness of those strategies. Available resources could include a variety of websites, the professional collection at the Stewart Resources Centre and the McDowell Foundation (www.stf.sk.ca).

52 Impact/Feasibility – p. 14
Once you have completed your research, conduct an impact/feasibility analysis of the strategies you have identified. Impact refers to the degree to which a strategy will make a difference in the learning of students. A high-impact strategy will make the greatest difference in learning for the broadest population of students. Feasibility refers to the practical supports that need to be in place, such as time, funding, scheduling, etc.
Strategy | Impact | Feasibility
Activate prior knowledge before writing new texts. | High |
Adopt new curriculum materials | Medium | Low
When done, choose the strategy that will have the greatest impact and is most feasible to implement.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
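The impact/feasibility analysis above can be sketched as a simple ranking: order candidate strategies by impact first, feasibility second. (The numeric mapping of Low/Medium/High and the function shape are our assumptions; the workshop leaves the actual judgment to the team.)

```python
# Assumed ordinal mapping for the workshop's Low/Medium/High ratings.
LEVELS = {"Low": 1, "Medium": 2, "High": 3}

def rank_strategies(strategies):
    """Sort (name, impact, feasibility) tuples so that high-impact,
    highly feasible strategies come first."""
    return sorted(
        strategies,
        key=lambda s: (LEVELS[s[1]], LEVELS[s[2]]),
        reverse=True,
    )
```

Applied to the example table, "Activate prior knowledge before writing new texts" (High impact) would rank ahead of "Adopt new curriculum materials" (Medium impact, Low feasibility).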

53 Data-Driven Decision Making Improvement Cycle – p. 16
1. Find the data – “Treasure Hunt”
2. Data Analysis and Strength Finder
3. Needs Analysis
4. Goal Setting and Revision
5. Identify Specific Strategies to Achieve Goals
6. Determine Results Indicators
7. Action Plan, Schedule, REVIEW
(White, 2005)

54 Four Tasks of Action Planning – p. 17
1. Decide on strategies for improvement.
2. Agree on what your plan will look like in classrooms.
3. Put the plan down on paper.
4. Plan how you will know if the plan is working.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

55 Put the Plan Down on Paper
By documenting team members’ roles and responsibilities and specifying the concrete steps that need to occur, you build internal accountability for making the plan work. Identifying the professional development time and instruction your team will need and including it in your action plan lets teachers know they will be supported through the process of instructional improvement.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

56 Writing Out the Plan – p. 18
Using the supplied “Action Plan” template, begin to draft the details of the plan as you work toward achieving your goal. The supplied template is only a suggestion – you may adapt it or create a design of your own.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

57 Plan How You Will Know if the Plan is Working
Before implementing your plan, it is important to determine what type of data you will need to collect in order to understand whether students are moving towards the goal.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

58 Different Lenses – p. 20
What types of data might be required to gain a clearer picture of how specific groups of students are doing?
–Consider the four categories of data available – demographics, perceptions, student learning and school processes – as you explore what types of data you need.

59 Short-, Medium-, and Long-Term Data
Short-Term Data
–Gathered daily or weekly via classroom assessments and/or observations.
Medium-Term Data
–Gathered at periodic intervals via common department, school, or division assessments. These are usually referred to as benchmark assessments.
Long-Term Data
–Gathered annually via standardized provincial, national, or international assessments.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

60 Short- and Medium-Term Assessments
Referring to your action plan, identify what types of short- and medium-term assessments would best measure the progress of students as they work toward the goal. It may be useful to plan the medium-term assessments first to provide a framework within which short-term assessments would fit. Use the provided Short- and Medium-Term Assessment Planning template to plan when these might be administered.

61 Short-, Medium-, and Long-Term Assessments
Your school or school division has set a goal to improve students’ quality of writing, particularly as it relates to organization and coherence.
–Short-term: Teachers’ in-class assessment strategies provide formative feedback to students in these areas – writing effective introductions and conclusions, as well as transitions.
–Medium-term: Writing benchmark prompts are developed for each grade level in the school and administered at the end of each reporting period. Teachers collaboratively grade the papers using the rubrics from the Assessment for Learning program and analyze the results together. Following the common assessment, students who have not achieved the set benchmark receive additional instruction and formative assessment as they work towards the goal.
–Long-term: In 2010, students are again assessed on their writing with the provincial AFL program.

62 Advancing Assessment Literacy Modules – p. 21
17 modules designed to facilitate conversations and work with data for improvement of instruction.
www.spdu.ca – under Publications, see Advancing Assessment Literacy Modules. Download a PDF of a PowerPoint and accompanying lesson plan for use by education professionals in schools. The PPT of this workshop will also be available on the same site.


64 Reflection
What did you discover today that surprised you? What will you take with you from today?

65 Evaluation
–Bring context and meaning to the writing assessment project results;
–Initiate reflection and discussion among school staff members related to the writing assessment results;
–Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;
–Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,
–Provide an opportunity to discuss and plan around the data.


