Data Interpretation I Workshop 2008 Writing Assessment for Learning

2 Purposes for the Day
Bring context and meaning to the writing assessment project results;
Initiate reflection and discussion among school staff members related to the writing assessment results;
Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;
Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,
Provide an opportunity to discuss and plan around the data.

3 Agenda
Understanding data – sources, categories and uses
Provincial Writing Assessment
– Conceptual Framework
– Comparators
– Student Performance Data
– Opportunity to Learn Data
– Standards and Cut Scores
Predicting
Categories of Data
Action Planning
– Linking Data, Goals and Intervention
Closure

4 Synectics Please complete the following statement: “Data use in schools is like... because...” Data use in schools is like molasses because it is slow and gets slower as it gets colder. Data use in schools is like molasses because it is sticky and can make a big mess!

5 A Data-Rich Environment Wellman & Lipton (2004) state: Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable. Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

6 International Data Sources
Programme for International Student Assessment (PISA)

7 National Data Sources
Pan-Canadian Assessment Program (PCAP)
Canadian Test of Basic Skills (CTBS)
Canadian Achievement Tests (CAT3)

8 Provincial Data Sources
Assessment for Learning (AFL)
– Opportunity to Learn Measures
– Performance Measures
Departmentals

9 Division Data Sources
Division-level rubrics
Division benchmark assessments

10 Local Data Sources
Cumulative folders
Teacher-designed evaluations
Portfolios
Routine assessment data

11 Nature of Assessment Data
From Saskatchewan Learning. (2006). Understanding the numbers.
(Figure: assessments arranged on a continuum from definitive to indicative – Individual, Classroom, School, Division, Provincial, National, International. Assessments near the individual end are definitive student evaluations; those near the international end are indicative system evaluations.)

12 Depth and Specificity of Knowledge
From Saskatchewan Learning. (2006). Understanding the numbers.
(Figure: the same continuum of assessments – Individual, Classroom, School, Division, Provincial, National, International – arranged by depth of knowledge. Individual and classroom assessments offer in-depth knowledge of specific students; provincial, national and international assessments offer little knowledge of specific students but in-depth knowledge of systems.)

13 Using a Variety of Data Sources
Thinking about the data sources available, their nature and the depth of knowledge they provide, how might the information in each impact the decisions you make?
– What can you do with this data?
– What is its impact on classrooms?

14 Using a Variety of Data Sources
Data Sources | Uses | Impact on Classroom
Provincial: AFL, Departmental | |

15 Using a Variety of Data Sources
Data Sources | Uses | Impact on Classroom
Provincial: AFL, Departmental | AFL data can be used as a snapshot of achievement at the school, school division and provincial level to inform planning at each level |

16 Using a Variety of Data Sources
Data Sources | Uses | Impact on Classroom
Provincial: AFL, Departmental | AFL data can be used as a snapshot of achievement at the school, school division and provincial level to inform planning at each level | Long-term impact on planning as teachers work to capitalize on areas of strength and address areas for improvement
Please refer to the "Using a Variety of Data Sources" template on p. 3 in your handout package as a guide for your discussion.

17 Assessment for Learning is a Snapshot
Results from a large-scale assessment are a snapshot of student performance.
– The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
– The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division levels, and should be considered cautiously at the school level.
Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2008)

18 Provincial Writing Assessment: Conceptual Framework – pp. 4 & 5
Colourful Thoughts
As you read through the information on the Provincial Writing Assessment, use highlighters or sticky notes to record your thinking:
Wow! I agree with this.
Hmm! I wonder...
Yikes!
Adapted from Harvey, S. & Goudvis, A. (2007). Strategies that work.

19 Comparators: Types of Referencing – p. 6
Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in the rubrics, this may be an area for further investigation and for planning interventions to improve student writing. (Detailed rubrics, OTL rubrics and test items can be sourced at ...)
Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention. (Figures 2b, 3c, 4a, 6b, 7b and 8b.)

20 Comparators: Types of Referencing
Experience- or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (E.g., comparing these results to current school data, or to the standards set by the panel.)
Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note the cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (E.g., tables comparing the school, division and province.)

21 Comparators: Types of Referencing
Longitudinal-referenced: Comparing how students perform relative to the performance of students in earlier years. Viewed across several years, assessment results and other evidence can identify trends and improvements. (This data will not appear until the next administration of this assessment.)

22 Opportunity-to-Learn Elements as Reported by Students
Propensity to Learn
– Using resources to explore models, generate ideas and assist the writing process
– Motivation, attitude and confidence
– Participation, perseverance and completion
– Reflection
Knowledge and Use of Before, During and After Writing Strategies
Home Support for Writing and Learning
– Encouragement and interaction
– Access to resources and assistance

23 Opportunity-to-Learn Elements as Reported by Teachers
Availability and Use of Resources
– Teacher as key resource: teacher as writer, use of curriculum, educational qualifications, professional development
– Time
– Student resources
Classroom Instruction and Learning
– Planning focuses on outcomes
– Expectations and criteria are clearly outlined
– Variety of assessment techniques
– Writing strategies explicitly taught and emphasized
– Adaptation

24 Student Performance Outcome Results
Demonstration of the writing process
– Pre-writing
– Drafting
– Revision
Quality of writing product
– Message and content: focus; understanding and support; genre
– Organization and coherence: introduction, conclusion, coherence
– Language use: language and word choices; syntax and mechanics

25 Standards
To help make meaningful longitudinal comparisons in future years, three main processes will be implemented:
1. Assessment items will be developed for each assessment cycle using a consistent table of specifications.
2. The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments.
3. Standards will be set for each assessment, so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

26 Opportunity-to-Learn and Performance Standards
In order to establish Opportunity-to-Learn and Performance standards for the 2008 Writing Assessment, three panels were convened (one for each assessed grade), consisting of teachers from a variety of settings and post-secondary academics, including Education faculty. The panelists studied each genre from the 2008 assessment in significant detail and established expectations for the writing process, narrative products and expository products, as well as for opportunity to learn.

27 Thresholds of Adequacy and Proficiency
Beginning – Developing – Adequate – Proficient – Insightful

28 Thresholds of Adequacy and Proficiency
(Figure: the five-level scale divided by two cut points – the Threshold of Adequacy, above which students are Adequate, and the Threshold of Proficiency, above which students are Proficient & Beyond.)

29 Cut Scores
On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels:
Opportunity-to-Learn Elements (5-level scale): Excellent Standard, Sufficient Standard
Performance Component (Process – 3-level scale; Product – 6-level scale): Proficient Standard, Adequate Standard
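To make the mechanics of a cut-score table concrete, here is a minimal Python sketch of how a percentage-correct result is checked against cut scores. The values and labels below are hypothetical placeholders, not the actual 2008 cut scores, which appear on page 4 of the detailed reports.

```python
# Minimal sketch: checking percentage-correct results against cut scores.
# The cut-score values here are hypothetical placeholders.
CUT_SCORES = [  # ordered from highest standard to lowest
    (80.0, "Excellent Standard"),
    (60.0, "Sufficient Standard"),
]

def classify(percent_correct: float) -> str:
    """Return the highest standard whose cut score the result meets."""
    for cut, label in CUT_SCORES:
        if percent_correct >= cut:
            return label
    return "Below Sufficient Standard"

for score in (85.0, 67.5, 42.0):
    print(f"{score:5.1f}% -> {classify(score)}")
```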


33 Predicting: Card Stack and Shuffle
Individually: Referring to the cut scores on page 4, create a stack of cards with some of your predictions about student outcomes in Narrative and Expository writing – consider each separately.
– Writing Process (pre-writing, drafting, revising)
– Writing Product (message, organization and language choices)
E.g., I predict 85% of our Gr. 8 students will meet the adequate standard or higher in Propensity to Learn and, of those, 20% will be proficient or higher, because our students are very comfortable with writer's workshop processes, which we have emphasized for the last three years.
E.g., I predict 90% of our Gr. 5 students will score adequate or higher on demonstration of the writing process in narrative writing because of our whole-school emphasis on writing, especially with respect to narrative writing.

34 Predicting: Card Stack and Shuffle
As you complete each card, place it in the centre of the table. As a group, shuffle the cards. In turn, each group member picks a card to read aloud to the table group. The group engages in dialogue or discussion about the items.
Guiding questions:
– With what parts of this prediction do you agree? Why?
– With what parts of this prediction do you disagree? Why?
– To what extent is this prediction generalizable to all the classrooms in your school?

35 Predictions
Considering all of the predictions, are there any themes or patterns emerging upon which you can all agree?
– Why might this be?

36 Comparisons
The completed tables are on page 7.
– What are you noticing about the data?
– What surprised you?
– Which of your predictions were confirmed? Which of your predictions were not confirmed?
Consider your assumptions as you discuss the results.
Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

37 Examining the Report
Take a few minutes to look through the entire AFL report. Use the chart below to guide your thinking and conversation.
For both Performance Data and OTL Data, note:
– What pops out?
– Strengths, areas for improvement
– Questions???

38 Please return at 12:40
(Cartoon caption: "I'd trade, but peanut butter sticks to my tongue stud.")

39 Local-Level Sources of Data
While international, national and provincial sources of data can provide direction for school initiatives, the data collected at the local level provide the most detailed information about the students in classrooms.

40 Four Major Categories of Data: Demographics – p. 7
Local Data
– Descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc.
– Other data can be disaggregated by demographic variables.
AFL – Opportunity-to-Learn Data
– Family/home support for student writing: encouragement and interaction; access to resources
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.
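Because demographic data are mainly used to disaggregate the other categories, a small illustration may help. This is a sketch only, assuming a hypothetical pandas DataFrame; the column names and values are invented and do not come from the AFL reports.

```python
import pandas as pd

# Hypothetical writing results with demographic columns attached.
results = pd.DataFrame({
    "grade":       [5, 5, 8, 8, 8, 11],
    "gender":      ["F", "M", "F", "M", "F", "M"],
    "writing_pct": [72, 65, 80, 58, 74, 69],
})

# Disaggregate: mean writing result broken out by grade and gender.
print(results.groupby(["grade", "gender"])["writing_pct"].mean())
```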

41 Four Major Categories of Data: Student Learning
Local Data
– Describes outcomes in terms of standardized test results, grade averages, etc.
AFL
– Readiness-related Opportunity-to-Learn Data: using resources to explore writing; student knowledge and use of writing strategies (before, during, after)
– Student performance outcomes: Writing 5, 8, 11 – Narrative and Expository; writing process; writing product
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

42 Four Major Categories of Data: Perceptions
Local Data
– Provides information regarding what students, parents, staff and community think about school programs and processes.
– This data is important because people act in congruence with what they believe.
AFL
– Readiness-related Opportunity-to-Learn Data: commitment to learn (using resources; motivation and attitude; confidence; participation; perseverance and completion; reflection); knowledge and use of writing strategies
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

43 Four Major Categories of Data: School Processes
Local Data
– What the system and teachers are doing to get the results they are getting.
– Includes programs, assessments, instructional strategies and classroom practices.
AFL
– Classroom-related Opportunity-to-Learn Data: instruction and learning (planning and reflection; expectations and assessment; focus on writing strategies; adaptations); availability and use of resources (teacher; time; resources for students and teachers)
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

44 What Data are Useful and Available? P. 8 Think about the goals/priorities set within your school or school division regarding student writing. Using the supplied template, begin to catalogue the data you already have and the data you need in order to better address the goals that have been set. An example follows on the next slide.

45 Goal: Students will consciously use writing strategies for all genres.
Questions: What data do you have to answer your questions? What other data do you need to gather?
Demographics – Have: grade levels teaching writing strategies. Need: number of teachers teaching writing skills.
Perceptions – Have: student journals regarding writing habits; parent feedback. Need: student perception of their success.
Student Learning – Have: division writing benchmarks; AFL for Gr. 5, 8, 11. Need: student use of writing skills; common writing assessments at all grades.
School Processes – Have: current instructional practice in teaching writing. Need: writing skills explicitly taught in each subject.
Bernhardt, V. L. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

46 Designing Interventions Assumptions must be examined because our interventions will be based on them. We must strive to correctly identify the causal factors. Don’t fall in love with any theory until you have other data. Use a strength-based approach to interventions.

47 Team Action Plan Please turn to page 9 in your handout package. What are some areas of strength indicated within your data? What are some areas for improvement indicated within your data? Please consider all aspects of the report including the Opportunity to Learn Measures.

48 Fishbone Analysis: Strengths – p. 10
At your table, analyze one strength and consider all the contributing factors that led to that strength.
Example strength: Writing Process. Contributing factors:
– All classrooms using Writers' Workshop
– PLC read Strategies that Work
– Majority of PD focused on writing
– Teachers explicitly teaching a pre-writing strategy in all subjects

49 Fishbone Analysis: Area for Improvement – p. 11
Identify one area for improvement. What elements from your area of strength could contribute to improvement in this area?
– E.g., we did well in the writing process because all teachers are explicitly teaching pre-writing across the curriculum with every writing activity.
– So, we need to explicitly teach how to write introductions, conclusions and transitions in all subject areas.

50 Setting a Goal – p. 12
Based on your previous discussions regarding strengths and areas for improvement, write a goal statement your team will work on over the coming year.
E.g., For the 2010 AFL in Writing, all students will score at level 4 or above with respect to their use of before, during and after writing strategies.
Write your goal on the provided bubble map. This is a template – add more bubbles if you need them! You do not have to fill in all the bubbles.
Brainstorm possible strategies for meeting that goal. You may need to use different strategies at different grade levels.

51 Research Instructional Strategies – p. 13
Once you have completed brainstorming strategies, you will want to conduct some research on the effectiveness of those strategies. Available resources could include a variety of websites, the professional collection at the Stewart Resources Centre and the McDowell Foundation (...).

52 Impact/Feasibility – p. 14
Once you have completed your research, conduct an impact/feasibility analysis of the strategies you have identified. Impact refers to the degree to which a strategy will make a difference in the learning of students. A high-impact strategy will make the greatest difference in learning for the broadest population of students. Feasibility refers to the practical supports that need to be in place, such as time, funding, scheduling, etc.
Strategy | Impact | Feasibility
Activate prior knowledge before writing new texts | High |
Adopt new curriculum materials | Medium | Low
When done, choose the strategy that will have the greatest impact and is most feasible to implement.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
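As a rough illustration of the impact/feasibility screen, the sketch below ranks strategies by impact first and feasibility second. The ratings are illustrative only (the feasibility of the first strategy is an assumed value, since the slide's table leaves it blank); in practice a team would supply its own ratings based on its research.

```python
# Illustrative impact/feasibility screen; the ratings below are made up.
RATING = {"Low": 1, "Medium": 2, "High": 3}

strategies = [
    # (strategy, impact, feasibility) -- hypothetical ratings
    ("Activate prior knowledge before writing new texts", "High", "High"),
    ("Adopt new curriculum materials", "Medium", "Low"),
]

# Rank by impact, breaking ties with feasibility, then take the top one.
best = max(strategies, key=lambda s: (RATING[s[1]], RATING[s[2]]))
print(f"Best candidate: {best[0]} (impact={best[1]}, feasibility={best[2]})")
```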

53 Data-Driven Decision Making Improvement Cycle
1. Find the Data – "Treasure Hunt"
2. Data Analysis and Strength Finder
3. Needs Analysis
4. Goal Setting and Revision
5. Identify Specific Strategies to Achieve Goals
6. Determine Results Indicators
7. Action Plan, Schedule, REVIEW
(White, 2005)

54 Four Tasks of Action Planning
1. Decide on strategies for improvement.
2. Agree on what your plan will look like in classrooms.
3. Put the plan down on paper.
4. Plan how you will know if the plan is working.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

55 Put the Plan Down on Paper
By documenting team members' roles and responsibilities and specifying the concrete steps that need to occur, you build internal accountability for making the plan work.
Identifying the professional development time and instruction your team will need, and including it in your action plan, lets teachers know they will be supported through the process of instructional improvement.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

56 Writing Out the Plan – p. 18
Using the supplied "Action Plan" template, begin to draft the details of the plan as you work toward achieving your goal. The supplied template is only a suggestion – you may use another of your own design.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

57 Plan How You Will Know if the Plan is Working
Before implementing your plan, it is important to determine what type of data you will need to collect in order to understand whether students are moving toward the goal.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

58 Different Lenses – p. 20
What types of data might be required to gain a clearer picture of how specific groups of students are doing?
– Consider the four categories of data available – demographics, perceptions, student learning and school processes – as you explore what types of data you need.

59 Short-, Medium-, and Long-Term Data
Short-Term Data – gathered daily or weekly via classroom assessments and/or observations.
Medium-Term Data – gathered at periodic intervals via common department, school, or division assessments. These are usually referred to as benchmark assessments.
Long-Term Data – gathered annually via standardized provincial, national, or international assessments.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

60 Short- and Medium-Term Assessments Referring to your action plan, identify what types of short- and medium-term assessments would best measure the progress of students as they work toward the goal. It may be useful to plan the medium-term assessments first to provide a framework within which short-term assessments would fit. Use the provided Short- and Medium-Term Assessment Planning template to plan when these might be administered.

61 Short-, Medium-, and Long-Term Assessments
Your school or school division has set a goal to improve students' quality of writing, particularly as it relates to organization and coherence. Teachers' in-class assessment strategies provide formative feedback to students in these areas – writing effective introductions and conclusions, as well as transitions. Writing benchmark prompts are developed for each grade level in the school and administered at the end of each reporting period. Teachers collaboratively grade the papers using the rubrics from the Assessment for Learning program and analyze the results together.
– Following the common assessment, students who have not achieved the set benchmark receive additional instruction and formative assessment as they work toward the goal.
In 2010, students are again assessed on their writing through the provincial AFL program.
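One way to operationalize the benchmark step in this example is sketched below: after teachers grade a common assessment, students below the set benchmark are flagged for additional instruction. The student names and the benchmark level are hypothetical, not drawn from the AFL program.

```python
# Minimal sketch: flag students below a common benchmark for follow-up.
BENCHMARK = 3  # hypothetical rubric level representing "adequate"

rubric_levels = {"Student A": 4, "Student B": 2, "Student C": 3, "Student D": 1}

needs_support = [name for name, level in rubric_levels.items()
                 if level < BENCHMARK]
print("Additional instruction and formative assessment:",
      ", ".join(needs_support))
```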

62 Advancing Assessment Literacy Modules – p. 21
Modules designed to facilitate conversations and work with data for the improvement of instruction.
– Under Publications > Advancing Assessment Literacy Modules, download a PDF of each PowerPoint and an accompanying Lesson Plan for use by education professionals in schools.
The PPT of this workshop will also be available on the same site.

64 Reflection What did you discover today that surprised you? What will you take with you from today?

65 Evaluation
Bring context and meaning to the writing assessment project results;
Initiate reflection and discussion among school staff members related to the writing assessment results;
Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;
Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,
Provide an opportunity to discuss and plan around the data.