
Presentation on theme: "Assessment: Where we’ll be by the end of our day" — Presentation transcript:

1 Assessment: Where we’ll be by the end of our day
"We have not succeeded in answering all our questions. In some ways we are as confused as ever, but we believe we are confused on a higher level and about more important things." (attributed, possibly, to Enrico Fermi)

2 Assessments will have to be “tuned” to how we now teach
Shifts in teaching > Shifts in learning opportunities > Shifts in assessment

3 GAPS: What do we already know about summative assessment?
What’s a gap or challenge I have about constructing and administering summative assessments?

4 Different tools: Formative vs Summative
Teachers use formative assessments to collect information they need to guide their instruction and that students need to improve their learning. A formative assessment might be as simple as a quick comprehension check, a pop quiz, a classroom discussion, exit tickets, etc. Formative assessments let teachers:
• check on individual students’ understanding of what you are teaching;
• get insight into students’ thinking about science concepts and identify misunderstandings;
• make decisions about reteaching material or pacing the instruction, or help students evaluate and revise their own work.
Teachers use summative assessments at the end of a unit or a course, typically a test, project portfolio, performance assessment, or essays. Summative assessments are used for:
• assigning grades to individual students;
• informing parents about their children’s progress;
• providing evidence of how well a change—such as a revised curriculum, a new approach to professional development, or some other policy—has worked.

4 Vocabulary
ITEM FORMATS
• Multiple choice
• Constructed response (short and extended)
• Short and extended performance tasks
Item types on the WCAS test:
• Multiple choice
• Multiple select
• Editing task with choice
• Hot text
• Table input
• Simulation response
• Grid items (drag and drop)
• Short answer
HOW ASSESSMENT RELATES TO CURRICULUM
• Curriculum-embedded
• Stand-alone
• Tasks that extend what was in the curriculum (Yellowstone > climate change)

6 Measuring three-dimensional learning: it develops over time
Across all three dimensions—science and engineering practices, disciplinary core ideas, and crosscutting concepts—use a variety of tasks and challenges to give students multiple opportunities to learn and multiple ways to demonstrate what they have learned.
Example with argument:
• formal instruction about the parts of, and reasons for, this practice
• students rehearse and get feedback on different parts of the argument
• embedding components of argument in lab activities
• the teacher makes her thinking explicit when she is modeling argument
• using claims and evidence as part of exit slips and entry tasks
• integrating argument into performance assessments, summative assessments…

7 Science practices, what are their components?
Will you create your own practice-based assessment items, or seek out and modify existing items?

8 Poorly designed summative assessments
In groups of 3: read, analyze, and be prepared to share out what you see as significant.

9 Non 3D assessments Figure 4 presents the problem of oil spills and their economic and ecological impacts, and shows some data relevant to the problem. But item (a) asks students to interpret the graph without using any content knowledge. Moreover, this basic level of interpretation, reading a bar graph, would not be considered aligned to a high school practice. Items (b)-(f) demand only content knowledge with no data analysis or other science practices.

10 Non 3D assessments Although the task in Figure 5 (Ohio, Grade 8) presents physical science content about the motion of a pendulum in the context of an investigation, the task is not multidimensional. Indeed, a student could identify a pattern in the data table without knowing anything about the phenomenon being investigated. Furthermore, there is a second single-dimensional route to the answer: a student who knew the physical science concept being probed could answer the question without the data table, again making the task single-dimensional, this time drawing only on content knowledge. Although data analysis is one of the science and engineering practices, alignment to that practice should require more than selecting the correct answer based on a pattern in a data table; for instance, if this task asked students to analyze the data and to cite these data to describe a pattern, it would become a two-dimensional item (identifying patterns is a crosscutting concept).

11 Non 3D assessments (I disagree with the commentary on this one)
Assessments can encourage and help communicate this focus on deep learning of a smaller set of ideas if they closely adhere to the boundaries around each core idea. However, many existing items fall outside of these boundaries. The item shown in Figure 11 asks students to label three layers of the earth, recall a characteristic of each layer, and name a way that two of those layers interact. The closest DCI, ESS2.A (below), is about the flow of energy and matter as a driver of large earth system processes. Knowledge of the names of the layers of the earth might be useful for discussing these system processes, but focusing on the detailed factual knowledge required for this question distracts from the more significant objectives of performance expectation MS-ESS1-4 shown beneath.

12 3D assessments In this task, students watch a short video of a phenomenon: dye-coated candies put into water at different temperatures. Students draw models and write an explanation to show why the dye on the candies spread differently at the different temperatures. This two-part task requires that students use their physical science knowledge to develop a model that shows the cause of a phenomenon (DCI and science practice) and to construct a written explanation for the phenomenon (DCI and science practice). Also embedded in this task is a crosscutting concept, cause and effect: mechanism and explanation. In a fully three-dimensional task, this element would not be left embedded; the task would explicitly ask students to include the mechanism that caused the phenomenon they observed.

13 3D assessments An example of a task that deeply explores a foundational concept is shown in Figure 12. In this task, students clarify an argument about what makes bubbles in boiling water. Students also combine content with elements of argumentation by articulating the underlying reasoning of an argument and constructing a counter-argument using evidence. This task adheres closely to probing only content required to evaluate students’ proficiency with the big idea central to the DCI PS1.A: Structure and Properties of Matter.

14 Using tasks for both instruction & assessment: Climate change / High school
Practices: Analyzing and interpreting data; using a model to predict phenomena
Crosscutting concepts: Systems and system models
Disciplinary core ideas: Ecosystems: interactions, energy, and dynamics [LS2]; Earth and human activity [ESS3-5]
This example illustrates a way to adapt a task so that it can be used for instruction and then, later, to assess learning. It uses computer software that makes it possible to enhance the activity in interesting ways, but the ideas could be applied without this technology.
Question for students: In Future 3, would climate change impact your focal species?

15 Climate change: High school assessment
Partner up and read/skim pp. 74 and 75. How might the resources and challenges described on these two pages be used for instruction AND assessment?
Look at Pp : Can you "steal" the right side of the rubric for your own use in ANY task where you ask for explanations? (SEP rubrics are in our shared folder.)
No need to read p. 79.

16 Creating your own performance assessment
Step 1: Will you use activities, materials, and a scenario that your students have already been studying, or create a new set?
Step 2: Identify or create the Performance Expectation.
Step 3: Identify the specific parts of the practice and DCI that will be "built into" the assessment questions.
Step 4: Create the scenario, with the information, instructions, resources, and scaffolding the student will use, and create question(s) or prompt(s) aligned with the practice and DCIs.

17 Step 5: Create a rubric with multiple criteria aligned to the DCI and science practice, and levels of performance for each criterion. (This rubric is only for Question 1.)
Step 6: Create possible student responses at each performance level. This helps scoring.
Step 7: What additional test items must be developed to determine whether students understand particular DCIs?

18 BOX 2-1 SCORING RUBRIC FOR ASSESSMENT TASK
Solving the mystery: Inspector Bio wants to know what you have figured out about the oxygen that is missing from the air you exhale. Explain to her where the oxygen goes, what uses it, and why. Write a scientific explanation with a claim, sufficient evidence, and reasoning.
Level 0: Missing, or only generic reasons for survival (e.g., to breathe, for living).
Level 1: Oxygen used to get energy, or used with food for energy; no physical science mechanism presented for getting energy.
Level 2: Oxygen used in a chemical reaction (or "burning") to get energy, but an incomplete description of the physical science ideas of matter and energy (e.g., "burns the oxygen" without mentioning food or glucose, or "reacts with glucose" with no account of energy).
Level 3: Full account, using physical science ideas, including both the matter and energy accounts: oxygen is combined in a chemical reaction with food or glucose, which includes a conversion of the stored energy in food to forms usable by the cells.
SOURCE: Adapted from Krajcik et al. (2013).
Rubrics can be simple, like this one on respiration. It seems to address only the DCI, not the science practice of explanation with evidence.

19 TABLE 3-2 Rubric for Assessment of Genetic Processes
Level: Random mutations
Description: Student describes one or more of the random genetic mechanisms by which new traits arise.
Example student response: A species changes over time because of random mutations and gene shuffling. Random mutations can cause a change in a species' gene pool. And gene shuffling is different combinations of genes that come from the parents. If species are separated long enough, the species' gene pool changes.

Level: Environment causes change with genetic basis
Description: Changes occur as a result of genetic mutations in direct response to the environment and/or are not random.
Example student response: Animals mutate to fit in with their natural surroundings. So becoming darker helps to keep them in camouflage.

Level: Unclear or vague
Description: Student refers to mutations or random changes leading to new traits but does not describe a mechanism for how that happens.
Example student response: If a mutation happens it can [affect] the whole species by creating a variety of differences, from color change to more or less help against gathering food and protecting against predators.

Level: Trait not present
Description: Differences in traits not described at the genetic level, or denial of change in genes.
Example student response: I picked my answer because none of the [others] seemed all the way correct.

Another rubric for a DCI (mutations). The levels of performance are qualitatively different.
SOURCE: Furtak and Heredia (2014, p ). Reprinted with permission from John Wiley & Sons, Inc.

20 Resources
• Stanford SNAP Project on Assessment
• Rubrics for Science Practices
• Seeing Students Learn Science (National Academy Press)

21 Gallery walk
Tape up assessment docs on the wall. Options to write:
• Is there a potential connection in my assessment to what we learned today?
• What I’d like to improve on
• What resources, advice, or examples I need to move forward

22 Notes for ourselves on what we learned, what is still puzzling us?

23 What is your next step?
• What have we learned?
• What do we need to find out?
• Need small group planning time?
• Find out more about…

