Science Leadership Support Network March 23, 2009 Supported by PIMSER and Kentucky Department of Education Welcome! Help yourself to some refreshments and enjoy some networking!

Goals of SLSN Deepen understanding of a balanced assessment system and its role in motivating students to higher levels of achievement. Understand and incorporate skills and strategies for transforming planning and practice in order to ensure that all students understand key concepts from the Earth and the Universe big idea. Develop and act on a personal vision of leadership for sustainable improvement in their school or district.

Group Norms Stay on schedule; be on time Put cell phones on silent Be respectful of all comments Participate actively Exercise the rule of “two feet” Come prepared for the meeting It’s OK to have FUN!

February Review Competing Priorities Deconstructing Standards Round Table Discussions Conceptual Change

Roadmap for Today Conceptual Change Grading and Reporting Deconstruction Review Round Table Discussions Competing Priorities

Grading and Reporting Learning Targets: –I can translate our standards into measurement topics. –I can sort elements of measurement topics from simple to complex.

Black & Wiliam (1998) Assessment in Education, p. 61 “As an illustration of just how big these gains are, an effect size of .70, if it could be achieved on a nationwide scale, would be equivalent to raising the mathematics attainment score of an ‘average’ country like England, New Zealand or the United States into the ‘top five’ after the Pacific rim countries of Singapore, Korea, Japan and Hong Kong” (Beaton et al., 1996)

Chart: student achievement gains tied to teacher assessment effectiveness. Starting at the 50th percentile, student achievement increases of 34 percentile points (to the 84th percentile) and 13 percentile points (to the 63rd percentile), depending on teacher assessment effectiveness.

Chart: starting at the 50th percentile, student achievement increases of 49 percentile points (to the 99th percentile) and 28 percentile points (to the 78th percentile), again depending on teacher assessment effectiveness.
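The percentile figures in the two charts above come from the researchers' analyses and are not recomputed here. As a rough illustration of the underlying idea (an effect size measured in standard deviations translates into percentile movement), here is a minimal sketch assuming normally distributed achievement scores; the function name is ours.

```python
# Illustrative sketch only: converts an effect size (in standard deviations)
# into a percentile shift, assuming normally distributed achievement scores.
from statistics import NormalDist

def percentile_after_gain(start_percentile: float, effect_size: float) -> float:
    """Return the percentile reached after a gain of `effect_size` standard
    deviations, starting from `start_percentile`."""
    nd = NormalDist()                        # standard normal curve
    z = nd.inv_cdf(start_percentile / 100)   # percentile -> z-score
    return nd.cdf(z + effect_size) * 100     # shifted z -> new percentile

# Black & Wiliam's effect size of .70 would move a student at the 50th
# percentile to roughly the 76th percentile under this simple model.
print(round(percentile_after_gain(50, 0.70)))  # 76
```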

John Hattie reviewed 7,827 studies on learning and instruction. His conclusion: “The most powerful single innovation that enhances achievement is feedback. The simplest prescription for improving education must be ‘dollops’ of feedback.”

Like most things in education, classroom assessment enhances student achievement under certain conditions only. Feedback from classroom assessments should provide students with a clear picture of their progress on learning goals and how they might improve. Feedback from classroom assessment should encourage students to improve. Classroom assessment should be formative in nature. Formative classroom assessments should be quite frequent.

Pretest 2/12 (48%) Quiz 2/15 (60%) Quiz 2/19 (60%)

 Identify one grade level (or course) learning goal per quarter or per semester for each of the following subject areas: mathematics, reading, writing, science, and social studies.  Construct a rubric, or other type of common scale, for each learning goal.  Have teachers formally and informally assess each learning goal at least once every two weeks, keeping track of each student’s score on each learning goal. (Use of appropriate computer software is highly recommended.)  Have students keep track of their progress on each goal and use the data as the basis for teacher/student interactions about student progress.  Periodically (at least once per quarter) aggregate the data by grade level. Have teachers meet to discuss student progress and how it might be improved.
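These recommendations call for keeping track of each student’s score on each learning goal and periodically aggregating the data; appropriate software is suggested but not specified. Below is a minimal sketch of what such tracking could look like; the data, the names, and the choice to summarize each goal by the most recent score are illustrative assumptions, not part of the recommendations.

```python
# Minimal sketch (hypothetical names and data) of tracking each student's
# scores on each learning goal and aggregating them for a grade-level review.
from collections import defaultdict
from statistics import mean

# scores[student][learning_goal] -> list of rubric scores in the order assessed
scores = defaultdict(lambda: defaultdict(list))

def record(student, goal, score):
    """Log one formative-assessment score (0.0-4.0 scale) for a learning goal."""
    scores[student][goal].append(score)

def goal_summary(goal):
    """Summarize one learning goal: each student's most recent score and the
    grade-level average of those scores (one of several reasonable summaries)."""
    latest = {student: goals[goal][-1] for student, goals in scores.items() if goals[goal]}
    return latest, mean(latest.values())

record("Ben", "Processes that shape the Earth's surface", 2.0)
record("Ben", "Processes that shape the Earth's surface", 2.5)
record("Jamal", "Processes that shape the Earth's surface", 3.0)
print(goal_summary("Processes that shape the Earth's surface"))
# ({'Ben': 2.5, 'Jamal': 3.0}, 2.75)
```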

Feedback from classroom assessments should provide students with a clear picture of their progress on learning goals and how they might improve. Characteristics of feedback from classroom assessment, with number of studies and percentile gain/loss where reported (Bangert-Drowns, Kulik, Kulik, & Morgan):
–Right/wrong: percentile loss of 3
–Provide correct answers: 39 studies
–Criteria understood by student vs. not understood: percentile gain of 16
–Explain: 9 studies, percentile gain of 20
–Student reassessed until correct: 4 studies, percentile gain of 20

Feedback from classroom assessments should provide students with a clear picture of their progress on learning goals and how they might improve. Characteristics of feedback from classroom assessment, with number of studies and percentile gain where reported (Fuchs & Fuchs):
–Displaying results graphically
–Evaluation by rule [uniform way of interpreting results of classroom assessments using a tight logic]: 49 studies, percentile gain of 32

Sample test design:
A. Items 1-10: Ten items that require recall of important but simpler content that was explicitly taught.
B. Items 11-14: Four items that ask for application of complex content that was explicitly taught AND in situations similar to what was taught.
C. Items 15-16: Two items that ask for application in novel situations that go beyond what was explicitly taught.
Each section has its own “Total for section =” blank, and the test is scored as a Total /100.

The same test, scored: Section A (Items 1-10, recall of simpler content) all correct; Section B (Items 11-14, application of complex content in similar situations) two correct; Section C (Items 15-16, application in novel situations) none correct. The section totals are added together to give the Total /100.

A generic template for rubric design

4
3 – The student’s responses demonstrate no major errors or omissions regarding any of the information and/or processes (THAT WERE EXPLICITLY TAUGHT)
2
1
0

4
3 – The student’s responses demonstrate no major errors or omissions regarding any of the information and/or processes
2 – The student’s responses indicate major errors or omissions regarding the more complex ideas and processes; however they do not indicate major errors or omissions relative to the simpler details and processes
1
0

4
3 – The student’s responses demonstrate no major errors or omissions regarding any of the information and/or processes
2 – The student’s responses indicate major errors or omissions regarding the more complex ideas and processes; however they do not indicate major errors or omissions relative to the simpler details and processes
1 – The student provides responses that indicate a distinct lack of understanding of the knowledge. However, with help, the student demonstrates partial understanding of some of the knowledge.
0

4
3 – The student’s responses demonstrate no major errors or omissions regarding any of the information and/or processes
2 – The student’s responses indicate major errors or omissions regarding the more complex ideas and processes; however they do not indicate major errors or omissions relative to the simpler details and processes
1 – The student provides responses that indicate a distinct lack of understanding of the knowledge. However, with help, the student demonstrates partial understanding of some of the knowledge.
0 – The student provides little or no response. Even with help the student does not exhibit a partial understanding of the knowledge.

4 – In addition to exhibiting level 3 performance, the student’s responses demonstrate in-depth inferences and applications that go beyond what was taught in class
3 – The student’s responses demonstrate no major errors or omissions regarding any of the information and/or processes
2 – The student’s responses indicate major errors or omissions regarding the more complex ideas and processes; however they do not indicate major errors or omissions relative to the simpler details and processes
1 – The student provides responses that indicate a distinct lack of understanding of the knowledge. However, with help, the student demonstrates partial understanding of some of the knowledge.
0 – The student provides little or no response. Even with help the student does not exhibit a partial understanding of the knowledge.

Scale
4 – In addition to exhibiting level 3 performance, in-depth inferences and applications that go BEYOND what was taught in class.
3 – No major errors or omissions regarding any of the information and/or processes (SIMPLE OR COMPLEX) that were explicitly taught
2 – No major errors or omissions regarding the SIMPLER details and processes BUT major errors or omissions regarding the more complex ideas and processes
1 – With HELP, a partial knowledge of some of the simpler and complex details and processes
0 – Even with help, no understanding or skill demonstrated.

Three Types of Items Level 2 items: Simpler details and processes that have been explicitly taught. Level 3 items: Complex ideas and processes that have been explicitly taught. Level 4 items: Inferences and applications that go beyond what was taught

Patterns of Responses Student answers L2 items correctly but not L3 and L4 items. Student answers L2 and L3 items correctly but not L4. Student misses all items, but with help can answer some correctly. Student misses all items even when helped.

Patterns of Responses Student answers L2 items correctly but not L3 and L4 items. (2.0) Student answers L2 and L3 items correctly but not L4. (3.0) Student misses all items, but with help can answer some correctly. (1.0) Student misses all items even when helped. (0.0)

The complete scale allows for half-point scores (3.5, 2.5, 1.5, .5)

Scale
4 – In addition to exhibiting level 3 performance, in-depth inferences and applications that go BEYOND what was taught in class.
3 – No major errors or omissions regarding any of the information and/or processes (SIMPLE OR COMPLEX) that were explicitly taught
2 – No major errors or omissions regarding the SIMPLER details and processes BUT major errors or omissions regarding the more complex ideas and processes
1 – With HELP, a partial knowledge of some of the simpler and complex details and processes
0 – Even with help, no understanding or skill demonstrated.

Scale
4 – In addition to exhibiting level 3 performance, in-depth inferences and applications that go beyond what was taught in class.
3.5 – In addition to exhibiting level 3 performance, partial success at in-depth inferences and applications that go beyond what was taught in class.
3 – No major errors or omissions regarding any of the information and/or processes (SIMPLE OR COMPLEX) that were explicitly taught
2.5 – No major errors or omissions regarding any of the simpler information and/or processes and partial knowledge of the more complex information and processes.
2 – No major errors or omissions regarding the simpler details and processes BUT major errors or omissions regarding the more complex ideas and processes
1.5 – Partial knowledge of the simpler details and processes, but major errors or omissions regarding the more complex ideas and processes.
1 – With help, a partial knowledge of some of the simpler and complex details and processes.
0.5 – With help, a partial knowledge of some of the simpler details and processes but not of the more complex ideas and processes.
0 – Even with help, no understanding or skill demonstrated.

The same test mapped to scale levels, with a sample pattern of responses:
A. Items 1-10 (Level 2.0): Ten items that require recall of important but simpler content that was explicitly taught – all correct.
B. Items 11-14 (Level 3.0): Four items that ask for application of complex content that was explicitly taught AND in situations similar to what was taught – two correct.
C. Items 15-16 (Level 4.0): Two items that ask for application in novel situations that go beyond what was explicitly taught – none correct.
Rubric Score:
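Putting the pattern-of-responses slides and the half-point scale together, a small sketch like the one below could assign a rubric score from how a student did on the Level 2.0, 3.0, and 4.0 sections. The “all/partial/none” groupings and the simplified handling of help are assumptions made for illustration, not a published scoring procedure.

```python
# Sketch of the pattern-of-responses scoring. Each argument describes how the
# student did on that section's items: "all", "partial", or "none" correct.
def rubric_score(level2, level3, level4, partial_with_help=False):
    if level2 == "none" and level3 == "none" and level4 == "none":
        # Missed everything independently: 1.0 if help produced partial
        # success, otherwise 0.0 (simplified handling of the with-help scores).
        return 1.0 if partial_with_help else 0.0
    if level2 == "all" and level3 == "all":
        # Simple and complex content secure; score depends on the Level 4 items.
        return {"all": 4.0, "partial": 3.5, "none": 3.0}[level4]
    if level2 == "all":
        # Simpler content secure; complex content partly or not demonstrated.
        return 2.5 if level3 == "partial" else 2.0
    # Only partial success on the simpler content.
    return 1.5

# The sample pattern above (all of A, two of four in B, none of C) lands at
# 2.5 on the half-point scale.
print(rubric_score(level2="all", level3="partial", level4="none"))  # 2.5
```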

1. Unpack the standards and benchmarks. 2. Identify measurement topics. 3. Identify the elements for each grade level and/or course for each measurement topic. 4. Using a scale format, create rubrics for each grade level and/or course for each measurement topic. 5. Use formative assessment as a means to collect evidence on student learning and to inform instructional practices.

Diagram – Reporting Students’ Progress: the content standard is too broad for feedback, and its benchmarks are too many and not feasible to report on, so the measurement topic becomes the unit for reporting.

1. Unpack the standards and benchmarks. 2. Identify measurement topics.

Span of Topics (K through high school): all measurement topics may not span all grade levels.

Science measurement topics:
Earth and Space Sciences – Atmospheric Processes and the Water Cycle; Composition and Structure of the Earth; Composition and Structure of the Universe and the Earth’s Place in It
Life Sciences – Principles of Heredity and Related Concepts; Structure and Function of Cells and Organisms; Relationships Among Organisms and Their Physical Environment; Biological Evolution and Diversity of Life
Physical Sciences – Structure and Properties of Matter; Sources and Properties of Energy; Forces and Motion
Nature of Science – Nature of Scientific Inquiry; Scientific Enterprise

Consider… Is the topic broad enough to span several grade levels or is it limited to just one grade level? To what extent do you want to track this topic across grade levels over time? If the topic is too narrow, you will have difficulty creating a scope and sequence for the topic. Is that okay? If not, you may want to identify a topic that is more general. When you look across the topics for the subject area, have you been able to limit the number of topics to approximately 15? If not, you will need to determine if identifying more topics is moving you away from articulating a guaranteed and viable curriculum.

Diagram – Reporting Students’ Progress includes a Life Skills topic alongside the content-standard topics: measurement topics need to include life skills (e.g., participation, work completion, behavior, working in groups).

1. Unpack the standards and benchmarks. 2. Identify measurement topics. 3. Identify the elements for each grade level and/or course for each measurement topic.

Elements are identified through the process of unpacking the benchmarks for that standard. Elements increase in level of complexity: the higher the grade level, the more complex the elements for that topic. The elements delineate what teachers are to address for that topic from one grade level/course to another.

Diagram: at the high school level, a topic’s elements are identified by course (high school course elements) rather than by grade level.

Elements within each topic must covary. That is, they must be related to each other so that as ability in one increases, ability in the other also increases. Covariance is partly a function of instruction. Elements create a scope and sequence or progression from one grade level/course to another.

Diagram: within each grade level (K-8), a topic’s elements are divided into simple elements and complex elements.

Try to keep the number of elements (simple and complex) for each grade level/course to four or fewer.

Consider… Can you limit this topic to only 3-4 elements at each grade level/course for the simple level and 3-4 elements for the complex level? If you cannot, the topic may be too broad and you may need to create another (or several) smaller topics. An exception to this might be at the primary grade levels. The simple elements should be directly related to the complex elements. That is, these elements should represent knowledge that students will need in order to become proficient with the complex elements.

Consider… You may have identified simple content as complex and need to rethink this, or you may need to consider whether the simple elements should be the declarative knowledge related to the more complex skill students will be using.

How do you decide what is simple and what is complex? What guidelines would you use?

Considerations for A Scale Format for Measurement Topics Score 3.0 elements all begin with the stem, “while engaged in grade-appropriate tasks, the student demonstrates an understanding of ______ by …” Score 3.0 elements will most likely be the reasoning and skill targets that have been identified during deconstruction.

Considerations for A Scale Format for Measurement Topics Score 2.0 elements are derived from the score 3.0 elements and will most likely be our knowledge targets identified during deconstruction. –Basic terminology associated with score 3.0 elements –Basic or simple solutions for complex processes

Considerations for A Scale Format for Measurement Topics Score 4.0 elements address inferences and applications that go beyond what was explicitly taught. Marzano suggests the following cognitive processes can be used to design score 4.0 items and tasks: –Comparing –Classifying –Creating metaphors –Creating analogies –Analyzing errors

Considerations for A Scale Format for Measurement Topics No specifics have to be provided in the scale for score values of 1.0 and 0.0, because they do not address new content. These score values (1.0 and 0.0) signify the extent to which students can demonstrate, with help, knowledge of content at score values 3.0 and 4.0.

1. Unpack the standards and benchmarks. 2. Identify measurement topics. 3. Identify the elements for each grade level and/or course for each measurement topic. 4. Using a scale format, create rubrics for each grade level and/or course for each measurement topic.

Diagram: for each grade level (K-8), the simple and complex elements of a topic under a content standard feed into a Scale for Scoring Assessments.

Scale for Scoring Assessments
4.0 – In addition to Score 3.0 performance, in-depth inferences and applications that go beyond what was taught.
3.0 – No major errors or omissions regarding any of the information and/or processes (simple or complex) that were explicitly taught.
2.0 – No major errors or omissions regarding the simpler details and processes but major errors or omissions regarding the more complex ideas and processes.
1.0 – With help, a partial understanding of some of the simpler details and processes but not the more complex ideas and processes.
0.0 – Even with help, no understanding or skill demonstrated.

4.0 – In addition to Score 3.0 performance, in-depth inferences and applications that go beyond what was taught.
3.5 – In addition to 3.0 performance, partial success at inferences and applications that go beyond what was taught.
3.0 – No major errors or omissions regarding any of the information and/or processes (simple and complex) that were explicitly taught.
2.5 – No major errors or omissions regarding the simpler details and processes and partial knowledge of the more complex ideas and processes.
2.0 – No major errors or omissions regarding the simpler details and processes but major errors and omissions regarding the more complex ideas and processes.
1.5 – Partial knowledge of the simpler details and processes but major errors and omissions regarding the more complex ideas and processes.
1.0 – With help, a partial understanding of some of the simpler details and processes and some of the more complex ideas and processes.
0.5 – With help, a partial understanding of some of the simpler details and processes but not the more complex ideas and processes.
0.0 – Even with help, no understanding or skill demonstrated.

1. Unpack the standards and benchmarks. 2. Identify measurement topics. 3. Identify the elements for each grade level and/or course for each measurement topic. 4. Using a scale format, create rubrics for each grade level and/or course for each measurement topic. 5. Use formative assessment as a means to collect evidence on student learning and to inform instructional practices.

Sample tracking chart – standard topics (Ecosystems, Adaptation, Scientific Inquiry, Life Skill) recorded across assignments and assessments (Oct. 1, Oct. 8, Oct. 27) for individual students (Ben, Jamal, Ashli).

Consider… Can you construct an assessment that will ask students to demonstrate their proficiency in levels 2-4? If not, you may need to revise your rubric or your topic or your elements. Do the assessment items for levels 3 and 4 ask students to apply knowledge rather than just identify simple declarative and/or procedural knowledge?

Summary Topics should not be too specific or too general. There should not be more than approximately 15 topics per content area. Topics should support the articulation of a guaranteed and viable curriculum. Topics typically span several grade levels or courses. Topics are defined by “elements,” which create a scope and sequence for the topic showing what should be taught from one grade level or course to another.

Summary Topics should have no more than four elements for the simple level and no more than four elements for the complex level of the rubric. Elements should co-vary. Topics may be addressed several times throughout the year. Topics with elements can be translated into a rubric for scoring and reporting student achievement; the elements are reflected in levels 2 and 3 of the rubric. Topics should provide a framework for developing and reporting out students’ progress using formative assessments.

Developing Essential Elements for Processes That Shaped the Earth’s Surface Working with your grade band/level group, use your deconstruction of the targeted standards to determine score 3.0 elements and score 2.0 elements. Use the sample on pages in Making Standards Useful as a model. Draft a scoring scale for your grade band/level for the measurement topic processes that shaped the Earth’s surface. Examine the assessment items you drafted as your post-reading assignment from CAAGTW. Will they work as part of an assessment for the scale scores 4.0, 3.0 and 2.0?

Grading and Reporting Debrief Individually, list advantages and disadvantages to organizing the standards into measurement topics, and then developing scoring scales for each. Share with a partner. Write any questions you might have for discussion next month on an index card.

Deconstruction Goals –To examine the deconstructed standards for congruency, clarity, and decontextualization. –To develop success criteria for selected targets.

Active Learning Through FA Take 2 minutes to review your reading and reading guide notes. Ch. 7: –Open and closed learning objectives –Separating learning objectives from context Ch. 8: –Success criteria

Carousel Brainstorming Each table group will use a different colored marker to record ideas. Each table group will be given a piece of chart paper with a question or a prompt on it and time during which your group will generate and record responses to the question/prompt. When the time ends, a question/prompt from another group will be rotated to your group. Pass the marker to a new recorder at this time. Read the new question, read the previous responses, and either develop new ideas or expand on existing ideas as quickly as possible. When you get your original chart, summarize the responses.

Don’t Confuse These Two “C” Words Congruent –An exact match Correlated –Has some relationship

Congruent or Correlated? SC-4-EU-S-4: Students will describe and compare the processes, factors involved and consequences of slow changes to earth’s surface (e.g., erosion and weathering) SC-4-EU-S-5: Students will describe and compare contributing factors and consequences of fast changes to earth’s surface (e.g., landslides, earthquakes, floods) A.I can observe changes to the earth’s surface over time and use evidence/data to infer the cause of the change. B.I can create a model of a volcano to show a fast change to earth’s surface. C.I can name 3 places on earth where a fast change has occurred.

Congruent or Correlated? SC-7-EU-S-6: Students will investigate the forces and processes that change Earth’s surface or atmosphere and analyze data to generate predictions of their effects. A.I can identify cloud types from an illustration. B.I can find examples of erosion in my community. C.I can list processes that change the earth’s atmosphere.

Congruent or Correlated? SC-H-EU-S-2: Students will research the historical rise in acceptance of the theory of Plate Tectonics and the geological/biological consequences of plate movement A.I can create a model of magnetic sea floor striping. B.I can explain the difference between a theory and a fact. C.I can identify weaknesses in Wegener’s original continental drift idea.

“When you begin with well-defined learning targets, you are able to plan an assessment that reflects exactly what you will teach and what you expect students to learn. You will also be able to use assessments to further learning, by disaggregating the information on any assessment, learning target by learning target or standard by standard, to show areas of growth and areas needing further work.” –Rick Stiggins, Classroom Assessment for Student Learning, pg. 56

Review and Revise Examine the deconstruction from last month’s meeting. Consider the following: –Are the statements congruent? –Are the statements clear? –Are the statements decontextualized? Make any needed changes on a “master” copy to turn in.

Round Table Discussions Look over the schedule you were assigned and seek any needed clarifications before we begin. Please follow the schedule you were assigned in the order indicated on your schedule. Although there is not enough time to attend all stations, you will have the opportunity to share information with each other later. It’s time to find out how a few of your colleagues are implementing some of the grading and reporting strategies we are studying in SLSN.

Round Table Discussions When the music begins, move to the station you were assigned for the first round. Each time the music begins it is time to change rounds and move to the next station you were assigned. Please remember to write what you feel is an important thing to remember about the session on a Flower Card and leave it with the facilitator before you move to the next session.

Give One-Get Some You will need the handout titled “Important Things to Remember.” When the music begins again you will have 5 minutes to visit stations you were not assigned, view the “Important Things to Remember” found on the Flower Cards, and record them on your handout.

Competing Priorities

Homework Reflection Refer to your 4-column chart, Barriers to Change:
Column 1: Genuinely held commitment – What would you like to see changed at work, so that you could be more effective or so that work would be more satisfying? What commitment(s) does your complaint imply?
Column 2: What I do that works against my commitment – What are you doing, or not doing, that is keeping your commitment from being more fully realized?
Column 3: The competing commitment that generates column 2 – If you imagine doing the opposite of the undermining behavior (column 2), do you detect in yourself any discomfort, worry, or fear? What worrisome outcome are you committed to preventing?
Column 4: My big assumption – What are you really trying to protect yourself from?

If we are certain we know how the world works—and this is how a Big Assumption operates; it creates certainty—why would we even think to look for a different reality?

Table Discussion 1.What did you learn about ‘complaints’? 2.Did this activity cause you to look at others’ complaints in a new/different way? Can you share an example? 3.We often hear leaders described as ‘hypocrites’. What are your thoughts on that? 4.Did you find that you uncovered any ‘Big Assumptions’ of your own? 5.Discuss this question: Which Big Assumptions do WE HOLD vs which Big Assumptions HOLD US?

Next Steps-Homework/Reflection 1.Read the “Big Assumptions” handout. 2.Think about the Big Assumption that you uncovered through your homework. 3.Plan a way to ‘test your assumption’ (use the steps outlined in the article) 4.Record your findings/evidence/reflections. Revealing a big assumption doesn’t necessarily mean it will be exposed as false. But even if a big assumption does contain an element of truth, an individual can often find more effective ways to operate once he or she has had a chance to challenge the assumption and its hold on his or her behavior.

Conceptual Change Goals –To revisit types of conceptual change for curricular and instructional implications. –To deepen understanding of processes that shape the earth K-12.

Elaborating on a Preexisting Concept – Example: Structure and Function (teeth and diet, claws and hunting); water exists in 3 states; other matter exists in 3 states.

Restructuring a Network of Concepts – Example: Air & Matter: “air is matter” replaces “air is nothing.”

Achieving New Levels of Explanation – Example: Matter (atoms, molecules); biological processes; element cycling through the lithosphere; global climate.

The Art and Science of Teaching – Learning Goals, Feedback, Instruction, Classroom Management, Developing Effective Units, Student Engagement, High Expectations. The Art & Science of Teaching involves 10 “design questions” teachers can ask themselves as they plan a unit of instruction.

The Art & Science of Teaching Question 1 –What will I do to establish and communicate learning goals, track student progress, and celebrate success? Question 2 –What will I do to help students effectively interact with new knowledge? Question 3 –What will I do to help students practice and deepen their understanding of new knowledge? Question 4 –What will I do to help students generate and test hypotheses about new knowledge?

What Will I Do to Help Students Practice and Deepen Their Understanding of New Knowledge? Provide students with tasks that require them to examine similarities and differences. Help students identify errors in thinking. Provide opportunities for students to practice skills, strategies, and processes. Determine the extent to which cooperative groups will be used. Assign purposeful homework that involves appropriate participation from the home. Have students systematically revise and make corrections in their academic notebooks.

Goals for Earth Process Activities Experience a learning progression for a particular topic P-12. Experience various ways of deepening understanding of a topic at different grade bands. Consider instructional design implications.

January’s Learning Progression Primary –Identify local changes to the earth and tell what might have caused them. Intermediate –Compare and contrast quick change versus slow change. Middle –Determine the impact of destructive and constructive forces on the Earth’s surface. High –Predict the consequences of constructive and destructive forces to the Earth’s surface.

Learning Progression for Today Elementary –Compare slow versus fast changes/events and sequence the changes/events in order to establish an understanding of relative time. Middle –Identify the forces responsible for the creation of a variety of landforms in order to distinguish between constructive and destructive forces. High –Predict the consequences of constructive and destructive forces to the Earth’s surface.

March Madness Bracket You will be going through the 3 stations with a partner. Each station will accommodate several pairs. Your partner will be determined by the results of the ‘Bracket Draw’.

Station Work With your partner, begin at any station. Once there, follow the directions for that station and complete the task in the time allotted. When time ends, rotate to the next grade level. Be prepared to discuss all the stations when complete.

Station Reflection Describe the Learning Progression P-12 that you just experienced using a non-linguistic representation. How would you describe the effectiveness of each station at deepening student understanding? What might be some considerations for “next steps” in instruction?

Roadmap for Today Conceptual Change Grading and Reporting Deconstruction Review Round Table Discussions Competing Priorities

For Next Time Our next meeting is April 17th. Read Ch. 10 in Active Learning Through Formative Assessment –Complete the reading guide Optional: read Ch. 4 and 5 in Classroom Assessment and Grading That Work