Doug Fisher www.fisherandfrey.com Engagement by Design.


Every student deserves a great teacher, not by chance, but by design.

[Hattie's effect-size barometer: a dial from -0.2 to 1.2 with zones for reverse effects (negative), developmental effects, teacher effects, and the zone of desired effects. This template slide names no influence; its standard error, rank, and study counts are blank.] Source: Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.

[Effect-size barometer; influence not yet named on this slide] Standard error: n/a; Rank: 136/136; Meta-analyses: 7; Studies: 207; Participants: 13,938. (Hattie, 2009)

[Effect-size barometer; influence not yet named on this slide] Standard error: 0.016 (low); Rank: 125/136; Meta-analyses: 2; Studies: 92; Participants: n/a. (Hattie, 2009)

[Effect-size barometer; influence not yet named on this slide] Standard error: 0.027 (low); Rank: 88/136; Meta-analyses: 5; Studies: 161; Effects: 295; Participants: 105,282. (Hattie, 2009)

[Effect-size barometer; influence not yet named on this slide] Standard error: 0.081 (high); Rank: 58/136; Meta-analyses: 8; Studies: 674; Participants: n/a. (Hattie, 2009)

This is the hinge point: d = 0.40 represents a year's worth of growth for a year in school.
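The d values on these slides are Cohen's d, a standardized mean difference. A minimal sketch of the statistic behind the barometer follows; the function name and the two sets of scores are invented for illustration, not taken from Hattie's data:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference between two groups, using a pooled SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    # Pool the two variances, weighting each by its degrees of freedom.
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical test scores for a taught group vs. a comparison group.
taught = [78, 85, 90, 72, 88, 95, 81, 84]
comparison = [70, 75, 82, 68, 80, 77, 73, 79]
print(round(cohens_d(taught, comparison), 2))  # 1.4
```

On the barometer, a value of 0.40 or better lands in the zone of desired effects; values near zero fall in the developmental or teacher-effects zones.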

Mobility: d = -0.34. [Effect-size barometer] Standard error: 0.027 (low); Rank: 88/136; Meta-analyses: 5; Studies: 161; Effects: 295; Participants: 105,282. (Hattie, 2009)

Retention: d = -0.13. [Effect-size barometer] Standard error: n/a; Rank: 136/136; Meta-analyses: 7; Studies: 207; Participants: 13,938. (Hattie, 2009)

Ability Grouping/Tracking: d = 0.12. [Effect-size barometer] Standard error: 0.045; Rank: 112/136; Meta-analyses: 14; Studies: 500; Participants: 1,369. (Hattie, 2009)

Small Group Learning: d = 0.49. [Effect-size barometer] Standard error: n/a; Rank: 48/136; Meta-analyses: 2; Studies: 78; Participants: 3,472. (Hattie, 2009)

Study Skills: d = 0.59. [Effect-size barometer] Standard error: n/a; Rank: 48/136; Meta-analyses: 2; Studies: 78; Participants: 3,472. (Hattie, 2009)

Repeated Reading: d = 0.67. [Effect-size barometer] Participants: 5,028. (Hattie, 2009)

Classroom Discussion: d = 0.82. [Effect-size barometer] Participants: 677. (Hattie, 2009)

Collective Teacher Efficacy: d = 1.57. [Effect-size barometer] Participants: 677. (Hattie, 2009)

Teacher-Student Relationships: d = 0.72. [Effect-size barometer] Participants: 5,028. (Hattie, 2009)

Teacher Clarity: d = 0.75. [Effect-size barometer] Standard error: 0.079 (medium); Rank: 3/136; Meta-analyses: 2; Studies: 30; Participants: 3,835. (Hattie, 2009)

Teachers know what students need to learn. Teachers communicate learning intentions to students. Teachers and students understand the success criteria.

The established purpose focuses on student learning, rather than an activity, assignment, or task.

Three Questions: What am I learning today? Why am I learning this? How will I know that I have learned it?

Teachers know what students need to learn. Teachers communicate learning intentions to students. Teachers and students understand the success criteria.

Sara explained the writing rubric, used reasoning to argue her status, and conveyed a set of experiences about writers at each level.

“Goldilocks” Challenge: d = 0.74. [Effect-size barometer] Standard error: 0.079 (medium); Rank: 3/136; Meta-analyses: 2; Studies: 30; Participants: 3,835.

Three phases of learning: Surface, Deep, Transfer

Surface: skill and concept development.

Deep: connections, relationships, and schema to organize skills and concepts.

Transfer: self-regulation to continue learning skills and content, applying knowledge to novel situations.

What Works When

Surface Learning is IMPORTANT

Ways to Facilitate Surface Learning: leveraging prior knowledge (d = 0.65); vocabulary techniques such as sorts and word cards (d = 0.67); reading comprehension instruction (d = 0.60); wide reading on the topic under study (d = 0.42); summarizing (d = 0.63).

Reading Volume Still Matters

Student A: 20 minutes of reading per day; 1,800,000 words per year; scores in the 90th percentile on standardized tests.

Student B: 5 minutes per day; 282,000 words per year; scores in the 50th percentile.

Student C: 1 minute per day; 8,000 words per year; scores in the 10th percentile.
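The gap between the three students can be sanity-checked with back-of-envelope arithmetic. This sketch assumes a silent-reading rate of roughly 250 words per minute and daily reading year-round; both figures are assumptions, not from the slides:

```python
WORDS_PER_MINUTE = 250   # assumed average silent-reading rate
DAYS_PER_YEAR = 365      # assumes the habit continues year-round

def words_per_year(minutes_per_day):
    """Rough annual word exposure for a given daily reading habit."""
    return minutes_per_day * WORDS_PER_MINUTE * DAYS_PER_YEAR

print(f"{words_per_year(20):,}")  # 1,825,000 -- close to Student A's 1,800,000
```

The lower two figures fall short of this linear model (5 minutes would predict about 456,000 words, not 282,000), which is consistent with less-frequent readers also reading more slowly.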

Deep Learning is Also Important

Ways to Facilitate Deep Learning: concept mapping (d = 0.60); class discussion (d = 0.82); questioning (d = 0.48); metacognitive strategies (d = 0.69); reciprocal teaching (d = 0.74).

Deep learning approaches don’t work any better at developing surface learning than surface learning strategies work to develop deep understanding.

Without more complex tasks, students will not deepen their learning.

Task complexity should align with the phase of learning.

Difficulty vs. Complexity. Difficulty: a measure of the effort required to complete a task; in assessment, a function of how many people can complete the task correctly. Complexity: a measure of the thinking, action, or knowledge needed to complete the task; in assessment, how many different ways the task can be accomplished.

[2x2 grid: difficulty runs easy to hard on one axis, less to more complex on the other] Low difficulty, low complexity: Fluency. High difficulty, low complexity: Stamina. Low difficulty, high complexity: Strategic thinking. High difficulty, high complexity: Struggle.
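The quadrant model amounts to a simple lookup. In this sketch the function name and the "low"/"high" encoding are illustrative choices; the four labels come from the slide:

```python
QUADRANTS = {
    # (difficulty, complexity) -> task type from the 2x2 grid
    ("low", "low"): "Fluency",
    ("high", "low"): "Stamina",
    ("low", "high"): "Strategic thinking",
    ("high", "high"): "Struggle",
}

def task_type(difficulty, complexity):
    """Classify a task by its difficulty ('low'/'high') and complexity."""
    return QUADRANTS[(difficulty, complexity)]

print(task_type("high", "low"))  # Stamina
```

Reading the table back against the definitions above: raising difficulty alone builds stamina, raising complexity alone demands strategic thinking, and raising both produces productive struggle.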

Transfer

“The ability to transfer is arguably the long-term aim of all education. You truly understand and excel when you can take what you have learned in one way or context and use it in another, on your own.” McTighe & Wiggins, 2011

Ways to Facilitate Transfer: reading across documents to conceptually organize (d = 0.85); formal discussion, including debates and Socratic seminars (d = 0.82); problem-solving teaching (d = 0.61); extended writing (d = 0.43); peer tutoring (d = 0.55).
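Pulling the effect sizes from the surface, deep, and transfer slides into one list makes the "right approach at the right time" point concrete. The data structure below simply restates the values quoted above; the shortened strategy names are illustrative:

```python
# (strategy, phase, effect size d) -- values as quoted on the slides
strategies = [
    ("Leveraging prior knowledge", "surface", 0.65),
    ("Vocabulary techniques", "surface", 0.67),
    ("Reading comprehension instruction", "surface", 0.60),
    ("Wide reading on the topic", "surface", 0.42),
    ("Summarizing", "surface", 0.63),
    ("Concept mapping", "deep", 0.60),
    ("Class discussion", "deep", 0.82),
    ("Questioning", "deep", 0.48),
    ("Metacognitive strategies", "deep", 0.69),
    ("Reciprocal teaching", "deep", 0.74),
    ("Reading across documents", "transfer", 0.85),
    ("Formal discussion (debates, Socratic seminars)", "transfer", 0.82),
    ("Problem-solving teaching", "transfer", 0.61),
    ("Extended writing", "transfer", 0.43),
    ("Peer tutoring", "transfer", 0.55),
]

# Rank every strategy by its reported effect size, strongest first.
ranked = sorted(strategies, key=lambda s: s[2], reverse=True)
for name, phase, d in ranked[:3]:
    print(f"{d:.2f}  {name} ({phase})")
```

Every entry clears the d = 0.40 hinge point, so the ranking is not a reason to discard the lower items; the point of the deck is matching each strategy to the phase of learning it serves.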

The right approach, at the right time, for the right type of learning.

LEARNING