Visible Learning for Literacy

Presentation on theme: "Visible Learning for Literacy"— Presentation transcript:

1 Visible Learning for Literacy
Doug Fisher

2 Every student deserves a great teacher, not by chance, but by design.

3

4 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = Rank: /136 Number of meta-analyses: Number of studies: Number of participants: Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

5 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = n/a Rank: 136/136 Number of meta-analyses: 7 Number of studies: 207 Number of participants: 13,938 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

6 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = (low) Rank: 125/136 Number of meta-analyses: 2 Number of studies: 92 Number of participants: n/a Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

7 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Std. error = (low) Rank: 88/136 Number of meta-analyses: 5 Number of studies: 161 Number of effects: 295 Number of participants: 105,282 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

8 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = (high) Rank: 58/136 Number of meta-analyses: 8 Number of studies: 674 Number of participants: n/a Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

9 This is the hinge point: a year's worth of growth for a year in school.
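The barometer slides above amount to a simple classification rule for effect sizes. A minimal sketch of that rule (the zone boundaries of 0, 0.15, and 0.40 are read off the barometer labels; treat them as assumptions, not figures stated in the transcript):

```python
def barometer_zone(d):
    """Classify an effect size d into Hattie's barometer zones.

    Boundary values are assumptions based on the barometer labels:
    below 0 reverse effects, 0 to 0.15 developmental effects,
    0.15 to 0.40 teacher effects, 0.40 and above desired effects.
    """
    if d < 0:
        return "reverse effects"
    if d < 0.15:
        return "developmental effects"
    if d < 0.40:
        return "teacher effects"
    return "zone of desired effects"

# Influences named on the surrounding slides:
for name, d in [("Mobility", -0.34),
                ("Ability grouping/tracking", 0.12),
                ("Teaching test taking", 0.22),
                ("Classroom discussion", 0.82),
                ("Collective teacher efficacy", 1.57)]:
    print(f"{name}: d = {d:+.2f} -> {barometer_zone(d)}")
```

Only influences at or above the 0.40 hinge point fall in the zone of desired effects: more than a year's growth for a year of schooling.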

10 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Std. error = (low) Rank: 88/136 Number of meta-analyses: 5 Number of studies: 161 Number of effects: 295 Number of participants: 105,282 Mobility: d = -.34 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

11 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = n/a Rank: 136/136 Number of meta-analyses: 7 Number of studies: 207 Number of participants: 13,938 Retention: d = Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

12 Ability Grouping/Tracking: d = 0.12. [Barometer] Standard error: 0.045. Rank: 112/136. Meta-analyses: 14; studies: 500; participants: 1,369.

13 Teaching test taking: d = 0.22. [Barometer] Standard error: (low). Rank: 88/136. Meta-analyses: 5; studies: 161; effects: 295; participants: 105,282.

14 Small group learning: d = 0.49. [Barometer] Standard error: n/a. Rank: 48/136. Meta-analyses: 2; studies: 78; participants: 3,472.

15 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = n/a Rank: 48/136 Number of meta-analyses: 2 Number of studies: 78 Number of participants: 3,472 Study Skills: d = 0.59 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

16 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = Rank: Number of meta-analyses: Number of studies: Number of participants: 5,028 Repeated Reading: d = 0.67 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

17 Classroom Discussion: d = 0.82. [Barometer] Participants: 677.

18 Collective Teacher Efficacy: d = 1.57. [Barometer] Participants: 677.

19 Every student deserves a great teacher, not by chance, but by design.

20 2. Cultivating the Learning Climate
1. Planning with Purpose
2. Cultivating the Learning Climate
3. Instructing with Intention
4. Assessing with a System
5. Impacting Student Learning

21

22

23 Learning Progressions: transfer; links; lesson-specific; content; language
Evidence of Learning: success criteria; evidence collection
Meaningful Learning: aligned; differentiated

24 Teachers know what students need to learn
Teachers communicate learning intentions to students
Teachers and students understand success criteria

25 0.5 0.4 0.6 0.3 0.7 0.2 Medium 0.8 0.1 High 0.9 Low 0.0 1.0 Teacher effects 1.1 -0.1 Developmental effects Negative 1.2 -0.2 Reverse effects Zone of desired effects Standard error = (Medium) Rank: 3/136 Number of meta-analyses: 2 Number of studies: 30 Number of participants: 3835 Teacher Clarity: d = 0.75 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. New York: Routledge.

26 The established purpose focuses on student learning, rather than an activity, assignment, or task.

27

28 Three Questions
What am I learning today?
Why am I learning this?
How will I know that I have learned it? (p. 27)

29 Teachers know what students need to learn
Teachers communicate learning intentions to students
Teachers and students understand success criteria

30 Sara explained the writing rubric, used reasoning to argue her status, and conveyed a set of experiences about writers at each level.

31 p. 31

32

33 Focused Instruction: clear learning intentions; relevant learning intentions; accurate representation
Guided Instruction: noticing; scaffolding; prompting, cueing, and questioning
Collaborative Learning: routines; task complexity; language support

34 TEACHER RESPONSIBILITY → STUDENT RESPONSIBILITY
Focused Instruction: "I do it"
Guided Instruction: "We do it"
Collaborative: "You do it together"
Independent: "You do it alone"

35 Surface → Deep → Transfer (p. 20)

36 Surface: skill and concept development (p. 20)

37 Surface: skill and concept development
Deep: connections, relationships, and schema to organize skills and concepts (p. 20)

38 Surface: skill and concept development
Deep: connections, relationships, and schema to organize skills and concepts
Transfer: self-regulation to continue learning skills and content, applying knowledge to novel situations (p. 20)

39 What Works When

40 Surface Learning is IMPORTANT

41 Ways to Facilitate Surface Learning
Leveraging prior knowledge (d = 0.65)
Vocabulary techniques (sorts, word cards, etc.) (d = 0.67)
Reading comprehension instruction (d = 0.60)
Wide reading on the topic under study (d = 0.42)
Summarizing (d = 0.63)

42 Reading Volume Still Matters

43 STUDENT A: 20 minutes of reading per day; 1,800,000 words per year; scores in the 90th percentile on standardized tests.

44 STUDENT B: 5 minutes of reading per day; 282,000 words per year; scores in the 50th percentile on standardized tests.

45 STUDENT C: 1 minute of reading per day; 8,000 words per year; scores in the 10th percentile on standardized tests.
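The three student profiles above follow from simple multiplication of daily minutes by reading rate. A quick sketch of the implied arithmetic (the words-per-minute rates and the 365-day year are back-calculated assumptions, not figures given in the slides; note that the rates also differ by student, since stronger readers read faster):

```python
def words_per_year(minutes_per_day, words_per_minute, days=365):
    """Annual reading volume implied by a daily reading habit."""
    return minutes_per_day * words_per_minute * days

# Back-calculated rates that reproduce the slide figures:
print(words_per_year(20, 247))  # ~1.8 million words (Student A)
print(words_per_year(5, 155))   # ~283,000 words (Student B)
print(words_per_year(1, 22))    # ~8,000 words (Student C)
```

The point of the slides survives the rounding: small differences in daily reading time compound into order-of-magnitude differences in annual exposure to words.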

46 Surface Skill and Concept Development

47 Deep Learning is Also Important

48 Ways to Facilitate Deep Learning
Concept mapping (d = 0.60)
Class discussion (d = 0.82)
Questioning (d = 0.48)
Metacognitive strategies (d = 0.69)
Reciprocal teaching (d = 0.74)

49 Deep learning approaches don’t work any better at developing surface learning than surface learning strategies work to develop deep understanding.

50 Discussion Roundtable
Numbered sections of a note sheet: (1) my notes; (2)-(4) what ___ said; (5) independent summary. Deep Acquisition and Deep Consolidation, p.

51 Graphic organizers and concept maps are an intermediate step to something else —discussion or writing.

52 Without more complex tasks, students will not deepen their learning.

53 Task complexity should align with the phase of learning.

54 Difficulty vs. Complexity
Difficulty: a measure of the effort required to complete a task; in assessment, a function of how many people can complete the task correctly.
Complexity: a measure of the thinking, action, or knowledge needed to complete a task; in assessment, how many different ways the task can be accomplished.

55

56

57 Marc Umile is among a group of people fascinated with pi, a number that has been computed to more than a trillion decimal places. He has recited pi to 15,314 digits.

58 Four quadrants of tasks (p. 25):
Fluency: low difficulty, low complexity (easy, less complex)
Stamina: high difficulty, low complexity (hard, less complex)
Strategic thinking: low difficulty, high complexity (easy, more complex)
Expertise: high difficulty, high complexity (hard, more complex)

59 Surface: skill and concept development
Deep: connections, relationships, and schema to organize skills and concepts

60 Transfer

61 “The ability to transfer is arguably the long-term aim of all education. You truly understand and excel when you can take what you have learned in one way or context and use it in another, on your own.” McTighe & Wiggins, 2011

62 Ways to Facilitate Transfer
Reading across documents to conceptually organize (d = 0.85)
Formal discussion, including debates and Socratic seminars (d = 0.82)
Problem-solving teaching (d = 0.61)
Extended writing (d = 0.43)
Peer tutoring (d = 0.55)

63

64 Welcome: positive regard; physical environment; community building
Growth Producing: agency and identity; academic risk taking; repairs harm
Efficient Operations: rules, routines, procedures; recordkeeping

65

66 Assessment to…
Support Learners: comprehensible; goal-setting
Monitor Learning: checks for understanding; error analysis
Inform Learning: types of feedback; usefulness; needs-based instruction

67 Feed up: establishing purpose
Check for understanding: daily monitoring of learning
Feed back: providing students with information about their success and needs
Feed forward: using student performance for "next steps" instruction and feeding this into an instructional model
(Fisher & Frey, 2009)

68 Feed forward: Where to next?

69 Feeding forward involves…
Misconception analysis
Error analysis
Error coding

70

71 2. Cultivating the Learning Climate
1. Planning with Purpose
2. Cultivating the Learning Climate
3. Instructing with Intention
4. Assessing with a System
5. Impacting Student Learning

72

