Understanding Understanding: Tools for Assessing and Conceptualizing Reading Comprehension Presentation at the International Reading Association Annual Conference May, 2009 Nell K. Duke, Michigan State University Literacy Achievement Research Center Ellin O. Keene, Author and Consultant
Plan for Presentation Nell: About comprehension and comprehension assessment Ellin: About the Observation Record for the Dimensions and Outcomes of Understanding Nell: About the COCA and ISCA
Reading Comprehension “... the process of simultaneously extracting and constructing meaning through interaction and involvement with written language” (RAND Reading Study Group, 2002)
[Diagram: reading comprehension as the interaction of Reader, Text, and Activity, situated within a sociocultural Context. Adapted from the RAND Reading Study Group, 2002.]
Some Things That Probably Affect Reading Comprehension Word recognition and decoding Fluency Vocabulary knowledge Vocabulary strategies Syntactic knowledge Morphological awareness Written language register knowledge Genre knowledge
Some Things That Probably Affect Reading Comprehension Text structure knowledge Short-term and/or working memory Strategy use (but...) Knowledge and appropriate application of knowledge Reading engagement Reading volume Other factors
Reading Comprehension Assessment All of this makes reading comprehension assessment very challenging. In the following slides, I will discuss some existing tools for assessing reading comprehension in the primary grades. Like any assessment tool, each has strengths and weaknesses.
Some Existing Comprehension Assessment Tools for the Primary Grades Think alouds or verbal protocols –Can be used with a variety of texts –Can provide insights into what readers are and are not thinking while they are reading (e.g., level of activity, strategy use, connections) –Are a skill in themselves –Suitability for children younger than second grade not well established –Procedures, reliability, etc., not entirely established
Some Existing Comprehension Assessment Tools for the Primary Grades Informal reading inventories (passages & questions) –Can estimate a child’s comprehension level; some have expository text passages –Can indicate whether children have relative strengths versus weaknesses on literal versus inferential questions –Procedures established, but leveling often questionable and reliability usually not established Self-evaluation checklists (e.g., When I read I...) –Provide insights into children’s view of their comprehension –Depend on children’s judgments –Validity and reliability not established
Informed observation –Requires knowledge of, among other things, the many different facets of and influences on comprehension –Should include not only how children are comprehending, but also how they are responding to instruction Summaries or retellings –Can provide insights into what children remember and view as important and their structural knowledge –Are after-reading and recall-oriented –A skill in itself –Procedures, reliability, etc., often not established
Some Existing Comprehension Assessment Tools for the Primary Grades Norm-referenced tests of comprehension –Usually do not distinguish informational comprehension from comprehension of other text types –Usually dependent on word recognition –Designed to place students on a normal curve; generally not very diagnostic –Procedures and reliability established Comprehension of Picture Books Assessment (Paris & Paris, 2003) –Is not dependent on word recognition –Reliability and validity established –Uses narrative text Others
Is the Current Menu of Reading Comprehension Assessments Adequate? Nell: No. We need more assessments that are valid and reliable AND can inform instruction for teachers – showing strengths and areas of need that teachers can do something about, measuring processes, not just products of comprehension. Ellin: No. Most assessments are focused on retelling and answering comprehension questions – is there a better and more useful way to describe understanding?
Some Questions We Hear How do we know when children understand a text or concept? If they’re using comprehension strategies, are they comprehending? If we measure comprehension strategy usage, are we measuring comprehension? Where do we want comprehension strategies to lead?
Assessment Authors and Funding Credits COCA Authors: Alison K. Billman*, Nell K. Duke*, Katherine R. Hilden*, Shenglan Zhang, Kathryn Roberts, Juliet L. Halladay, Nicole M. Martin, Angela M. Schaal *These authors contributed equally. ISCA Authors: Katherine R. Hilden*, Nell K. Duke*, Alison K. Billman*, Shenglan Zhang, Juliet L. Halladay, Angela M. Schaal, Kathryn Roberts, Nicole M. Martin *These authors contributed equally. This work was supported by the Carnegie Corporation of New York and the Literacy Achievement Research Center at Michigan State University.
In the Primary Grades We Need Assessments: That have the features we’ve talked about –Valid and reliable –Measure processes of comprehension –Inform instruction –Useful in research AND That aren’t dependent on word recognition and decoding skill That aren’t just traditional listening comprehension assessments That can flag students at risk for later reading comprehension difficulties
About One of the Assessments We Have Been Developing The Concepts of Comprehension Assessment (COCA) Inspired in part by the Concepts of Print Test (Clay) Designed for first, second, and maybe third graders Individually administered Assessment sessions run about 15 minutes / child Text is read aloud with book in front of the child Child can move the text Questions and prompts are scripted to facilitate consistent administration Student responses are recorded directly on the score sheet to facilitate scoring The COCA is copyrighted by the Michigan State University Board of Trustees.
THE COCA The COCA is available to download free from msularc.org [click on “Who We Are,” then “Principal Investigators,” then “Nell K. Duke,” then “LARC Projects” then the “Informational Comprehension Assessment in the Primary Grades: The Concepts of Comprehension Assessment (COCA)™”]. Excerpts of the COCA were shown during the presentation but are not in the handout.
More About the COCA The COCA assesses four hypothesized dimensions of comprehension of informational text: –Comprehension Strategy Use (CS) –Knowledge of Informational Text Features (TF) –Comprehension of Graphics in the Context of Text (GCT) –Vocabulary knowledge of high utility science words and Vocabulary strategies for rarer words (V)
More About the COCA There are, of course, many more than four dimensions. We selected these four primarily because they are all amenable to instruction. That is, a teacher could do something to improve a student’s knowledge or skills in each area (as contrasted, for example, with working memory capacity, which contributes to comprehension but does not appear particularly amenable to instruction). We also avoided some dimensions of comprehension that are already widely assessed and instructed (e.g., word recognition skill).
Comprehension Strategy Use Items for this construct are intended to measure whether the child is engaging in the kinds of thought processes used by good readers when they read (excluding vocabulary strategies, addressed in another construct). In particular: –Are they making predictions about the text? –Are they activating and/or making connections to prior knowledge? –Are they making inferences as needed to understand the text? –Are they summarizing, or comprehending in such a way that they can summarize when asked, as they read? Of course, good readers engage in many more thought processes than these, but we could not measure all of them and these four certainly seemed important and measurable.
Why Comprehension Strategy Use? Good readers engage in certain kinds of thought processes when reading (Pressley & Afflerbach, 1995; Duke & Pearson, 2002). There are differences between good and poor readers with respect to strategy use, and the degree or quality of strategy use appears to be related to comprehension achievement (e.g., Duke, Pressley, & Hilden, 2004). The relationship between comprehension strategy use and comprehension achievement seems to be causal, in that instruction in comprehension strategies has been shown to improve comprehension (e.g., Duke & Pearson, 2002; National Reading Panel, 2000). We included inferencing in this area, although it is often spontaneous rather than strategic in the traditional sense.
Knowledge of Informational Text Features Items for this construct are intended to measure whether the child has knowledge of some common features of informational text, including: –Table of Contents –Index –Glossary –Diagrams –Labels –Pronunciation Guide
Why Knowledge of Informational Text Features? There are differences between good and poor readers with respect to knowledge of some text features: text structure, at least (e.g., Dickson, Simmons, & Kame’enui, 1995). There is some causation at work, in that text structure instruction can improve comprehension (e.g., Dickson, Simmons, & Kame’enui, 1995), as can instruction in searching (using an index, headings, etc.) (Symons, MacLatchy-Gaudet, Stone, & Reynolds, 2001). Research has not yet established differences between good and poor readers, or causation, for a number of other informational text features.
Comprehension of Graphics in the Context of Text Items for this construct are intended to measure the child’s understanding of graphics or illustrations within a text, particularly as they relate to the written text. –Can the child integrate information provided by the text and illustrations? –Can the child use information from the written text to help them understand what is depicted in the illustrations? –Can the child derive information from and understand conventions within diagrams, flow charts (e.g., life cycles) and maps -- graphical devices common to informational text?
Why Comprehension of Graphics in the Context of Text? It appears that illustrations can have a facilitative effect on comprehension for at least some readers, although this does not seem to divide neatly along lines of good versus poor readers (see Gyselinck & Tardieu, 1999, for a review). To our knowledge, it has not yet been shown whether informational text comprehension can be improved by instruction in building meaning through illustrations as well as text. For now, we are assuming that the ability to comprehend graphics in the context of text is important and amenable to instruction.
Vocabulary, Strategy Items Items for this construct are intended to measure children’s skills related to learning new words in/from text. –Does the child recognize when the text defines a word or provides other clues to a word’s meaning? (For example, in the sentence “When salmon are one year old they eat small fish called minnows,” minnows is defined by the words small fish.) –If an illustration or a graphical device such as a diagram provides a clue to a word’s meaning, does the child use that to approximate the word’s meaning? –Does the child understand that glossaries are tools that provide definitions of words? These items are not intended to measure whether children already come to the text knowing a word, but rather whether they can figure it out from the text.
Why Vocabulary Strategies? Readers do learn word meanings from context, and higher-ability readers are better at doing this (Swanborn & de Glopper, 1999). It is also increasingly clear that teaching students to use contextual cues to ascertain word meaning, and/or providing practice in doing so, improves vocabulary (Fukkink & de Glopper, 1998; Kuhn & Stahl, 1998). There is little empirical evidence that this instruction translates into improved reading comprehension (see Baumann, in press, for a discussion). However, there is reason to think it might over a longer period of time and with more substantial intervention than has thus far been studied.
Vocabulary, Knowledge Items Items for this construct are intended to measure children’s knowledge of some words commonly used in informational text. Words for this construct are not specific to one particular topic of study but rather are found across a discipline or even many disciplines – words like examine, observe, kinds, and so on. Because measurement terms – that is, terms indicating specific lengths, times, weights, etc. – are common in informational text, knowledge of these is also assessed (e.g., months, years, inches). Unlike the vocabulary strategies items, these items are intended to assess whether the child comes to the text already knowing the word and expecting it might be found in informational text, not whether the child can figure out the meaning of the word from the context.
Why Vocabulary Knowledge? There is a strong relationship between vocabulary knowledge and comprehension achievement (Blachowicz & Fisher, 2000). The relationship seems to be causal, in that instruction in vocabulary has been shown to improve comprehension (e.g., Baumann, Edwards, Boland, Olejnik, & Kame’enui, 2003). We focus on knowledge of high-utility vocabulary because we view these words as important to informational text comprehension, and knowledge of them – unlike knowledge of more topic-specific words such as alimentary or sedimentary – is assumed by many informational texts, even for young readers (e.g., Hiebert, 2005).
The Informational Strategic Cloze Assessment (ISCA) Designed for first, second, and third graders Individually administered Assessment sessions run about 10 minutes / child Text is read aloud with the book in front of the child Child can move the text Questions and prompts are scripted to facilitate consistent administration Student responses are recorded directly on the score sheet to facilitate scoring The ISCA is copyrighted by the Michigan State University Board of Trustees.
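The ISCA’s items are authored by hand, but the basic shape of a cloze task, deleting a target word from running text for the reader to supply or reason about, can be illustrated in a few lines. This is a generic sketch of the cloze format only; the function name and blanking convention are illustrative and are not part of the ISCA.

```python
def make_cloze(sentence: str, target: str, blank: str = "_____") -> str:
    """Replace one target word in a sentence with a blank, keeping any
    trailing punctuation, as in a simple cloze item."""
    out = []
    for word in sentence.split():
        core = word.strip(".,;:!?")  # separate the word from punctuation
        if core == target:
            out.append(word.replace(core, blank))  # keep the punctuation
        else:
            out.append(word)
    return " ".join(out)

item = make_cloze(
    "When salmon are one year old they eat small fish called minnows.",
    "minnows")
# → "When salmon are one year old they eat small fish called _____."
```

A real strategic cloze assessment, of course, chooses which words to delete deliberately, so that filling each blank requires a particular dimension of comprehension.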
THE ISCA The ISCA is available to download free from msularc.org [click on “Who We Are,” then “Principal Investigators,” then “Nell K. Duke,” then “LARC Projects” then the “Informational Comprehension Assessment in the Primary Grades: The Informational Strategic Cloze Assessment (ISCA)™”].
ISCA Dimensions The ISCA assesses the same four hypothesized dimensions of comprehension of informational text: –Comprehension Strategy Use (CS) –Knowledge of Informational Text Features (TF) –Comprehension of Graphics in the Context of Text (GCT) –Vocabulary knowledge of high utility science words and Vocabulary strategies for rarer words (V) Time to play “Guess the ISCA Dimension”
COCA & ISCA Psychometrics Confirmatory factor analysis confirms the four hypothesized dimensions Moderate concurrent validity with the Gates-MacGinitie No ceiling or floor effects for the target age group Adequate reliability of the composite
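For readers curious how “reliability of the composite” is typically quantified, one standard index is Cronbach’s alpha, computed from an examinees-by-items matrix of item scores. The sketch below is the textbook formula, not the COCA/ISCA scoring code:

```python
import numpy as np

def cronbach_alpha(scores) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of composite scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

When the items all rank examinees identically, alpha reaches 1.0; as items become less consistent with one another, alpha falls.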
Our mission is to develop complex literacies across the lifespan through multi-disciplinary research.
Ellin O. Keene began speaking here.
Understanding understanding Observations of students in the process of understanding, including the aha! moments and the more protracted processes involved in understanding texts and concepts Patterns emerged – Outcomes and Dimensions of Understanding (Keene, 2008) We discussed the patterns with students We observed increased “understanding” behaviors
Informal Assessment Two possibilities for informal comprehension assessment –Assessing Comprehension Thinking Strategies (Shell, 2006) – assesses comprehension strategy use – a more formal tool (will discuss at another session) –Observation Record for the Outcomes and Dimensions of Understanding (my focus today)
Observation Record The Outcomes and Dimensions of Understanding Designed to promote more in-depth thinking about text and ideas more frequently – condenses the outcomes and dimensions of understanding Can be used by students and teachers Can be used together (dimensions and outcomes) or separately Is meant to capture comprehension thinking (outcomes) and behaviors (dimensions) over time
Observation Record The Outcomes and Dimensions of Understanding Teachers can keep an Observation Record for each child in a notebook – refer to/add to the record during conferences, observations Students can keep an Observation Record in their Reader’s Notebook, adding to it as they notice their use of outcomes and/or dimensions – it is not meant to be “filled out”
Observation Record The Outcomes and Dimensions of Understanding Teachers can look for patterns in outcomes and dimensions and teach toward less frequently used thinking and behavior Students can use it as a tool to prepare for and reflect upon student-led conversations such as book clubs and literature circles
Observation Record The Outcomes and Dimensions of Understanding Students and teachers can use it to contribute to large and small group discussions and shared thinking records such as anchor charts.
Observation Record The Outcomes and Dimensions of Understanding – sample items – Outcomes (in my mind and heart) I find that I want to stop and think about this book or these ideas – there was just so much to think about that I couldn’t just fly through it quickly. I wanted to stop and think about it – I hoped that other people wouldn’t interrupt me – I felt like I learned so much more when I stopped to think about it or went back to reread it. Dwelling in ideas
Observation Record The Outcomes and Dimensions of Understanding – sample items – Outcomes (in my mind and heart) As I read this book, I began to realize that what I know, believe and feel really helped me understand this book. It made me feel smart to realize that I was right about things I know or that other people have similar feelings and beliefs. Recognizing the influence of knowledge, beliefs, values and emotions
Observation Record The Outcomes and Dimensions of Understanding – sample items – Dimensions (in my life) I just couldn’t stop reading this book or thinking about this idea. It was almost like I didn’t want to do anything else even though there were other interesting things going on. Sustaining interest, focusing, reading fervently, dwelling in an idea
Observation Record The Outcomes and Dimensions of Understanding – sample items – Dimensions (in my life) I know that this book or idea is hard for me, but it’s worth it – I am excited to think about how much I’m learning and how hard I’m willing to work to learn. It makes me want to take on more challenges. Embracing struggle
Pause and Ponder How do you define and describe expectations for students in comprehension? Are your classroom-based assessment practices congruent with your definition and expectations for students’ comprehension?