Presentation on theme: "Instruction in Reading"— Presentation transcript:

1 Instruction in Reading
Using DATA and RESEARCH INTO PRACTICE for Growth and Instruction

2 What is Reading?
"Reading is an active and complex process that involves understanding written text; developing and interpreting meaning; and using meaning as appropriate to type of text, purpose, and situation" (NAEP Framework, 2009). Reading is the single most important educational skill students will learn. As students move from grade level to grade level, text demands increase significantly: vocabulary becomes more demanding, and both stated and unstated information requires closer reading of the text for evidence. Increasing demands are also placed on the reader by the cognitive load of the questions.

3 Two important goals for improvement:
1. Increase the percentage of students reading "at grade level" each year at each grade level from kindergarten through tenth grade. 2. Decrease the percentage of students with serious reading difficulties each year at each grade level. Our most important measure of success in accomplishing these goals is assessing student performance in reading comprehension using an initial screening, a mid-year assessment, and an outcome measure at the end of each grade level. Goal: increase the percentage of students who are reading on level and accelerate student reading achievement so that students of concern make annual growth plus catch-up growth. Use multiple data points for placement into reading intervention.

4 In the last ten years in grades 6-8 we have increased the percent of students scoring FCAT Level 3 and above by 16%. In the same period we have decreased the percent of students scoring FCAT Level 1 by 14%.

5 Ten years ago we had more 9th and 10th graders scoring at FCAT Level 1 than we did at FCAT Level 3 and above. In the last ten years in grades 9-10 we have increased the percent of students scoring FCAT Level 3 and above by 12%, and we have decreased the percent of students scoring FCAT Level 1 by 13%.

6 As a result of our strong statewide accountability and initial staff development opportunities, the percent of students reading at a proficient level and above has, on average, continued to increase each year.

7 Raising Achievement and Closing Gaps Between Students

8 Text complexity is the key to accelerating student achievement in reading.

9 Text Complexity - ACT Study
Purpose: Determine what distinguishes the reading performance of students who are likely to succeed in college from those who are not. Process: Set a benchmark score on the reading test shown to be predictive of success in college (a "21" on the ACT composite score); looked at results from a half million students; and divided texts into three levels of complexity: uncomplicated, more challenging, and complex.

10 Performance on the ACT Reading Test by Comprehension Level
(Averaged across Seven Forms) In Reading: Between the Lines, ACT demonstrates that student performance cannot be differentiated in any meaningful way by question type. Students do not perform differently if they are answering literal recall items or inferential items.

11 Performance on the ACT Reading Test by Textual Element
(Averaged across Seven Forms) ACT demonstrates that student performance cannot be differentiated in any meaningful way by question type. Students do not perform differently if they are answering vocabulary items or main idea items.

12 Text Complexity Matters
Performance on complex texts is the clearest differentiator in reading between students who are more likely to be ready for college and those who are less likely to be ready. Texts used in the ACT Reading Test reflect three degrees of complexity: uncomplicated, more challenging, and complex.

13 Performance on the ACT Reading Test by Degree of Text Complexity
(Averaged across Seven Forms) Test performance, according to ACT, is driven by the text rather than the questions. Thus, if students are asked to read a hard passage, they may answer only a few questions correctly, no matter what type of question is asked. On the other hand, with an easy enough text, students may answer almost any question correctly, again with no differences by question type. In this figure, performance on questions associated with uncomplicated and more challenging texts, both above and below the ACT College Readiness Benchmark for Reading, follows a pattern similar to those in the previous analyses. Improvement on each of the two kinds of questions is gradual and fairly uniform.

14 Recap of ACT Findings
Question type and level (main idea, word meanings, details) is NOT the chief differentiator between students scoring above and below the benchmark. The degree of text complexity in the passages acted as the "sorter" within ACT. The findings held true for both males and females and for all racial groups, and were steady regardless of family income level. What students could read, in terms of its complexity, rather than what they could do with what they read, is the greatest predictor of success. The ACT report goes on to describe features that made some texts harder to understand, including the complexity of the relationships among characters and ideas, the amount and sophistication of the information detailed in the text, how the information is organized, the author's style and tone, the vocabulary, and the author's purpose. ACT concluded that, based on these data, "performance on complex texts is the clearest differentiator in reading between students who are likely to be ready for college and those who are not." FCAT is made up of complex passages and high cognitive demand questions.

Students who arrive behind in reading, or even close to grade level, are often taught through courses that don't demand much reading. Many students are engaged in shallow reading: skimming text for answers, focusing only on details, and failing to make inferences that integrate different parts of the text. Years of reading in this superficial way will cause a student's reading ability to deteriorate. For many students the decline of text demands in the courses they take has both an immediate and a long-term impact on student achievement.

16 The Percent Of Students Who Have Previously Scored A Level 3 Or Higher On FCAT Reading
2011 FCAT Results

Grade | Of students scoring Level 1 on FCAT Reading, % who previously scored Level 3 or higher | Of students scoring Level 2 on FCAT Reading, % who previously scored Level 3 or higher
4 | 21 | 53
5 | 29 | 67
6 | 36 | 76
7 | 31 | 72
8 | 43 | 85
9 | 46 | 87
10 | 58 | 90

58 percent of the students who scored at Level 1 and are now in 10th grade have previously scored FCAT Level 3 or higher, and 90 percent of 10th graders who scored at FCAT Level 2 have previously scored FCAT Level 3 or higher. Only 6 percent of all students tested in 2011 have never scored above FCAT Level 1. Only 6.7% of students who do not pass FCAT pass an alternate graduation reading requirement. The key to increasing success in middle and high school reading achievement is high quality teaching and complex text.

17 What is FAIR? A K-2 assessment system administered to individual students 3 times a year, with electronic scoring, Adobe AIR version, and PMRN reports linked to instructional resources. A 3-12 computer-based system where students take the assessments 3 times a year. Several tasks are adaptive. PMRN reports are available, linked to instructional resources. Printed toolkit available. © 2011 Florida Department of Education

18 The K-2 “Big Picture” Map
Broad Screen/Progress Monitoring Tool (BS/PMT) – "All" students: Letter Naming & Sounds, Phonemic Awareness, Word Reading
Broad Diagnostic Inventory (BDI) – "Some" students, for vocabulary: Listening Comprehension, Reading Comprehension, Vocabulary, Spelling (2nd grade only)
Targeted Diagnostic Inventory (TDI) – "Some" students, some tasks: K = 9 tasks, 1st = 8 tasks, 2nd = 6 tasks
Ongoing Progress Monitoring (OPM) – "Some" students: K-2 = TDI tasks, 1-2 = ORF
Let's review the various components of the K-2 assessment. Notes to clarify what "all" and "some" mean with regard to who must take the Broad Screen and Broad Diagnostic: Progress monitoring must be reported three times per year for students who have been identified with a reading deficiency, based upon locally determined assessments, statewide assessments, or teacher observations. Students identified with a reading deficiency must be given intensive reading instruction immediately following the identification of the reading deficiency. For elementary students not participating in the statewide reading assessment (FCAT), substantial deficiency in reading must be defined by the district school board. For students required to participate in the statewide assessment, a substantial deficiency in reading is defined by scoring Level 1 or Level 2 on the Florida Comprehensive Assessment Test (FCAT) in Reading. The Broad Screen must be administered three times per year to students who have been identified with a reading deficiency. We highly recommend that districts assess all K-2 students, but districts may opt to assess just those students who have been identified with a reading deficiency. The assessments may be delivered to other students that the district identifies. Thus "All" means the students that are rostered on the PMRN.

19 K-2 Targeted Diagnostic Inventory (TDI) Map
Kindergarten: Print Awareness; Letter Name and Sound Knowledge; Phoneme Blending; Phoneme Deletion – Word Parts/Initial; Letter Sound Connection – Initial; Letter Sound Connection – Final; Word Building – Initial Consonants; Word Building – Final Consonants; Word Building – Medial Vowels
First Grade: Letter Sound Knowledge; Phoneme Deletion – Initial; Phoneme Deletion – Final; Word Building – Consonants; Word Building – Vowels; Word Building – CVC/CVCe; Word Building – Blends
Second Grade: Word Building – Blends & Vowels; Multisyllabic Word Reading
The TDI is a set of tasks designed to more precisely indicate the areas of instructional need based upon performance on the Broad Screen. Here is a listing of the tasks by grade level.

20 The K-2 "Score" Map
BS/PMT: PRS = Probability of Reading Success
BDI: LC = Listening Comprehension (total questions correct, implicit/explicit); RC = Reading Comprehension (total questions correct implicit/explicit, fluency, percent accuracy, target passage); VOC = Vocabulary (percentile rank); SPL = Spelling
TDI: ME = Meets Expectations; BE = Below Expectations
OPM: ORF = adjusted fluency; OPM TDI tasks = ME or BE and raw score
When looking at the data provided by this assessment you will see several different types of scores. This table is a snapshot of the types of scores reported for each section of the assessment. You will see that the BDI contains the most score types.

21 Target RC Passages for Grades 1 and 2 (BDI)
Provide participants with the handout. This document provides broad guidelines for teachers to use with the Reading Comprehension task of the BDI. The chart provides teachers with target stories that can be used as a guideline to determine whether a student is meeting developmental reading expectations for each assessment period. Participants received this during their training. During field studies, reading accuracy and comprehension data from a large, representative sample of Florida students were collected to establish these benchmarks, or target stories. Data from the school year were also examined to make refinements.

How were target passages determined? Because of the huge variability in readability formulae on first and second grade text (since "difficulty" is more a matter of within-word linguistic issues than word frequency or sentence length), we took an empirical route. In other words, we used our implementation study data to see what passage the majority of first and second graders read at each AP and designated that as the target passage.

The goal is for students to read and understand the target story. This includes reading accurately (95% of words read correctly), fluently (progressing toward meeting the end-of-year wcpm target goals), and with understanding (answering at least 4 out of 5 comprehension questions). These guidelines were designed to support teachers in setting instructional reading goals for their students. This information, combined with teacher observation and classroom performance, will help with instructional decision making for all students. It is important to note that students who are not successfully reading the target story for a designated assessment period may benefit from additional instructional support. Additional instruction can provide students with the knowledge and skills required to successfully read the target passage by the end of the year.

Based on fluency norms from several research studies (Foorman, York, Santi, & Francis, 2008; Fuchs, Fuchs, Hosp, & Jenkins, 2001; Hasbrouck & Tindal, 2006), it is recommended that teachers use 60 wcpm as a target goal for the end of first grade and 90 wcpm as the target goal for the end of second grade. The passages in the BDI are primarily used to measure comprehension; the OPM ORF passages, which provide the teacher with an adjusted fluency score (adjusted for passage difficulty), are the passages and scores that should be used to measure and monitor growth in fluency (wcpm). We will need to use these target passages when looking at the school status report.

22 Grade 1 PRS Chart ( )

23 Student Score Detail Box (K-2)
This is the full view of the Student Score Detail Box (SSDB). It lists the grade and year on the first line, the student's name and assessment period on the next line, and then the student's scores. In this box is all of the same information presented on the main status report for the individual student, as well as those scores not displayed on the main page. It includes all scores for the Broad Screen and BDI, but on this report comprehension questions are broken down by implicit/explicit. This is also where a teacher will find the raw scores for a student's responses on the TDI. It can be printed from the pop-up box link (point to the print button at the bottom), or the teacher can print the detail box for each student in the whole class with one link from the class status report. This is also an excellent report to include in a student's cumulative folder.

24 74% of students with a .85 PRS at the end of 2nd grade in SY 09-10
Just Read, Florida!, along with staff from the Florida Center for Reading Research, reviewed FAIR data for 2nd graders from school year 09-10 who had a Probability of Reading Success (PRS) of .85. We followed this cohort into grade 3 to see how they performed on FCAT Reading last school year. Results: 74% of students with a .85 PRS at the end of 2nd grade in SY 09-10 scored FCAT Reading Level 3 or above in 3rd grade in SY 10-11. 15% of students with a .85 PRS at the end of 2nd grade in SY 09-10 scored FCAT Reading Level 1 in 3rd grade in SY 10-11. 11% of students who had a .85 PRS at the end of 2nd grade in SY 09-10 scored FCAT Reading Level 2 in 3rd grade in SY 10-11. As we move into FCAT 2.0, the best way to reduce failures on the grade 3 FCAT is to target instruction earlier, in grades K-2. Waiting to address reading difficulties in grade 3 is too late.

25 Grades 3-12 Assessments Model
Broad Screen/Progress Monitoring Tool – Reading Comprehension Task (3 times a year)
Targeted Diagnostic Inventory – Maze & Word Analysis Tasks
Diagnostic Toolkit (as needed)
Ongoing Progress Monitoring (if necessary)
This slide gives you a visual of the flow of the 3-12 WAM. You can see the four components of the assessment module that we just mentioned. Today we'll take a much closer look at each of these four components. RC Screen: helps us identify students who may not be able to meet the grade-level literacy standards at the end of the year, as assessed by the FCAT, without additional targeted literacy instruction. Mazes: help us determine whether a student has more fundamental problems in the area of text reading efficiency and low-level reading comprehension. Word Analysis: helps us learn more about a student's fundamental literacy skills, particularly those required to decode unfamiliar words and read accurately. © 2011 Florida Department of Education

26 Purpose of Each 3-12 Assessment
RC Screen Helps us identify students who may not be able to meet the grade level literacy standards at the end of the year as assessed by the FCAT without additional targeted literacy instruction.  Mazes Helps us determine whether a student has more fundamental problems in the area of text reading efficiency and low level reading comprehension. Relevant for students below a 6th grade reading level.  Word Analysis Helps us learn more about a student's fundamental literacy skills--particularly those required to decode unfamiliar words and read and write accurately.  Let’s review the purpose of each type of assessment which we briefly discussed on the big picture map. 26

27 How is the student placed into the first passage/item?
Task Placement Rules
Reading Comprehension – adaptive. The first passage the student receives is determined by grade level.
Maze – not adaptive. Two predetermined passages based on grade level and assessment period (AP).
Word Analysis (WA) – adaptive. AP 1-3 starts with a predetermined set of 5 words based on grade level. Student performance on this first set of 5 words determines the next words the student receives; 5-30 words are given at each assessment period based on ability.
Currently the placement estimate for Reading Comprehension is based on the student's grade level and their prior-year FCAT. If the student does not have a prior year's FCAT, then the mean FCAT score for that school and grade level is used instead. For AP 2 and 3, the first passage the student receives is based on the student's prior FSP; if the child has not taken the RC Screen before, then the logic for AP 1 is used. All students taking the Maze task receive two passages that are predetermined based on grade level and assessment period (AP); for example, all students in grade 7 at AP 2 would receive the same two passages. On the Word Analysis task, a predetermined set of 5 words is given based upon grade level at each assessment period. How the student performs on these first five words determines an estimate of ability; the student is then given harder or easier words based on that estimate until the data provide a reliable estimate of ability. The minimum number of words a student will receive is five and the maximum is thirty.
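To make the Word Analysis placement loop concrete, here is a minimal sketch of an adaptive word-selection routine of the kind described above. It is not the actual FAIR/PMRN algorithm: the ability estimate, the "reliable enough" cutoff, the step-up/step-down difficulty rule, and all function and parameter names are illustrative assumptions; only the sets of five words and the 5-30 word window come from the slide.

```python
# Minimal sketch of an adaptive word-placement loop like the one described for
# the Word Analysis task. NOT the actual FAIR/PMRN algorithm: the ability
# estimate, the precision cutoff, and the difficulty step rule are invented
# for illustration; only the 5-word sets and the 5-30 word range come from
# the slide.

from statistics import mean

def adaptive_word_analysis(answer, word_bank, start_level, max_words=30, se_cutoff=0.3):
    """Administer 5-30 words, stepping difficulty up or down after each set of five.

    answer(word)  -> True/False: did the student read the word correctly?
    word_bank     -> dict mapping an integer difficulty level to a list of words
    start_level   -> grade-based starting difficulty level (an assumption here)
    """
    level = start_level
    results = []                                   # (difficulty level, correct?) pairs
    while len(results) < max_words:
        # Administer the next set of five words at the current difficulty level.
        for word in word_bank[level][:5]:
            results.append((level, answer(word)))
        # Crude ability estimate: mean difficulty of the items answered correctly.
        correct_levels = [lvl for lvl, ok in results if ok]
        ability = mean(correct_levels) if correct_levels else level - 1
        # Crude precision proxy: more items administered -> smaller "standard error".
        if (1.0 / len(results) ** 0.5) <= se_cutoff:
            break                                  # estimate judged reliable; stop early
        # Step harder after a strong set of five, easier otherwise, staying in range.
        recent_correct = sum(ok for _, ok in results[-5:])
        level += 1 if recent_correct >= 4 else -1
        level = min(max(level, min(word_bank)), max(word_bank))
    return ability, len(results)
```

With these made-up settings the loop stops after about 15 words once the precision rule is satisfied and never administers more than 30, mirroring the 5-30 word window described above.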

28 How is the student placed into subsequent passages?
Based on the difficulty of the questions the student answers correctly on the first passage, the student will then be given a harder or easier passage for their next passage. Difficulty of an item is determined using Item Response Theory (IRT). Because of this feature, the raw score of 7/9 for Student A and 7/9 for Student B, when reading the same passage, does not mean they will have the same converted scores.
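To illustrate why identical raw scores can convert to different scores, here is a minimal sketch using a two-parameter IRT model. The two-parameter form, the item difficulties and discriminations, and the grid-search estimator are all assumptions made for this example; the slide only says that difficulty is determined using IRT, so treat this purely as an illustration of pattern-sensitive scoring rather than the FAIR scoring model.

```python
# Illustrative two-parameter IRT sketch showing why two students with the same
# 7/9 raw score on the same passage can receive different converted scores.
# The item parameters and the grid-search estimator are assumptions for this
# example only; they are not the FAIR/PMRN scoring model.

import math

def p_correct(theta, a, b):
    """2PL model: probability of a correct response given ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_ability(responses, items):
    """Maximum-likelihood ability estimate over a coarse grid of candidates."""
    def log_likelihood(theta):
        total = 0.0
        for correct, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            total += math.log(p if correct else 1.0 - p)
        return total
    grid = [x / 10 for x in range(-40, 41)]        # candidate abilities -4.0 .. 4.0
    return max(grid, key=log_likelihood)

# Nine items for one passage: (discrimination, difficulty), ordered easy to hard.
items = [(0.8, -2.0), (1.6, -1.5), (0.9, -1.0), (1.4, -0.5), (1.0, 0.0),
         (1.5, 0.5), (0.7, 1.0), (1.8, 1.5), (1.1, 2.0)]

# Both students score 7/9, but they miss different items.
student_a = [1, 1, 1, 1, 1, 1, 1, 0, 0]   # misses the two hardest items
student_b = [0, 0, 1, 1, 1, 1, 1, 1, 1]   # misses the two easiest items

print(estimate_ability(student_a, items))  # the two estimates differ
print(estimate_ability(student_b, items))  # despite identical raw scores
```

Under a pattern-sensitive model like this one, which items a student answers correctly matters, not just how many, which is why the two 7/9 students above end up with different ability estimates.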

29 The 3-12 "Big Picture" Map (Type of Assessment – Name of Assessment)
Broad Screen/Progress Monitoring Tool (BS/PMT) – appropriate for "All" students: Reading Comprehension (RC)
Targeted Diagnostic Inventory (TDI) – "Some" students: Maze, Word Analysis (WA)
Ongoing Progress Monitoring (OPM) – "Some" students: ORF, RC
Informal Diagnostic Toolkit (Toolkit) – "Some" students: Phonics Inventory, Academic Word Inventory, Lexiled Passages, Scaffolded Discussion Templates
Review the information to identify the components of the 3-12 assessment for administrators. Point out that the 3-12 TDI does not target specific skills like the TDI in K-2, but gives a general direction; teachers then need to do a bit more digging with the OPM and the informal toolkit to determine specific areas of need. Remind participants that the Broad Screen and TDI are progress monitoring tools administered three times a year, and that the OPM is progress monitoring for those students who require monitoring more often than three times a year. The FLDOE has put the plan to have RC OPM available on hold due to limited funds for programming and demands on computational resources. The RC OPM mimics the Broad Screen: each month, the student would take a set of passages, one informational/expository and one narrative. The slide says "appropriate for all students" even though the assessment is required only for Level 1 and 2 students because "progress monitoring is required for all students scoring at Level 1 and 2 on the prior year's FCAT Reading. The assessments may be delivered to other students that the district identifies. Thus 'All' means the students that are rostered on the PMRN."

30 The 3-12 "Score" Map
Reading Comprehension (BS/PMT): FCAT Success Probability (FSP, color-coded), Percentile, Standard Score, Lexile® Ability Score and Ability Range, FCAT Reporting Categories
Maze (TDI): Adjusted Maze Score
Word Analysis (TDI): Ability Score (WAAS)
OPM: RC – Ability Score, Ability Range, Reporting Categories; Maze – Adjusted Maze Score; ORF (3rd-5th) – Adjusted Fluency Score
Here is a listing of the scores you will receive from the PMRN on each of the tasks. We will briefly discuss each one and what they mean.

31 Lexile® Measure
Two types of Lexile measures:
Lexile reader measure – represents a person's reading ability on the Lexile scale (this is what you will see on your reports); it has nothing to do with the Lexile of the passage.
Lexile text measure – indicates the reading demand of the text in terms of word frequency and sentence length.
Range of uncapped Lexile measures on FAIR:
Range around the Lexile measure = -100 and +50 (e.g., 600L, 500 – 650L)
Review the slide. Highlight that the Lexile reader measure is the score reported by the PMRN. The student's Lexile score will be reported as a single number as well as a range.
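As a concrete example of the reported range described above, a reader measure of 600L is reported with a range from 100L below to 50L above the measure, i.e. 500L-650L. A trivial sketch follows; the function name is illustrative and is not a PMRN API.

```python
# Sketch of the reported-range rule described above: the reported range runs
# from 100L below to 50L above the student's Lexile reader measure.
# The function name is illustrative, not a PMRN API.

def lexile_reported_range(reader_measure):
    """Return the (low, high) Lexile range around a reader measure."""
    return reader_measure - 100, reader_measure + 50

low, high = lexile_reported_range(600)
print(f"{low}L - {high}L")   # prints "500L - 650L", matching the slide's example
```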

32 Student Score Detail Box (3-12)
This is the full view of the Student Score Detail Box (SSDB). It lists the grade and year on the first line, the student's name and assessment period on the next line, and then the student's scores. In this box is all of the same information presented on the main status report for the individual student, as well as those scores not displayed on the main page. "Student 5" would be replaced with the student's name, e.g., "John Doe's scores". It includes all scores for the Reading Comprehension task (FSP, Lexile, Percentile, Standard Score, Ability Score, and FCAT content areas), all scores for the Maze (Percentile, Standard Score, and Adjusted Maze Score), and for the WA (Percentile, Standard Score, and WAAS Word Analysis Ability Score). This is also where a teacher will find the student's incorrect responses on the Word Analysis task; it lists them as the target word and then the student response. OPM scores are not on this report because this report lists scores for a specific assessment period. It can be printed from the pop-up box link (point to the print button at the bottom), or the teacher can print the detail box for each student in the whole class with one link from the class status report. This is also an excellent report to include in a student's cumulative folder.

33 Table 1: Correlations between the FCAT and both the RC Screen and FSP
(Columns: RC Screen / FSP for the fall, winter, and spring assessment periods)
Grade 3: fall .64 / .62, winter .75 / .73, spring .78 / .76
Grade 4: .66, .73, .77
Grade 5: .69, .79
Grade 6: .70, .72, .74
Grade 7: .71
Grade 8:
Grade 9:
Grade 10: .67

34 Table 2: Screening accuracy of the FAIR in predicting FCAT success
(Columns: FSP cut points of 0.85 and 0.70 for the fall, winter, and spring assessment periods; final column: % scoring below Level 3 on FCAT)
Grade 3: .99 | 28
Grade 4: .95, .97, .98, .96 | 31
Grade 5: .94 | 33
Grade 6: | 40
Grade 7: .92 | 38
Grade 8: .82, .91, .81 | 51
Grade 9: .88, .87 | 59
Grade 10: .90, .80 | 69

35 The Common Core State Standards Text Complexity
Focus on four strands (reading, writing, speaking and listening, and language)
The benefits of an integrated literacy approach (both in terms of reaching out to content areas beyond ELA and also in terms of research and media skills being integrated into the four strands)
A focus on results rather than means ("the Standards leave room for teachers, curriculum developers, and states to determine how those goals should be reached and what additional topics should be addressed," p. 4)

36 Common Core State Standards Text Complexity
The Common Core State Standards places a strong emphasis on the role of text complexity in evaluating student readiness for college and careers. “The Common Core State Standards hinge on students encountering appropriately complex texts at each grade level in order to develop the mature language skills and the conceptual knowledge they need for success in school and life.” (p. 3)

37 Advantages to Common Core Standards
A focus on college and career readiness.
Inclusion of the four strands of English Language Arts: reading, writing, listening and speaking, and language.
The benefits of an integrated literacy approach – all educators have a shared responsibility for literacy instruction, regardless of discipline or content area.
A focus on results rather than means – "the Standards leave room for teachers, curriculum developers, and states to determine how those goals should be reached and what additional topics should be addressed" (p. 4).
Efficiencies of scale – common standards allow for greater collaboration among states in the areas of professional development, resource development, and teaching tools.

38 Text Complexity
Included within the Standards is an enhanced focus on text complexity. Specifically, within reading standard #10:
Anchor Standard R.CCR.10: Read and comprehend complex literary and informational texts independently and proficiently.
Example grade-level standard (6th grade), RI.6.10: By the end of the year, read and comprehend literary nonfiction in the grades 6-8 text complexity band proficiently, with scaffolding as needed at the high end of the range.
As stated in the Standards (note on range and content of student reading): To build a foundation for college and career readiness, students must read widely and deeply from among a broad range of high-quality, increasingly challenging literary and informational texts. Through extensive reading of stories, dramas, poems, and myths from diverse cultures and different time periods, students gain literary and cultural knowledge as well as familiarity with various text structures and elements. By reading texts in history/social studies, science, and other disciplines, students build a foundation of knowledge in these fields that will also give them the background to be better readers in all content areas. Students can only gain this foundation when the curriculum is intentionally and coherently structured to develop rich content knowledge within and across grades. Students also acquire the habits of reading independently and closely, which are essential to their future success.

39 Guiding Questions
What do the Common Core Learning Standards mean by text complexity? What is a text complexity band? And how do we ensure the texts our students are reading are in the appropriate text complexity band? This presentation seeks to answer these questions.

40 Overview of Text Complexity
Text complexity is defined by:
Qualitative measures – levels of meaning, structure, language conventionality and clarity, and knowledge demands; often best measured by an attentive human reader.
Quantitative measures – readability and other scores of text complexity; often best measured by computer software.
Reader and task considerations – background knowledge of the reader, motivation, interests, and complexity generated by tasks assigned; often best judged by educators employing their professional judgment.

41 Quantitative Measures Ranges for Text Complexity Grade Bands
Common Core State Standards Quantitative Measures Ranges for Text Complexity Grade Bands What is a text complexity band?

42 Quantitative Measures Ranges for Text Complexity Grade Bands
Common Core State Standards Quantitative Measures Ranges for Text Complexity Grade Bands

Text Complexity Grade Band | Suggested Lexile Range | Suggested ATOS Book Level Range**
K-1 | |
2-3 | 450L – 790L | 2.0 – 4.0
4-5 | 770L – 980L | 3.0 – 5.7
6-8 | 955L – 1155L | 4.0 – 8.0
9-10 | 1080L – 1305L | 4.6 – 10.0
11-CCR | 1215L – 1355L | 4.8 – 12.0

What is a text complexity band?

43 Where do we find texts in the appropriate text complexity band?
We could choose an excerpt of text from Appendix B as a starting place, or use available resources to determine the text complexity of other materials on our own. (Even choosing excerpts from Appendix B is less effective, because it removes the reader and task considerations from the equation.)

44 Determining Text Complexity
A Four-step Process:
1. Determine the quantitative measures of the text.
2. Analyze the qualitative measures of the text.
3. Reflect upon the reader and task considerations.
4. Recommend placement in the appropriate text complexity band.

45 Step 1: Quantitative Measures
Measures such as: Word length Word frequency Word difficulty Sentence length Text length Text cohesion

46 Step 1: Quantitative Measures
The Quantitative Measures Ranges for Text Complexity: This document outlines the suggested ranges for each of the text complexity bands using: Lexile Text Measures ---or--- ATOS Book Levels (Accelerated Reader)

47 Step 1: Quantitative Measures
Let's imagine we want to see where a text falls on the quantitative measures "leg" of the text complexity triangle, using either the Lexile text measure or the ATOS book level (or both). For illustrative purposes, let's choose the text Narrative of the Life of Frederick Douglass.

48 Step 1: Quantitative Measures
Lexile Text Measure: 1080L ATOS Book Level: 7.9 In which of the text complexity bands would this text fall?

49 Quantitative Measures Ranges for Text Complexity Grade Bands
Common Core Learning Standards Quantitative Measures Ranges for Text Complexity Grade Bands

Text Complexity Grade Band | Suggested Lexile Range | Suggested ATOS Book Level Range**
K-1 | 100L – 500L* | 1.0 – 2.5
2-3 | 450L – 790L | 2.0 – 4.0
4-5 | 770L – 980L | 3.0 – 5.7
6-8 | 955L – 1155L | 4.0 – 8.0
9-10 | 1080L – 1305L | 4.6 – 10.0
11-CCR | 1215L – 1355L | 4.8 – 12.0

What is a text complexity band?
* The K-1 suggested Lexile range was not identified by the Common Core State Standards and was added by Kansas.
** Taken from Accelerated Reader and the Common Core State Standards, available at the following URL:
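As a worked example of reading the table above, here is a small sketch that lists every grade band whose suggested range contains a given measure. The band boundaries are copied from the table; the function and variable names are illustrative. Applied to the Narrative of the Life of Frederick Douglass figures from the earlier slide (1080L, ATOS 7.9), it shows that the quantitative measures alone can place a text in more than one band, which is one reason the qualitative and reader-and-task steps still matter.

```python
# Sketch of a band lookup against the suggested ranges in the table above.
# Band boundaries come from the table; the function and variable names are
# illustrative. Because the suggested ranges overlap, a single measure can
# fall into more than one band, so this lookup is only Step 1 of the process.

LEXILE_BANDS = {
    "K-1":    (100, 500),    # * added by Kansas, not by the CCSS
    "2-3":    (450, 790),
    "4-5":    (770, 980),
    "6-8":    (955, 1155),
    "9-10":   (1080, 1305),
    "11-CCR": (1215, 1355),
}

ATOS_BANDS = {
    "K-1":    (1.0, 2.5),
    "2-3":    (2.0, 4.0),
    "4-5":    (3.0, 5.7),
    "6-8":    (4.0, 8.0),
    "9-10":   (4.6, 10.0),
    "11-CCR": (4.8, 12.0),
}

def bands_containing(measure, bands):
    """Return every grade band whose suggested range contains the measure."""
    return [band for band, (low, high) in bands.items() if low <= measure <= high]

# Narrative of the Life of Frederick Douglass (figures from the earlier slide).
print(bands_containing(1080, LEXILE_BANDS))   # Lexile text measure 1080L
print(bands_containing(7.9, ATOS_BANDS))      # ATOS book level 7.9
```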

50 Step 1: Quantitative Measures
Remember, however, that the quantitative measures are only the first of three "legs" of the text complexity triangle. Our final recommendation may be validated, influenced, or even overruled by our examination of qualitative measures and the reader and task considerations.

51 Step 1: Quantitative Measures
Additional Resources
Lexile Measures and the Common Core State Standards
Accelerated Reader and the Common Core State Standards
Coh-Metrix – calculates the coherence of texts on a wide range of measures. It replaces common readability formulas by applying the latest in computational linguistics and linking this to the latest research in psycholinguistics.

52 Step 2: Qualitative Measures
Measures such as: Structure Language Demands and Conventions Knowledge Demands Levels of Meaning/Purpose

53 Common Core Standards Qualitative Features of Text Complexity
Structure (could be story structure and/or form of piece)
Simple → Complex
Explicit → Implicit
Conventional → Unconventional
Events related in chronological order → Events related out of chronological order (chiefly literary texts)
Traits of a common genre or subgenre → Traits specific to a particular discipline (chiefly informational texts)
Simple graphics → Sophisticated graphics
Graphics unnecessary or merely supplemental to understanding the text → Graphics essential to understanding the text and may provide information not elsewhere provided

54 Common Core Standards Qualitative Features of Text Complexity
Language Demands: Conventionality and Clarity
Literal → Figurative or ironic
Clear → Ambiguous or purposefully misleading
Contemporary, familiar → Archaic or otherwise unfamiliar
Conversational → General academic and domain-specific
Light vocabulary load: few unfamiliar or academic words → Many words unfamiliar and high academic vocabulary present
Sentence structure straightforward → Complex and varied sentence structures
Though vocabulary can be measured by quantifiable means, it is still a feature for careful consideration when selecting texts. Though sentence length is measured by quantifiable means, sentence complexity is still a feature for careful consideration when selecting texts.

55 Common Core Standards Qualitative Features of Text Complexity
Knowledge Demands: Life Experience (literary texts)
Simple theme → Complex or sophisticated themes
Single theme → Multiple themes
Common everyday experiences or clearly fantastical situations → Experiences distinctly different from one's own
Single perspective → Multiple perspectives
Perspective(s) like one's own → Perspective(s) unlike or in opposition to one's own

56 Common Core Standards Qualitative Features of Text Complexity
Knowledge Demands: Cultural/Literary Knowledge (chiefly literary texts)
Everyday knowledge and familiarity with genre conventions required → Cultural and literary knowledge useful
Low intertextuality (few if any references/allusions to other texts) → High intertextuality (many references/allusions to other texts)

57 Common Core Standards Qualitative Features of Text Complexity
Levels of Meaning (chiefly literary texts) or Purpose (chiefly informational texts)
Single level of meaning → Multiple levels of meaning
Explicitly stated purpose → Implicit purpose, may be hidden or obscure

58 Step 2: Qualitative Measures
The Qualitative Measures Rubrics for Literary and Informational Text: The rubric for literary text and the rubric for informational text allow educators to evaluate the important elements of text that are often missed by computer software that tends to focus on more easily measured factors.

59 Step 2: Qualitative Measures
Because the factors for literary texts are different from those for informational texts, these two rubrics contain different content. However, the formatting of each document is exactly the same. And because these factors represent continua rather than discrete stages or levels, numeric values are not associated with these rubrics. Instead, six points along each continuum are identified: not suited to the band, early-to-mid grade level, mid-to-end grade level, early-to-mid grade level, mid-to-end grade level, not suited to the band.

60 Step 2: Qualitative Measures
How is the rubric used? And how would Narrative of the Life of Frederick Douglass fare when analyzed through the lens of the text rubric?

61 Step 2: Qualitative Measures

62 Step 3: Reader and Task Considerations such as: Motivation
Knowledge and experience Purpose for reading Complexity of task assigned regarding text Complexity of questions asked regarding text

63 Step 3: Reader and Task Ten Guiding Principles
1. Make close reading and rereading of texts central to lessons.
2. Provide scaffolding that does not preempt or replace text.
3. Ask text-dependent questions from a range of question types.
4. Emphasize students supporting answers based upon evidence from the text.
5. Provide extensive research and writing opportunities (claims and evidence).

64 Step 3: Reader and Task Ten Guiding Principles
6. Offer regular opportunities for students to share ideas, evidence and research.
7. Offer systematic instruction in vocabulary.
8. Ensure wide reading from complex text that varies in length.
9. Provide explicit instruction in grammar and conventions.
10. Cultivate students' independence.

65 Text Complexity Key to Student Reading Success
Text complexity matters because…. “making textbooks easier ultimately denies students the very language, information, and modes of thought they need most to move up and on.” -Marilyn Jager Adams

66 Text Requirements in Middle and High School
Many students are engaged in shallow reading: skimming text for answers, focusing only on details, and failing to make inferences that integrate different parts of the text. Years of reading in this superficial way will cause a student's reading ability to deteriorate. For many students the decline of text demands in the courses they take has both an immediate and a long-term impact on student achievement.

67 What Are We Doing To Accelerate Success?

68 Just Read, Florida! New Professional Development The Comprehension Instructional Sequence
An instructional model based upon research evidence, introduced this year to Florida's teachers. The model assists teachers of students in grades 6-12 in implementing whole-class examination of difficult texts and in building students' specialized knowledge. This sequence helps students grasp textual nuances they would not understand on their own. It is a "text-dependent" approach that ensures close examination of key text details and utilizes complex text. Last school year JRF developed state-of-the-art professional development that alters the way teachers plan and deliver instruction. The quality of the teacher matters; teaching reading is the job of an expert. Middle and high school teachers need additional professional development on the following:
• Close reading and rereading of text with text-based questioning; a significant percentage of questions/tasks must be text dependent.
• Students are guided to analyze dense arguments and information at the heart of complex literary non-fiction.
• Questions and tasks require the use of textual evidence, including supporting logical inferences from the text.
• Questions and tasks require careful comprehension of the text before asking for further connections, evaluation, or interpretation.
• Rather than emphasizing more general strategies and questions, text-specific questions and tasks reinforce focus on the text and cultivate independence.
Teaching Students to Think as They Read

69 New: Next Generation Content Area Reading Professional Development
Facilitates the type of instruction needed to yield high outcomes in literacy for all students. Uses close reading, text based questions, text based discussions, and writing in response to reading to focus students on reading text closely to draw evidence from the text. Emphasizes reading deeply in multiple disciplines. Comprehension strategies are taught in an integrated fashion with instructional coherence and direct application. Fosters respect for the discipline and content while providing the necessary scaffolds for students to extract the meaning with deep understanding of the content being taught. This summer staff from JRF provided 4 days of staff development to train cadres of teachers in every school district. This staff development is provided so that teachers can become the reading intervention teacher of record for many students scoring at Level 2 on FCAT Reading. This staff development is built on the Comprehension Instructional Sequence. Funding is needed for follow-up professional development.

70 Additional Resources Appendix A - Qualitative Rubric for Text Complexity Appendix B - Common Core State Standards Text Exemplars

