
Institute on Beginning Reading Day 2: Evaluating Performance: Schoolwide Assessment of Student Performance.


1 Institute on Beginning Reading Day 2: Evaluating Performance: Schoolwide Assessment of Student Performance

2 Content Development
Good, Harn, Kame'enui, Simmons, & Coyne © 2003

Content developed by:
- Roland H. Good, Ph.D., and Beth Harn, Ph.D., College of Education, University of Oregon
- Edward J. Kame'enui, Ph.D., and Deborah C. Simmons, Ph.D., Professors, College of Education, University of Oregon
- Michael D. Coyne, Ph.D., University of Connecticut

Prepared by: Patrick Kennedy-Paine and Katie Tate, University of Oregon

3 Acknowledgments
- Oregon Department of Education
- U.S. Department of Education, Office of Special Education Programs
- Bethel School District, Eugene, Oregon: Dr. Drew Braun, Dr. Carl Cole, Lori Smith, Rhonda Wolter, administrators, staff, and students
- Dr. Sharon Vaughn, University of Texas at Austin, Texas Center for Reading and Language Arts

4 Permissions
- Some video clips are used with the permission of Reading Rockets, a project of the Greater Washington Educational Telecommunications Association (WETA).
- More information is available at: http://www.ReadingRockets.org/

5 Copyright
- All materials are copyrighted and should not be reproduced or used without the express permission of Dr. Edward J. Kame'enui or Dr. Deborah C. Simmons. Selected slides were reproduced from other sources, and the original references are cited.

6 Objectives: What You Will Learn and Do
The objectives of today's session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class, and student levels.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.

7 Today's Focus: Guiding Questions
1. Goals: What outcomes do we want for our students in our state, district, and schools?
2. Knowledge: What do we know, and what guidance can we gain from scientifically based reading research?
3. Progress Monitoring Assessment: How are we doing? What is our current level of performance as a school? As a grade? As a class? As an individual student?
4. Outcome Assessment: How far do we need to go to reach our goals and outcomes?
5. Core Instruction: What are the critical components that need to be in place to reach our goals?
6. Differentiated Instruction: What more do we need to do, and what instructional adjustments need to be made?

8 IBR Foundational Features: Translating Research into Practice
- Schoolwide: Each and All
- Prevention Oriented
- Scientifically Based
- Results Focused

9 Building an Effective Reading Program for All Students: Essential Components
For each student: Instruction, Goals, Assessment
For all students, assessment must be:
- Efficient
- Informative at the school, class, and individual levels

10 Start-Up Activity: Reviewing Day 1
Answer the following questions based on what you learned in Day 1.
1. By implementing scientifically based instructional practices within a prevention model, we will enable more students to be ________.
2. The goal of the schoolwide reading model is to:
a) Help schools build capacity for and sustained use of scientifically based practices specifically tailored to their school
b) Maximize the number of students who are readers by the end of grade 3
c) Prevent individual children from experiencing reading frustration by improving instruction for all
d) All of the above

11 Start-Up Activity: Reviewing Day 1
Answer the following questions based on what you learned in Day 1.
3. What is the primary assessment system we will use to evaluate our school's progress in meeting the early literacy and reading needs of all children? _______________
4. One way of achieving our goals is to systematically pace our instruction of the big ideas. We can determine when to introduce and how to sequence key instructional objectives by using:
a) Lock-step following of the curricular program without linkage to student learning
b) Curriculum maps
c) Our instincts

12 Objectives: What You Will Learn and Do
The objectives of today's session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class, and student levels.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.

13 Reading Assessment for Different Purposes
An effective, comprehensive reading program includes reading assessments for four purposes:
- Outcome: Provides a bottom-line evaluation of the effectiveness of the reading program in relation to established performance levels.
- Screening: Designed as a first step in identifying children who may be at high risk for delayed development or academic failure and in need of further diagnosis of their need for special services or additional reading instruction.

14 Reading Assessment for Different Purposes
An effective, comprehensive reading program includes reading assessments for four purposes:
- Diagnosis: Helps teachers plan instruction by providing in-depth information about students' skills and instructional needs.
- Progress Monitoring: Determines through frequent measurement whether students are making adequate progress or need more intervention to achieve grade-level reading outcomes.

15 Role of Assessment
- Role of Assessment: Video of Dr. Edward Kame'enui
- Purpose of Timely Assessment

16 Role of Assessment
- Role of Assessment: Video of Dr. Edward Kame'enui
- Purpose of Timely Assessment: assessing the quality of our investment
- How well do we want the lowest reader in each grade to read?
  - 1st Grade: 40 wpm minimum, 60 wpm desirable
  - 2nd Grade: 90 wpm
  - 3rd Grade: 110 wpm
- What is the significance of reading this well? It is a good indicator of comprehension.

17 Outcome Assessment
- Purpose: To determine the level of proficiency in relation to a norm or criterion.
- When: Typically administered at the end of the year; can be administered pre/post to assess overall growth.
- Who: All students.
- Relation to instruction: Provides an index of overall efficacy but limited timely instructional information.

18 Screening Assessment
- Purpose: To identify children who are likely to require additional instructional support (predictive validity).
- When: Early in the academic year or when new students enter school.
- Who: All students.
- Relation to instruction: Most valuable when used to identify children who may need further assessment or additional instructional support.

19 Diagnostic Assessment
- Purpose: To provide specific information on the skill and strategy needs of individual students.
- When: Following screening, or at points during the year when students are not making adequate progress.
- Who: Selected students, as indicated by screening or progress monitoring measures or by teacher judgment.
- Relation to instruction: Provides specific information on target skills; highly relevant.

20 Progress Monitoring Assessment
- Purpose: Frequent, timely measures to determine whether students are learning enough of the critical skills.
- When: At minimum three times per year, at critical decision-making points.
- Who: All students.
- Relation to instruction: Indicates which students require additional assessment and intervention.

21 Objectives: What You Will Learn and Do
The objectives of today's session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class, and student levels.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.

22 Purposes of Assessment in the Schoolwide Model
"Teaching without assessment is like driving a car without headlights."
Assessment for all children must:
1. Focus on essential, important skills
2. Be instructionally relevant
3. Be efficient to administer
4. Be sensitive to change in skill performance
5. Measure fluency of performance
DIBELS provides the feedback to ensure our program is meeting the needs of all children.

23 Essential Features of DIBELS (Dynamic Indicators of Basic Early Literacy Skills): Preventing Reading Difficulties Through Early Identification
- Dynamic: responsive to changes in student performance
  - Identifies students who need additional support
  - Evaluates student response to intervention
- Indicators: focused on an essential skill
  - Enables assessment to be efficient
- Basic Early Literacy Skills: relevant to instructional planning
  - Links essential literacy skills to prevent reading failure

24 Relation of DIBELS to Purposes of Assessment
Utility of DIBELS:

Purpose of Assessment    Utility
Screening                Yes
Progress Monitoring      Yes
Diagnostic               Possibly, with expert teachers
Outcome                  Selected measures

25 The Need for Results-Focused Assessment
- Instructional time is precious: we need to spend time teaching, not testing.
  - DIBELS measures do not assess all aspects of reading
  - Short-duration, fluency-based measures
- Some skills are more important than others. DIBELS:
  - Assesses skills predictive of later reading proficiency
  - Provides timely feedback to schools and teachers to enable responsive instruction
  - Allows early identification of students who need instructional support
  - Assesses whether children are learning enough

26 Acknowledgments
University of Oregon research team that developed the DIBELS measures:
- Primary researchers: Roland Good, Ruth Kaminski
- Contributing researchers: Scott Baker, John Bratten, Shaheen Chowdri, Cheri Cornachione, Patricia Coyne, Shanna Davis, Hank Fien, Kathleen Fleming, Jerry Gruba, Lisa Habedank Stewart, Beth Harn, Diane Hill, Rachell Katz, Jennie Knutson, Katherine Kohler, Debby Laimon, Elida Lopez, Ambre ReMillard, Karen Rush, Dawn Sheldon-Johnson, Mark Shinn, Michelle Shinn, Sylvia Smith, David VanLoo, Joshua Wallin, Jennifer Watson

27 Acknowledgments
DIBELS research was supported and funded by the Early Research Institute on Measuring Growth and Development (H180M10006) and Student-Initiated Grants (H023B90057; 90CD0819; H023B90057), funded by the U.S. Department of Education, Special Education Programs.
Further information and research on the measures is available at: http://dibels.uoregon.edu

28 What DIBELS Assess: Critical Outcomes and Indicators
The NRP and NRC reports identified five essential skills, or "Big Ideas":
- Phonological Awareness: The ability to hear and manipulate sounds in words.
- Alphabetic Principle: The ability to associate sounds with letters and use these sounds to read words.
- Accuracy and Fluency with Connected Text: The effortless, automatic ability to read words in connected text to develop understanding.
- Vocabulary: The ability to understand (receptive) and use (expressive) words to acquire and convey meaning.
- Comprehension: The complex cognitive process involving the intentional interaction between reader and text to extract meaning.

29 Assessing Each Big Idea with DIBELS

Big Idea                 DIBELS Measure
Phonological Awareness   Initial Sounds Fluency (ISF), Phonemic Segmentation Fluency (PSF)
Alphabetic Principle     Nonsense Word Fluency (NWF)
Fluency and Accuracy     Oral Reading Fluency (ORF)
Vocabulary               Word Use Fluency (WUF)
Comprehension            Oral Reading Fluency (ORF) & Retell Fluency (RTF)

30 Why Focus on Fluency?
To gain meaning from text, students must read fluently.
- Proficient readers are so automatic with each component skill (phonological awareness, decoding, vocabulary) that they can focus their attention on constructing meaning from the print (Kuhn & Stahl, 2000).
- Component skills need to be well developed to support understanding.
- It is not enough to be simply accurate; the skill must be automatic.

31 Role of Automaticity or Fluency
- Role of Automaticity or Fluency: Video of Dr. Reid Lyon

32 Role of Automaticity or Fluency
- Role of Automaticity or Fluency: Video of Dr. Reid Lyon
- The focus of reading instruction is not only on getting students to know sounds or letters but to: get to the meaning.
- Building automaticity in the component skills is analogous to: learning to ride a bike.

33 First Grade Curriculum Map

34 A Qualitative Difference in Beginning Readers
In one minute, we can obtain a reliable indicator of early reading proficiency. The two students require substantially different instruction toward the goal of being lifelong readers.
Sample passage: "I've thrown a lot of rocks into the lake by our cabin. Sometimes I think I've thrown in enough to fill the whole lake. But it never seems to get full. As you can tell, I like to throw rocks. But throwing rocks is always a lot more fun with Grandpa. He can make anything…."

35 What Are the Skill Differences Between These Readers?
The "on-track" reader has a strategic approach to reading:
- Alphabetic Principle: decodes words she does not know.
- Fluency with connected text: reads words with accuracy and speed to enable comprehension.
- Other attributes: ____________________

36 What Are the Skill Differences Between These Readers?
The struggling reader does not have an effective strategy to gain access to the meaning of the passages:
- Alphabetic Principle: has an ineffective strategy for reading unknown words.
- Fluency with connected text: limited fluency deters comprehension.
- Other attributes: ____________________

37 Prevention Oriented: Relation Between ORF and Other Outcome Measures
- 88% of students who met the end-of-first-grade ORF goal went on to meet or exceed Oregon's State Benchmark Test in grade 3.
- Measure: OSA Reading/Literature, Spring, Grade 3

38 Objectives: What You Will Learn and Do
The objectives of today's session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class, and student levels.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.

39 How Do We Change Reading Outcomes?
1. Earlier rather than later: prevention oriented
2. Schools, not just programs
3. Results, not just improvement
4. Science, not just opinion

40 Results Focused: Evaluating Progress at Multiple Levels
Schoolwide DIBELS can answer:
1. How are we doing as a school?
2. How are we doing at each grade?
3. How is each class doing?
4. How are individual students doing?

41 How Are We Doing as a School?
How would you describe this school's end-of-year first graders? Circle one of the following:
a) All on track
b) Majority on track
c) Some on track
[Histogram: End of Year Oral Reading Fluency; end-of-year benchmark: 40 CWPM; categories: Low Risk, Some Risk, At Risk; 43% and 36% shown]

42 What Skills Did These First Graders Have at the End of Kindergarten?
- Almost half the kindergartners finished the year without strong skills in phonological awareness,
- making these students at risk for reading difficulties, a prediction that in this case came true.
[Histogram: End of Year Phoneme Segmentation Fluency; end-of-year benchmark: 35 correct phonemes; categories: Established, Emerging, Deficit; 60% and 16% shown]

43 DIBELS Tell Us if the Odds Are in Our Favor
Scatter plot: the relation between phonological awareness and oral reading fluency.
- The odds of being an Established reader on ORF in May of first grade when Established on PSF in May of kindergarten were 37 out of 44, or 87%.
- The odds of being an Established reader on ORF in May of first grade when Deficit on PSF in May of kindergarten were 1 out of 6, or 16%.
- Students in one section of the plot had established alphabetic principle skills at the middle of first grade and ended the year as readers.
- Students in another section had deficit alphabetic principle skills at the middle of first grade and ended the year as at-risk readers.

44 A Compass Is Only Helpful If We Know Our Destination (Outcomes)
- Each measure has a scientifically based goal.
- There are two parts to every goal: how much / how well, and by when.

Measure                         How Much?          By When?
Initial Sounds Fluency          25 or more         Middle of kindergarten
Phonemic Segmentation Fluency   35 or more         End of kindergarten
Nonsense Word Fluency           50 or more         Middle of first grade
Oral Reading Fluency            1st: 40 or more    End of first grade
                                2nd: 90 or more    End of second grade
                                3rd: 110 or more   End of third grade
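The two-part goals on this slide lend themselves to a simple lookup. The sketch below is illustrative only: the dictionary keys and period labels are my own shorthand, not official DIBELS terminology.

```python
# Benchmark goals from the slide above (keys/labels are illustrative shorthand).
BENCHMARKS = {
    "ISF":   (25, "middle of kindergarten"),
    "PSF":   (35, "end of kindergarten"),
    "NWF":   (50, "middle of first grade"),
    "ORF-1": (40, "end of first grade"),
    "ORF-2": (90, "end of second grade"),
    "ORF-3": (110, "end of third grade"),
}

def meets_goal(measure: str, score: int) -> bool:
    """True if the score meets or exceeds the 'how much' part of the goal."""
    how_much, _by_when = BENCHMARKS[measure]
    return score >= how_much
```

For example, a first grader reading 38 words per minute in the spring has not yet met the 40-word ORF goal: `meets_goal("ORF-1", 38)` returns `False`.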

45 Stepping Stones of Early Literacy: Video of Dr. Roland Good

46 When to Administer DIBELS
- Monitoring student skill development

47 Allocating Resources More Efficiently
- Early identification of students most in need of additional instructional support
- Mid-Year Kindergarten Class List

48 How to Use DIBELS in Your School: Schoolwide Administration
Designed to collect data efficiently at the school level:
- Short duration: 1-minute administration
- Repeatable, with 20 alternate forms
- Reproducible and convenient to use
- Fluency based

49 Training: Standardized Method of Administration
- For scores to be useful, we must administer the measures according to standardized administration and scoring directions.
- Presenting each measure:
  - Present the directions as written
  - Use the specific materials
- Timing each measure:
  - Use a stopwatch
- Scoring each measure:
  - Follow the scoring rules for each measure
  - Score immediately after completing
- Standardization provides each child an equal opportunity to display skills.
  - Engage the student to do his or her best

50 Separating Teaching and Testing Time
- Scores will be used to assist in making instructional decisions.
- Therefore, we must administer the measures without:
  - Assisting the student during the task
  - Modifying the task, materials, or time
Standardized, reliable data collection and scoring are essential!

51 Objectives: What You Will Learn and Do
The objectives of today's session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class, and student levels.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.

52 Learn the Measures
Three things to consider for each measure:
- What essential skill does it assess?
- What is the appropriate time and grade?
- What is the goal (how much, by when)?

53 Phonemic Segmentation Fluency (PSF)
- What important skill does it assess? Phonological awareness:
  - The ability to hear and manipulate sounds in words at the phoneme level
- What is the appropriate time and grade?
  - Mid-year kindergarten through first grade
- What is the goal?
  - How well? 35 phonemes or more
  - By when? End of kindergarten

54 What PSF Looks Like
As you view the video, attend to:
- The child: characterize task performance (circle one):
  - Complete segmentation with fluency
  - Partial segmentation with fluency
  - Partial segmentation with no fluency
  - Some segmentation with errors
- The examiner:
  - Comfortable with materials
  - Comfortable with student
  - Comfortable with administration

55 What PSF Looks Like

56 How Do We Administer and Score the PSF Measure?
Materials:
1. Examiner copy of the word list with phoneme scoring columns (the student has no materials when assessing phonological awareness)
2. Stopwatch
3. Pencil
Preparing the student:
1. Good testing conditions (e.g., lighting, quiet, comfortable)
2. Provide the model in a standardized manner and follow correction procedures as necessary

57 How Do We Administer and Score the PSF Measure?
1. Place the segmentation word list in front of you, but shield it so the student cannot see what you record.
2. Say these specific directions to the student: "I am going to say a word. After I say it, you tell me all the sounds in the word. So, if I say 'Sam,' you say /s/ /a/ /m/. Let's try one. (One-second pause.) Tell me the sounds in 'mop.'" Then: "OK. Here is your first word."

58 Maximizing Administration Time
- Stopwatch: Present the first word, start the stopwatch, and time for 1 minute.
- Scoring:
  - Underline each different, correct sound segment produced (see specific scoring rules and examples).
  - Put a slash (/) through sounds produced incorrectly.
- Maintaining momentum:
  - As soon as the student is finished saying the sounds, present the next word.
  - Allow the student 3 seconds for each sound segment.
- Discontinue: If the student has not given any correct sound segments in the first 5 words, discontinue the task and record a score of zero (0).
- Ending testing: At the end of 1 minute, stop timing and calculate the number of correct phonemes per minute.

59 Scoring Rules for PSF
Correct segmentation:
- A correct sound segment is any different, correct part of the word. For example, the sound /t/ is a correct segment of "trick," as are /tr/ and /tri/ (see rule 2, following page).

Word    Student Says       Scoring Procedure   Correct Segments
trick   "t...r...i...k"    /t/ /r/ /i/ /k/     4/4
cat     "k...a...t"        /k/ /a/ /t/         3/3

60 Elongating Sounds
Correct segmentation:
- There is no need for an audible pause between the sounds to receive credit.
- If you can hear each individual sound when the student runs them together, score each sound as correct.
- Use your professional judgment based on the response and your knowledge of your program. If still not sure, do not give credit.

Word    Student Says         Scoring Procedure   Correct Segments
rest    "rrrreeeessssttt"    /r/ /e/ /s/ /t/     4/4

61 Errors in Segmenting: No Segmentation
- If the student repeats the entire word, no credit is given for any correct parts.
- Circle the word to indicate that no segmented response was given.

Word    Student Says    Scoring Procedure   Correct Segments
trick   "trick"         /t/ /r/ /i/ /k/     0/4
cat     "cat"           /k/ /a/ /t/         0/3

62 Errors in Segmenting: Incomplete Segmentation
- The student is given partial credit for each sound segment produced correctly, even if the student has not segmented at the phoneme level.
- The underline indicates the size of the sound segment.

Word    Student Says    Scoring Procedure   Correct Segments
trick   "tr...ik"       /t/ /r/ /i/ /k/     2/4
cat     "c...at"        /k/ /a/ /t/         2/3

63 Errors in Segmenting: Overlapping Sounds
- The student receives credit for each different, correct sound segment of the word.
- Underline the different sound segments produced.

Word    Student Says    Scoring Procedure   Correct Segments
trick   "tri...ick"     /t/ /r/ /i/ /k/     2/4
cat     "c...cat"       /k/ /a/ /t/         1/3

64 Errors in Segmenting: Omission of Sounds
- The student does not receive credit for sound segments that are not produced.
- If the student provides the initial sound only, be sure to wait 3 seconds for elaboration.

Word    Student Says       Scoring Procedure   Correct Segments
trick   "t...ik"           /t/ /r/ /i/ /k/     2/4
cat     "c" (3 seconds)    /k/ /a/ /t/         1/3

65 Errors in Segmenting: Mispronunciation of Sounds
- The student does not receive credit for sound segments that are mispronounced.
- Put a slash (/) through the incorrect sounds.
- For example, there is no /ks/ sound in the word "trick."

Word    Student Says       Scoring Procedure   Correct Segments
trick   "t...r...i...ks"   /t/ /r/ /i/ /k/     3/4
cat     "b...a...t"        /k/ /a/ /t/         2/3

66 Student Characteristics
Pronunciation and dialect:
- The student is not penalized for imperfect pronunciation due to dialect or articulation.
- For example, if the student says /r/ /e/ /th/ /t/ for "rest" because of articulation difficulties, give full credit.
- Use professional judgment and prior knowledge of the student's speech pattern to assess skill performance.

67 Student Characteristics
Schwa sounds:
- Schwa sounds (/u/) added to consonants are not counted as errors.

Word    Student Says         Scoring Procedure   Correct Segments
trick   "tu...ru...i...ku"   /t/ /r/ /i/ /k/     4/4
cat     "ku...a...tu"        /k/ /a/ /t/         3/3
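The segment-credit rules on the preceding slides (credit for each different, correct segment; no credit for repeating the whole word or for mispronounced sounds) can be sketched in code. This is an unofficial illustration only, not the DIBELS scoring procedure itself: phonemes are modeled as strings, and the schwa-stripping and 3-second-wait rules are not modeled.

```python
def score_psf_word(target, segments):
    """Sketch of PSF segment credit (unofficial illustration).

    target:   the word's phonemes, e.g. ["t", "r", "i", "k"]
    segments: sound segments the student produced, each a tuple of
              phonemes, e.g. [("t",), ("r",), ("i",), ("k",)]
    Returns (credited segments, total phonemes in the word).
    """
    whole = tuple(target)
    n = len(target)
    seen = set()
    score = 0
    for seg in segments:
        if seg == whole:      # repeating the entire word: no credit
            continue
        if seg in seen:       # only *different* segments earn credit
            continue
        # a correct segment is any contiguous run of the word's phonemes
        if any(tuple(target[i:i + len(seg)]) == seg
               for i in range(n - len(seg) + 1)):
            seen.add(seg)
            score += 1
    return min(score, n), n
```

This reproduces the slides' worked examples: complete segmentation "t...r...i...k" scores 4/4, incomplete "tr...ik" scores 2/4, whole-word "trick" scores 0/4, and the mispronounced "t...r...i...ks" scores 3/4.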

68 Let's Try Again
Total: 35

69 Analyzing the Observation for Instructional Implications
Current skills:
- Emerging phonological awareness at the phoneme level
- Strong on initial and final consonants and medial vowels
- Inconsistent with the task
Instructional needs:
- Integrate with alphabetic principle instruction
- More practice to build automaticity

70 Tips for Scoring
- Score what you hear!
- Practice with at least 7 students before using the scores to make programming decisions.
- One sound won't make a major difference in skill assessment, but pondering for 5 seconds over whether to score 2 or 3 phonemes on a response will.
- Look over the words you are presenting to increase the pacing.
- Practice the phonemes in the booklet to increase reliability and consistency in scoring.

71 Breakout Activity: Practicing the Measure
Locate the "Phonemic Segmentation Fluency Breakout Activity."
1. Form a 3-person group
2. Assign roles: examiner, student, observer
3. Practice administering the measure (3 rounds)

72 Initial Sounds Fluency (ISF)
- What important skill does it assess? Phonological awareness:
  - The ability to hear and manipulate sounds in words
- What is the appropriate time and grade?
  - Beginning of the year, kindergarten
- What is the goal?
  - How well? 25 or more
  - By when? Middle of kindergarten

73 What ISF Looks Like
As you view the video, attend to:
- The child: characterize task performance (circle one):
  - Sound isolation with fluency
  - Sound isolation with limited fluency
  - Sound recognition with limited fluency
  - Some sound recognition with errors
- The examiner:
  - Comfortable with materials
  - Comfortable with student
  - Comfortable with administration

74 What ISF Looks Like

75 How Do We Administer and Score the ISF Measure?
Materials:
1. Examiner probe
2. Student picture pages
3. Stopwatch
4. Pencil
Preparing the student:
- Good testing conditions (e.g., lighting, quiet, comfortable)
- Provide the model in a standardized manner and follow correction procedures as necessary

76 How Do We Administer and Score the ISF Measure?
1. Place the student copy of 4 randomized pictures in front of the child.
2. Say these specific directions to the child: "This is mouse, flowers, pillow, letters (point to each picture while saying its name). Mouse (point to mouse) begins with the sound /m/. Listen, /m/, mouse. Which one begins with the sounds /fl/?"

77 How Do We Administer and Score the ISF Measure?
- Correct response on the sample item: The student points to flowers; you say: "Good. Flowers begins with the sounds /fl/."
- Incorrect response: "Flowers (point to flowers) begins with the sounds /fl/. Listen, /fl/, flowers. Let's try it again. Which one begins with the sounds /fl/?"

78 How Do We Administer and Score the ISF Measure?
- "Pillow (point to pillow) begins with the sound /p/. Listen, /p/, pillow. What sound does letters (point to letters) begin with?"
- Correct response: If the student says /l/, you say: "Good. Letters begins with the sound /l/."
- Incorrect response: If the student gives any other response, you say: "Letters (point to letters) begins with the sound /l/. Listen, /l/, letters. Let's try it again. What sound does letters (point to letters) begin with?"
- Then you say: "Here are some more pictures. Listen carefully to the questions."

79 Maximizing Administration Time
- Stopwatch:
  - Read the question, then start the stopwatch. After the child gives a response, stop the stopwatch. Record the total time taken to answer each of the 16 questions.
  - When the examiner is talking, the watch is not running.
- Scoring: Each item is scored as correct or incorrect (see specific scoring rules and examples).
- Maintaining momentum:
  - Make sure to introduce each picture page.
  - Allow the student 5 seconds to answer each question.
- Discontinue: If the student gets no items correct in the first 5 items, discontinue the task and record a score of zero (0).
- Ending testing:
  - After administering all 16 items, record the total thinking/response time shown on your stopwatch.
  - Count the number of items correct.
  - Calculate the final score (see formula).
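The slide defers the final-score calculation to a formula it does not show. In the published DIBELS materials, the ISF score converts the raw results to a per-minute rate: 60 × (number correct) ÷ (total response seconds). A minimal sketch, assuming that standard formula:

```python
def isf_final_score(num_correct, total_seconds):
    """Convert raw ISF results to a per-minute score.

    Assumes the standard DIBELS conversion: 60 * correct / response seconds.
    total_seconds is the accumulated thinking/response time on the stopwatch
    (the watch is stopped while the examiner is talking).
    """
    if total_seconds <= 0:
        raise ValueError("total response time must be positive")
    return round(60 * num_correct / total_seconds)
```

For example, a child who answers 12 of the 16 items correctly in 30 seconds of accumulated response time scores 60 × 12 ÷ 30 = 24 initial sounds per minute, just under the 25-per-minute mid-kindergarten goal.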

80 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 80 Scoring Rules for ISF  Identification Responses (“Which picture begins with…?”)  If the child points to the correct picture or names it, score as correct.  If the child names or renames the picture with a word that begins with the target sound, score as correct.

82 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 82 Scoring Rules for ISF  Production Responses (“What sound does …. begin with?”)  Correct Initial Sound or Sounds: If the word starts with an initial consonant sound, the child can respond with the first consonant or consonant-consonant blend. For example, if the word is “clock,” a correct initial sound would be /c/ or /cl/. The student must give the sound, not the letter name.

83 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 83 Let’s Try Again

84 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 84 Analyzing the Observation for Instructional Implications  Current Skills  Emerging phonological awareness at the initial sound level.  Inconsistent production for initial sounds.  Very accurate on identification of sounds.  Instructional Needs  Develop overall phonological awareness at the phoneme level.  Integrate skills in phonological awareness with alphabetic principle.

85 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 85 Tips for Scoring  Make sure to introduce each picture page.  Score what you hear!  Practice with at least 7 students before using the scores to make programming decisions.  Practice with stopwatch.  Time how long it takes student to answer question.  Make sure to record the total time at the end.  Look over the words and pictures you are presenting to increase pacing.

86 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 86 Quick Review  PSF and ISF assess what big idea?  Phonological awareness: Ability to hear and manipulate sounds in words.  When do we want students to have completely established skills in phonological awareness at the phoneme level?  End of kindergarten (a score of 35 or more on the PSF measure)  Why? PA is not enough to make a reader… but it is predictive.  (see next pages for kindergarten curriculum maps)

87 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 87 Quick Review

88 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 88 Moving From Sound to Print: Mapping Phonemes to the Print

89 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 89 Relation of PA to the Alphabetic Principle  The odds of having established alphabetic principle skills in time, given that the student had established PA skills at the end of kindergarten, were 29 of 38, or 76%.  The odds of having established alphabetic principle skills in time, given that the student had limited PA skills at the end of kindergarten, were 0 of 2, or 0%. Phonological awareness does not guarantee proficiency on the alphabetic principle, but the skills are highly linked.

90 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 90  What is the Alphabetic Principle? The ability to associate sounds with letters and use these sounds to read words.  Comprised of two parts:  Alphabetic Understanding: Letter-sound correspondences.  Phonological Recoding: Using systematic relationships between letters and phonemes (letter-sound correspondence) to retrieve the pronunciation of an unknown “printed string” or to spell.  (see next page for first grade curriculum map) Role of Alphabetic Principle: Mapping the Phonemes to Print

91 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 91 Role of Alphabetic Principle: Mapping the Phonemes to Print

92 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 92 Role of Alphabetic Principle  Role of Alphabetic Principle: Video of Dr. Louisa Moats

93 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 93 Role of Alphabetic Principle  Role of Alphabetic Principle: Video of Louisa Moats  If students can decode nonsense words then students understand:  Words are made up of sounds  Sound-symbol correspondence  Structure of words  People who are proficient at reading nonsense words are better at: Reading for meaning

94 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 94 Nonsense Word Fluency (NWF):  What important skill does NWF assess?  Alphabetic Principle: The ability to associate sounds with letters and use these sounds to read words.  What is the appropriate time and grade?  Middle of the year in kindergarten and throughout first grade  What is the goal?  First Grade:  How well? 50 letter-sounds or more  By when? Middle of first grade  Kindergarten:  How well? 25 letter-sounds or more by end of kindergarten

95 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 95 What NWF Looks Like  As you view the video, attend to:  The child:  Characterize task performance (circle one):  Reads at the word level with Fluency  Reads at the word level with Limited Fluency  Reads at the sound level with Fluency  Reads at the sound level with Limited Fluency  The examiner:  Comfortable with materials  Comfortable with student  Comfortable with administration

96 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 96 What NWF Looks Like

97 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 97 How Do We Administer and Score the NWF Measure?  Materials: 1.Examiner probe 2.Student pages (practice page “sim lut” and test page) 3.Stopwatch 4.Pencil  Preparing the student:  Good testing conditions (e.g., lighting, quiet, comfortable)  Provide the model in standardized manner and follow correction procedures as necessary

98 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 98 How Do We Administer and Score the NWF Measure? Say these specific directions to the child: “Look at this word (point to the first word on the practice probe). It’s a make-believe word. Watch me read the word: (point to the letter “s”) /s/, (point to the letter “i”) /i/, (point to the letter “m”) /m/ “sim” (run your finger fast through the whole word). I can say the sounds of the letters, /s/ /i/ /m/ (point to each letter), or I can read the whole word “sim” (run your finger fast through the whole word). “Your turn to read a make-believe word. Read this word the best you can (point to the word “lut”). Make sure you say any sounds you know.”

99 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 99 How Do We Administer and Score the NWF Measure?

100 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 100 Place the student copy of the probe in front of the child. Here are some more make- believe words (point to the student probe). Start here (point to the first word) and go across the page (point across the page). When I say “begin,” read the words the best you can. Point to each letter and tell me the sound or read the whole word. Read the words the best you can. Put your finger on the first word. Ready, begin. How Do We Administer and Score the NWF Measure? Student Copy

101 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 101 Maximizing Administration Time  Stopwatch:  Start watch after student says the first word/sound and time for 1 minute.  Scoring:  Underline each correct letter sound produced (see specific scoring rules and examples).  Slash each incorrect letter sound produced.  Maintaining momentum:  Allow the student 3 seconds for each letter sound. After 3 seconds, provide the sound to keep the student moving.  Discontinue:  If a student does not get any correct in the first row, discontinue the task and record a score of zero (0).  Ending testing:  At the end of 1 minute, put a bracket after the last letter-sound/word produced and calculate the total letter-sounds correct in one minute.

102 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 102 Scoring Rules for NWF 1.Correct Letter Sounds A correct letter sound is scored as the most common sound in English. – For example, all the vowels are scored for the short sound and the most common sound for the letter “c” is /k/. See pronunciation guide for remaining letter sounds. 2.Marking the booklet Underline exactly the way the student completes the task.  For example, if the student goes sound-by-sound, underline each letter individually. If the student reads the target as a whole word, underline the entire word.

103 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 103 Scoring Rules for NWF 3.Partially Correct Responses If a word is partially correct, underline the letter sounds produced correctly. Put a slash (/) through the letter if the letter sound is incorrect.  For example, if stimulus word is "sim" and student says "sam," the letters "s" and "m" would be underlined because those letter sounds were produced correctly, giving a score of 2. 4.Repeated sounds Letter sounds pronounced twice while sounding out the word are given credit only once.  For example, if stimulus word is "sim" and the student says /s/ /i/ /im/, the letter "i" is underlined once and the student receives 1 point for the phoneme "i" even though the letter "i" was pronounced correctly twice (a total of 3 for the entire word).

104 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 104 Scoring Rules for NWF 5.3-second rule - sound by sound If student hesitates for 3 seconds on a letter, score the letter sound incorrect, provide the correct letter sound, point to the next letter, and say, "What sound?"  This prompt may be repeated. For example, if the stimulus word is "tob" and the student says /t/ (3 seconds), prompt by saying, "/o/ (point to b) What sound?" 6.3-second rule - word by word If student hesitates for 3 seconds on a word, score the word incorrect, provide the correct word, point to the next word, and say, "What word?"  This prompt may be repeated. For example, if the stimulus words are "tob dos et" and the student says, "tob" (3 seconds), prompt by saying "dos (point to et) What word?"

105 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 105 Scoring Rules for NWF 7.Insertions Insertions are not scored as incorrect. For example, if the stimulus word is "sim" and the student says "stim," the letters "s" "i" and "m" would be underlined and full credit given for the word, with no penalty for the insertion of /t/. 8.Skipping Rows If student skips an entire row, draw a line through the row and do not count the row in scoring. 9.Self-corrections If student makes an error and then self- corrects within 3 seconds, write "SC" above the letter and count it as correct.
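The partial-credit, repeated-sound, and insertion rules above all come down to one idea: count how many of the target word's letter-sounds the student produced, in order, ignoring extras. A minimal sketch, assuming each letter-sound is represented as one string (`score_word` is a hypothetical helper, not part of any DIBELS tooling; real scoring is done live by the examiner with underlines and slashes):

```python
from difflib import SequenceMatcher

def score_word(target_sounds, produced_sounds):
    """One point per target letter-sound produced correctly, in order.

    Aligning the two sequences reproduces the slide rules: a substitution
    loses only the wrong sound ("sam" for "sim" scores 2), an insertion is
    ignored ("stim" for "sim" scores 3), and a sound repeated while blending
    is credited only once (/s/ /i/ /i/ /m/ for "sim" scores 3).
    """
    matcher = SequenceMatcher(None, target_sounds, produced_sounds,
                              autojunk=False)
    return sum(size for _, _, size in matcher.get_matching_blocks())
```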

106 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 106 Let’s Try Again: Practice Scoring (worked example on the probe: row scores of 12, 10, and 7; total = 29)

107 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 107 Analyzing the Observation for Instructional Implications  Current Skills  Approaches at the whole word level initially  Few letter-sound errors  Can blend sounds together to the word level  Instructional Needs  Increase automaticity for all letter-sounds  Increase automaticity in phonological recoding (“fof” instead of /f/ /o/ /f/)

108 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 108 Breakout Activity  Locate the “Nonsense Word Fluency Breakout Activity”  Form a 3-person group  Assign roles:  Examiner  Student  Observer  Practice administering measure (3 rounds)

109 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 109 Tips for Scoring  Score for the most common sounds of the letters.  Short vowels: i (big), e (beg), a (bag), u (bug), o (bog)  “Hard” sounds: “c” = /k/, “g” = /g/, “j” = /j/  A point for each letter, whether it is sound-by-sound or read as a whole word.  Score what you hear!  Underline exactly the way the student completes the task.  Practice with at least 7 students before using the scores to make programming decisions.  Look over words you are presenting to increase pacing.

110 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 110 Letter Naming Fluency (LNF):  What important skill does LNF assess?  LNF is not directly linked to a Big Idea; it is used as a risk indicator  What is the appropriate time and grade?  Through kindergarten and fall of first grade  What is the goal?  While letter naming is a good predictor of early reading success, knowledge of letter sounds is more important to word reading.  Research indicates a score of 8 or below at the beginning of kindergarten is predictive of later reading difficulty.

111 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 111 What LNF Looks Like

112 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 112 How Do We Administer and Score the LNF Measure?  Materials: 1.Examiner probe 2.Student page 3.Stopwatch 4.Pencil  Preparing the student:  Good testing conditions (e.g., lighting, quiet, comfortable)  Provide the model in standardized manner and follow correction procedures as necessary

113 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 113 How Do We Administer and Score the LNF Measure? Say these specific directions to the child: "Here are some letters" (point). "Tell me the names of as many letters as you can. When I say 'begin,' start here" (point to first letter in upper left hand corner) "and go across the page" (point). "Point to each letter and tell me the name of that letter. Try to name each letter. If you come to a letter you don't know, I'll tell it to you. Put your finger on the first letter. Ready?"

114 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 114 Maximizing Administration Time  Stopwatch:  Start watch after student says the first letter name and time for 1 minute.  Scoring:  Slash each incorrect letter name produced.  Maintaining momentum:  Allow student 3 seconds for each letter name; after 3 seconds, say the name to keep the student moving.  Discontinue:  If student does not get any correct in the first row, discontinue the task and record a score of zero (0).  Ending testing:  At the end of 1 minute, put a bracket after the last letter-name produced and calculate the total letter-names correct in 1 minute.

115 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 115 Scoring Rules for LNF 1.Correct Letter Names Student must say the correct letter name to receive credit. – If the student provides the letter sound rather than the letter name, say, "Remember to tell me the letter name, not the sound it makes." This prompt may be provided only once. 2.Self-corrections If student makes an error and self- corrects within 3 seconds, write "SC" above the letter and do not count as an error. 3.Skipping Rows If student skips an entire row, draw a line through the row and do not count the row when scoring.  Skipped or omitted letters are not counted in scoring.

116 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 116 Tips for Scoring  Score for the letter names.  If student skips a row, follow the student’s lead and keep going.  Give the student 3 seconds for each letter.  Score what you hear!  Practice with at least 7 students before using the scores to make programming decisions.

117 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 117 Oral Reading Fluency (ORF):  What important skill does it assess?  Fluency and accuracy with connected text: The effortless, automatic ability to read words in connected text leads to understanding.  What is the appropriate time and grade?  Middle of first grade through third grade  What is the goal?  To be fluent at the skill by end of first grade.  How well? 40 correct words or more  By when? End of first grade  What about second grade?  How well? 90 correct words or more  What about third grade?  How well? 110 correct words or more

118 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 118 ORF Benchmark Levels

119 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 119 Instructional Priorities

120 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 120 Importance of Fluency with Connected Text  The ability to accurately and quickly apply word reading strategies to reading connected text. Automatic and fluent reading allows students to allocate cognitive resources to comprehension.  “Fluency may be almost a necessary condition for good comprehension and enjoyable reading experiences.” (Nathan & Stanovich, 1991)  Oral reading fluency will not tell you everything you need to know about student reading performance. However, there is a strong relationship between oral reading fluency and comprehension.

121 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 121 Role of Automaticity or Fluency  Role of Automaticity or Fluency: Video of Louisa Moats

122 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 122 Role of Automaticity or Fluency  Role of Automaticity or Fluency: Video of Louisa Moats  Why do nonfluent readers “get worn out” after reading for a period of time?  Too much attention is devoted to figuring out words.  It takes too long to get to the end of the passage, and the student can’t remember the beginning.  They lose the sense of the passage as they struggle, pause, and make word-reading errors.

123 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 123 Fluent Readers Display Orchestrated Reading Skills  Fluent readers are able to:  Focus their attention on understanding the text  Synchronize skills of decoding, vocabulary, and comprehension  Read with speed and accuracy  Interpret text and make connections between the ideas in the text  Nonfluent readers:  Focus attention on decoding  Allot attention to accessing the meaning of individual words  Make frequent word reading errors  Have few cognitive resources left to comprehend

124 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 124 Frustration: Reading With Poor Word Recognition Reading with 80% Accuracy Impact on Comprehension? Impact on Fluency?

125 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 125 What ORF Looks Like

126 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 126 How Do We Administer and Score the ORF Measure?  Materials: 1.Examiner probe 2.Student passages 3.Stopwatch 4.Pencil  Preparing the student:  Good testing conditions (e.g., lighting, quiet, comfortable) Say these specific directions to the child: “Please read this (point) out loud. If you get stuck, I will tell you the word so you can keep reading. When I say "stop," I may ask you to tell me about what you read, so do your best reading. Start here (point to the first word of the passage). Begin.”


128 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 128 Maximizing Administration Time  Stopwatch:  Start watch after student says the first word and time for 1 minute.  Scoring:  Slash each word produced incorrectly.  Maintaining momentum:  Allow student 3 seconds for each word. After 3 seconds, say the word to keep the student moving.  Discontinue:  If student does not get any correct in the first row, discontinue the task and record a score of zero (0).  If student scores less than 10 on the first passage, do not administer the other two passages.  Ending testing:  At the end of 1 minute, put a bracket after the last word produced and calculate the number of correct words in one minute.

129 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 129 Scoring Rules for ORF: Scoring Directions are Similar to Marston, D. (1989) 1.Correctly Read Words A word must be pronounced correctly given the context of the sentence.  Example: The word “read” must be pronounced /reed/ when presented in the context of the following sentence: Ben will read the story. WRC = 5 not as: “Ben will red the story.” WRC = 4 2.Self-corrected Words are counted as correct. Words misread initially but corrected within 3 seconds are counted as correct.  Example: Dad likes to watch sports. WRC = 5 read as: “Dad likes to watch spin...(3 seconds)…sports.” WRC = 5

130 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 130 Scoring Rules for ORF 3.Repeated Words are counted as correct. Words said over again correctly are ignored.  Example: I have a goldfish. WRC = 4 read as: “I have a...have a goldfish.” WRC = 4 4.Dialectic variations in pronunciation that are explainable by local language norms are not errors.  Example: We took the short cut. WRC = 5 read as: “We took the shot cut.” WRC = 5

131 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 131 Scoring Rules for ORF 5.Inserted Words are ignored. When students add extra words, they are not counted as correct words nor as reading errors.  Example: I ate too much. WRC = 4 read as: “I ate way too much.” WRC = 4 6.Mispronounced or Substituted Words are counted as incorrect.  Example: She lives in a pretty house. WRC = 6 read as: “She lives in a pretty home.” WRC = 5

132 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 132 Scoring Rules for ORF 7.Omitted/Skipped Words are counted as errors.  Example: Mario climbed the old oak tree. WRC = 6 read as: “Mario climbed the tree.” WRC = 4

133 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 133 Scoring Rules for ORF Words must be read in accordance with the context of the passage 8.Hyphenated Words count as two words if both parts can stand alone as individual words. Hyphenated words count as one word if either part cannot stand alone as an individual word. 9.Numerals and Dates must be read correctly in the context of the sentence. 10.Abbreviations must be read as pronounced in normal conversation. For example, “TV” could be read as "teevee" or "television," but “Mr.” must be read as "mister."

134 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 134 Breakout Activity  Locate the “Oral Reading Fluency Breakout Activity”  Form a 3-person group  Assign roles:  Examiner  Student  Observer  Practice administering measure (3 rounds)

135 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 135 Tips for Scoring  Student must read exactly what is on the page.  Self-corrections and insertions are ignored and not counted as errors.  Simply slash errors until you feel comfortable writing in the error types.  Score what you hear!  Practice with at least 7 students before using the scores to make programming decisions.  Look over passages you are presenting to ensure pacing is efficient.  Use the middle score of the three passages read to assess the student’s skill.  Have the student read all three passages in one sitting.
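The tip above about using the middle score of the three passages can be sketched as follows (`orf_benchmark_score` is a hypothetical name; the middle of three values is simply their median):

```python
from statistics import median

def orf_benchmark_score(wrc_scores):
    """Return the middle (median) words-read-correctly score of the three
    benchmark passages, the value used to assess the student's skill."""
    if len(wrc_scores) != 3:
        raise ValueError("benchmark ORF administration uses three passages")
    return median(wrc_scores)

# Example: passages scored 52, 40, and 61 WRC -> 52
```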

136 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 136 Kindergarten Benchmark Assessment

137 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 137 Grade 1 Benchmark Assessment

138 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 138 Objectives: What You Will Learn and Do The objectives of today’s session are to: 1.Differentiate purposes of assessment. 2.Delineate how the DIBELS assessment system differs from traditional assessment systems. 3.Use DIBELS to evaluate outcomes at the school, grade, class and student level. 4.Administer and score DIBELS. 5.Interpret DIBELS results. 6.Develop a plan to use DIBELS quarterly with all students. 7.Evaluate the current assessment system in your school.

139 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 139 Student Performance: Are We Making Progress? 28% Low risk for reading difficulties 34% Some risk for reading difficulties 38% At risk for reading difficulties End of Year Histogram - ORF, Year 1

140 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 140 Student Performance: Are We Making Progress? 57% Low risk for reading difficulties 20% Some risk for reading difficulties 22% At risk for reading difficulties End of Year Histogram - ORF, Year 2 After changes in curricular program, instruction, time, professional development:

141 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 141 Student Performance: Are We Making Progress? After 4 years of sustained focused effort: Cross-Year Boxplot

142 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 142 Class List Reports: Identifying At-Risk Students in the Middle of First Grade

143 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 143 Instructional Status Terminology for Each Measure

Final Benchmark Goals    Later Quarterly Benchmark Goals
Deficit                  At Risk
Emerging                 Some Risk
Established              Low Risk

144 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 144 Critical Values & Progressive Benchmarks

145 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 145 Critical Values & Progressive Benchmarks

146 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 146 Critical Values & Progressive Benchmarks

147 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 147 Quick Review  What are the two measures used to assess phonological awareness? ISF & PSF  What is the only measure not administered for a full 60 seconds? ISF  Which measure do we use as a risk indicator for reading difficulty, but is not directly linked to a big idea of early literacy? LNF  This measure has students read made-up words to assess phonetic analysis skills and avoid the chance the student has the word memorized. NWF  Which measure has the strongest linkage to reading comprehension without a direct assessment of it? ORF

148 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 148 Benchmarks and Levels of Low Risk for Each DIBELS Measure ISF PSF NWF ORF

149 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 149 Objectives: What You Will Learn and Do The objectives of today’s session are to: 1.Differentiate purposes of assessment. 2.Delineate how the DIBELS assessment system differs from traditional assessment systems. 3.Use DIBELS to evaluate outcomes at the school, grade, class and student level. 4.Administer and score DIBELS. 5.Interpret DIBELS results. 6.Develop a plan to use DIBELS quarterly with all students. 7.Evaluate the current assessment system in your school.

150 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 150 Developing a Plan To Collect Schoolwide Data Areas to consider when developing a plan: 1.Who will collect the data? 2.How long will it take? 3.How do we want to collect the data? 4.What materials does the school need? 5.How do I use the DIBELS Website? 6.How will the results be shared with the school? More details are available in the document entitled “Approaches and Considerations of Collecting Schoolwide Early Literacy and Reading Performance Data” in your supplemental materials.

151 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 151 Who Will Collect the Data?  At the school level, determine who will assist in collecting the data  Each school is unique in terms of the resources available for this purpose, but consider the following:  Teachers, principals, educational assistants, Title 1 staff, special education staff, parent volunteers, practicum students, PE/music specialist teachers  The role of teachers in data collection:  If they collect all the data, less time is spent teaching  If they collect no data, the results have little meaning

152 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 152 How Do We Want to Collect Data?  Common Approaches to Data Collection:  Team Approach  Class Approach  Combination of the Class and Team  Determining who will collect the data will impact the approach to the collection

153 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 153 Team Approach  Who? A core group of people will collect all the data  Over one or multiple days (e.g., afternoons)  Where Does it Take Place?  Team goes to the classroom  Classrooms go to the team (e.g., cafeteria, library)  Pros: Efficient way to collect and distribute results, limited instructional disruption  Cons: Need a team of people, place, materials, limited teacher involvement, scheduling of classrooms

154 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 154 Class Approach  Who? Teachers collect the data  Where Does it Take Place?  The classroom  Pros: Teachers receive immediate feedback on student performance  Cons: Data collection will occur over multiple days, time taken away from instruction, organization of materials

155 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 155 Combination of Team & Class Approaches  Who? Both teachers and a team  Where Does it Take Place?  Teachers collect the data  Team goes to the classroom  What Might it Look Like?  Kindergarten and first grade teachers collect their own data and a team collects 2nd-3rd grade  Pros: Increases teacher participation, data can be collected in a few days, limited instructional disruption  Cons: Need a team of people, place, materials, scheduling

156 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 156 How Long Will It Take? Kindergarten

Time of Year / Measure(s)     Time per Pupil   Pupils Assessed per 30-Minute Period (by number of data collectors)
Beginning: ISF & LNF          4 min.           1: 6-8    2: 12-16   3: 18-24    4-5: 24-40   6-8: 36-48
Middle: ISF, LNF, PSF         6-7 min.         1: 4-5    2: 8-10    4-5: 16-25  6-8: 24-40
End: ISF, LNF, PSF, & NWF     9 min.           1: 3-4    2: 6-8     4-5: 12-20  6-8: 18-32

157 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 157 How Long Will It Take? First Grade

Time of Year / Measure(s)     Time per Pupil   Pupils Assessed per 30-Minute Period (by number of data collectors)
Beginning: LNF, PSF, & NWF    6-7 min.         1: 4-5    2: 8-10    4-5: 16-25  6-8: 24-40
Middle: PSF, NWF, & ORF       8-9 min.         1: 3-4    2: 6-8     4-5: 12-20  6-8: 18-32
End of Year: NWF & ORF        7 min.           1: 4-5    2: 8-10    3: 12-15    4-5: 16-25   6-8: 24-40

158 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 158 How Long Will it Take? Second & Third Grade

Measure   Time per Pupil   Pupils Assessed per 30-Minute Period (by number of data collectors)
ORF       5 min.           1: 6-7    2: 12-14   3: 18-21   4-5: 24-35   6-8: 36-56
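The "pupils assessed per 30-minute period" figures in the tables above follow from simple arithmetic. A rough planning sketch, assuming each collector works the full period with no transition time between students (real sessions will run somewhat slower, which is why the tables give ranges):

```python
# Lower-bound estimate of how many pupils a data-collection session can cover:
# pupils per collector = period length // minutes per pupil, times collectors.

def pupils_per_period(minutes_per_pupil: float, collectors: int,
                      period_minutes: int = 30) -> int:
    """Estimate pupils assessed in one period across all collectors."""
    return int(period_minutes // minutes_per_pupil) * collectors

# Example: ORF at 5 min./pupil with 2 collectors -> 12 (table row: 12-14)
```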

159 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 159 What Materials Does the School Need?  DIBELS Materials  Benchmark booklets  Color coding  Labeling  Student stimulus materials  Binding, laminating, etc.  Other Materials  Stopwatches  Pencils, clipboards  Class rosters See document entitled “Approaches and Considerations of Collecting Schoolwide Early Literacy and Reading Performance Data” at website: http://dibels.uoregon.edu/logistics/data_collection.pdf

160 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 160 How Do I Use the DIBELS Website?  Entering and generating reports using the DIBELS website begins with setting up your school.  Sign up to get a user name and password at: http://dibels.uoregon.edu  Create your school in the system (a manual for using the website is available on the website as well as in your supplemental materials)

161 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 161 Using the DIBELS Website Creating your school in DIBELS web: 1.Creating classrooms 2.Populating classrooms with students 3.Creating users

162 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 162 Entering Data on DIBELS Website After your school has created the classrooms with students, you can enter the data you collected by selecting the classroom

163 Good, Harn, Kame'enui, Simmons, & Coyne © 2003 163 Generating Reports  Two main types of reports generated from DIBELS Website:  PDF Reports: Downloadable reports designed for printing. The school and district PDF reports combine the most common reports into a single file.  Web Reports: Individual reports designed for quick online viewing. Select the specific report you would like.

164 How Will the Results Be Shared With the School?
 Schedule time soon after data collection to share and distribute results
  School level: staff meeting
  Grade level: team meetings
 Determine a method of addressing concerns
  Identifying at-risk students
  Answering questions about the results
  Re-thinking the data collection approach

165 Web Resources
 Materials
  Administration and scoring manual
  All grade-level benchmark materials
  Progress monitoring materials for each measure (PSF, NWF, ORF, etc.)
 Website
  Tutorial for training on each measure, with video examples
  Manual for using the DIBELS Web Data Entry website
  Sample schoolwide reports and technical reports on the measures
 Logistics
  Tips and suggestions for collecting schoolwide data (see website)

166 Objectives: What You Will Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class, and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.

167 Planning & Evaluation Tool (PET)
 As school teams, you will work together on the Planning and Evaluation Tool (Simmons & Kame’enui, 2000).
 The second section focuses on Assessment.
 Complete this section based on the information presented in today’s session and your knowledge of your school’s current assessment practices.

168 Day 2: PET Time
 Complete Element 2 of the Planning & Evaluation Tool: Assessment.
 Review each item.
 Determine whether individuals will complete items independently or as a group (e.g., grade-level teams: all K teachers complete one PET, all Grade 1 teachers complete a separate PET).
 Report the score for each item and document the information sources available to substantiate the score reported.
 Allow approximately 15-30 minutes for completion.

169 Day 2: PET Time

170 Reflections and Reports
 After schools complete Element 2, review the items individually and ask schools to volunteer their current status with respect to Assessment.
 Ask schools to identify particular items on which they scored full points and ones in which there is room for improvement.
 This information will be used to formulate a school-specific Reading Action Plan (RAP) on Day 4 of the IBR.

