
1 Reading Assessment: Getting Started Weber School District September 3 & 4, 2009

2 General Assessment Knowledge  Types of Tests –Unit of Administration (individual vs. group) –Purposes for Assessment  Screening  Progress Monitoring  Diagnosis  Outcome Reporting –Design (norm-referenced vs. criterion-referenced)

3 Understanding Outcomes  Norm-Referenced Testing –Percentile rank –Quartiles –Standard Deviations –Grade-level Equivalent –Normal Curve Equivalent –Stanine
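To make these score types concrete, here is a minimal Python sketch that converts a single raw score into several of the equivalents listed above, assuming a normal distribution with an illustrative mean of 100 and SD of 15 (grade-level equivalents come from a test's norming tables and cannot be derived this way):

```python
# Sketch: how several norm-referenced scores relate to one another,
# assuming a normal distribution with a known mean and standard deviation.
# The mean/SD values below are illustrative, not from any real test.
from statistics import NormalDist

def norm_referenced_scores(raw, mean=100.0, sd=15.0):
    z = (raw - mean) / sd                       # standard deviations from the mean
    pr = NormalDist().cdf(z) * 100              # percentile rank
    nce = 50 + 21.06 * z                        # normal curve equivalent (mean 50, SD 21.06)
    stanine = min(9, max(1, round(2 * z + 5)))  # nine-point standard scale
    quartile = min(4, int(pr // 25) + 1)        # which fourth of the distribution
    return {"z": round(z, 2), "percentile": round(pr, 1),
            "NCE": round(nce, 1), "stanine": stanine, "quartile": quartile}

print(norm_referenced_scores(112))   # a score about 0.8 SD above the mean
```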

4 Understanding Outcomes  Norm-Referenced Testing

5 Understanding Outcomes  How should we interpret any score? Actual Score + Error = True Score.  How should we interpret these scores? Scaled Scores? GLE?
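The point that any observed score carries error can be made concrete with the standard error of measurement from classical test theory. The sketch below is illustrative only; the SD and reliability values are assumed, not taken from any particular test:

```python
# Sketch: why an observed score should be read as a band, not a point.
# Classical test theory treats an observed score as true score plus error;
# the standard error of measurement (SEM) estimates how wide that band is.
import math

def score_band(observed, sd=15.0, reliability=0.90, z=1.96):
    sem = sd * math.sqrt(1 - reliability)   # standard error of measurement
    margin = z * sem                        # ~95% band when z = 1.96
    return (observed - margin, observed + margin)

low, high = score_band(104)
print(f"Observed 104 -> likely true score between {low:.1f} and {high:.1f}")
```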

6 What makes assessment good?  Reliability –Stability –Internal Consistency –Inter-rater –Alternate Forms
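As one illustration of how the internal-consistency idea is quantified in practice, the sketch below computes Cronbach's alpha from invented item-level scores; it is not tied to any specific assessment named in these slides:

```python
# Sketch: Cronbach's alpha, a common internal-consistency estimate.
# Item scores below are invented for illustration (4 items, 5 students).
from statistics import pvariance

def cronbach_alpha(item_scores):    # item_scores[i][s] = item i, student s
    k = len(item_scores)
    totals = [sum(items) for items in zip(*item_scores)]      # each student's total
    item_var = sum(pvariance(items) for items in item_scores) # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

items = [[1, 0, 1, 1, 0],
         [1, 1, 1, 0, 0],
         [1, 0, 1, 1, 0],
         [0, 0, 1, 1, 0]]
print(round(cronbach_alpha(items), 2))   # ~0.74 for this toy data
```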

7 What makes assessment good?  Validity –Construct –Content –Predictive –Concurrent –Consequential
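Predictive and concurrent validity are usually reported as correlations between two measures. The sketch below uses invented data and hypothetical measures to show the basic calculation (statistics.correlation requires Python 3.10 or later):

```python
# Sketch: "predictive validity" is often summarized as the correlation between
# a fall screening score and a later outcome measure. Data are invented.
from statistics import correlation

fall_screen    = [12, 25, 31, 18, 40, 22, 35, 15]   # e.g., a fall screening score
spring_outcome = [48, 70, 82, 55, 95, 66, 88, 50]   # e.g., a spring outcome score
print(round(correlation(fall_screen, spring_outcome), 2))  # closer to 1.0 = stronger prediction
```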

8 When does assessment not qualify as good assessment?  When it is not designed and tested to assure reliability and validity to some accepted degree. Examples of assessments that do not meet these psychometric standards: –Kid Watching –Portfolios –Anecdotal Records

9 Principles that should guide a comprehensive assessment plan Principle 1: Classroom assessment should first and foremost inform and improve teaching. Principle 2: Assessment procedures should help teachers discover what children can and cannot do. Principle 3: Every assessment should be selected with a specific purpose in mind.

10 Principles that should guide a comprehensive assessment plan Principle 4: Classroom assessment should be linked to accountability standards. Principle 5: Classroom assessments should allow teachers to identify each child’s zone of proximal development in reading. Principle 6: Classroom assessments need to be reliable, valid, and efficient, preserving as much time as possible for teaching and learning.

11 Accountability & Assessment

12 Accountability  Accountability is the lynchpin of reform. One cannot know what to teach to whom without assessment data.

13 Comprehensive Reading Assessment Problem: Unreliable and untested assessments can actually misinform instructional decisions. Solution: Use reliable and valid assessment tools and procedures for differing assessment purposes.

14 Comprehensive Reading Assessment: Federal Four Purposes  Outcome - Provides a bottom-line evaluation of the effectiveness of the reading program in relation to established performance levels. Screening - Designed as a first step in identifying children who may be at high risk for delayed development or academic failure and in need of further diagnosis of their need for special services or additional reading instruction.

15 Comprehensive Reading Assessment: Federal Four Purposes  Diagnosis - Helps teachers plan instruction by providing in-depth information about students’ skills and instructional needs. Progress Monitoring - Determines through frequent measurement if students are making adequate progress or need more intervention to achieve grade-level reading outcomes.

16 Screening Assessment  Purpose: To identify children who are likely to require additional instructional support to succeed (predictive validity). When: Early in the academic year or when new students enter school. Who: All students. Relation to instruction: Most valuable when used to identify children who may need further assessment or additional instructional support.

17 Progress-Monitoring Assessment  Purpose: Frequent, timely measures to determine whether students are learning critical skills, concepts, and strategies. When: At minimum three times per year, at critical decision-making points. Who: All students. Relation to instruction: Indicates students who require additional assessment and timely intervention.
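One common way to act on progress-monitoring data is to compare a student's observed growth rate with the rate needed to reach a year-end target. The sketch below is a generic illustration with invented scores and a hypothetical target, not a procedure prescribed by these slides:

```python
# Sketch: an "adequate progress" check of the kind progress monitoring supports.
# Compare a student's observed weekly gain to the gain needed to reach a
# grade-level target by year end. All numbers are illustrative only.

def on_track(scores_by_week, target, weeks_left):
    weeks = sorted(scores_by_week)
    first, last = weeks[0], weeks[-1]
    observed_rate = (scores_by_week[last] - scores_by_week[first]) / (last - first)
    needed_rate = (target - scores_by_week[last]) / weeks_left
    return observed_rate >= needed_rate, observed_rate, needed_rate

ok, observed, needed = on_track({1: 34, 4: 38, 8: 45}, target=90, weeks_left=24)
print(f"gaining {observed:.1f}/week, need {needed:.1f}/week -> "
      f"{'on track' if ok else 'needs intervention'}")
```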

18 Diagnostic Assessment  Purpose: To provide specific information on skills and strategy needs of individual students. When: Following screening or at points during the year when students are not making adequate progress. Who: Selected students, as indicated by screening or progress-monitoring measures or teacher judgment. Relation to instruction: Provides specific information on target skills; highly relevant.

19 Outcome Assessment  Purpose: To determine level of proficiency in relation to a norm-referenced population or a criterion. When: Typically administered at end of year. Can be administered pre/post to assess overall growth. Who: All students. Relation to instruction: Provides an index of overall efficacy but limited timely information for instructional decision making.

20 Classroom Assessment: Screening  Informal Phonics Survey  DIBELS  Graded Word Lists  Informal Reading Inventories  Observation Survey  Leveled Books/Running Records

21 Classroom Assessment: Screening  DIBELS  Fox in a Box  Phonological Awareness Screening Test

22 Classroom Assessment: Progress Monitoring  Informal Reading Inventories  DIBELS  The Observation Survey

23 Out of Class Assessment: Diagnostic  Woodcock Reading Mastery Test  Texas Primary Reading Inventory  Basic Early Reading Assessment  CORE Phonics Assessment  Woodcock-Johnson III Tests of Achievement  Kaufman Survey of Early Academic and Language Skills  Test of Early Reading Ability

24 Out of Class Assessment: Outcomes  Stanford 10  Stanford Reading First  Iowa Test of Basic Skills  Gates MacGinitie Reading Test

25 A Component-Based Assessment Model  Phonological Processing (words, syllables, rhyming, phoneme counting, blending, segmenting, manipulation)  Rapid Naming Ability  Oral Language (Listening Comprehension, Single word vocabulary, sentence completion, story retelling)  Alphabet knowledge (Letter-name knowledge, letter-name fluency, letter name-sound knowledge, letter sound fluency) Rathvon, N. (2004). Early reading assessment: A practitioner’s handbook. New York: Guilford Press.

26 A Component-Based Assessment Model  Concepts about print (book handling, directionality, print not picture, etc.)  Word reading fluency (real words, nonsense words)  Contextual reading fluency  Reading Vocabulary  Comprehension  Writing  Motivation Rathvon, N. (2004). Early reading assessment: A practitioner’s handbook. New York: Guilford Press.

27 Reading Assessment: Drill Down Diagnosis Model  Oral Language  Rapid Naming Ability  Concepts about Print  Letter Name Knowledge  Phonemic Awareness  Phonics and Spelling  Word Reading  Fluent Reading in Context  Vocabulary  Comprehension  Strategy Selection and Use  Constructing Meaning

28 Assessing Rapid Naming  Test of Automatized Rapid Naming – Colors, numbers, pictures

29 Assessing Oral Language  DIBELS - Word Use Fluency  Picture Naming Test  PPVT/EVT  Fox in a Box  Wechsler Individual Achievement Test - II

30 Assessing Concepts About Print  Concepts About Print Test  Fox in a Box  PALS  TERA  TPRI

31 Assessing Alphabet Knowledge  DIBELS – Letter Naming Fluency  Observation Survey – Letter Knowledge  Fox in a Box – Alphabet Recognition and Alphabet Writing  PALS – recognition, sounds  TERA  TPRI  WIAT II

32 Assessing Phonemic Awareness  DIBELS – Phoneme Segmentation Fluency  Yopp-Singer Test of Phonemic Segmentation  Fox in a Box  Phonological Awareness Literacy Screening (PALS)  Phonological Awareness Test (PAT)

33 Assessing Phonics  CORE (Consortium on Reading Excellence) Phonics Survey

34 Assessing Writing - Spelling  Qualitative Spelling Inventory  Fox in a Box  PALS  PAT  TPRI  WIAT II  Morris-McCall Spelling List

35 Assessing Fluency  Word & Nonsense Word Reading –Graded Word Lists – Fry, Dolch, Zeno, San Diego Quick Assessment, etc. –DIBELS – Nonsense Word Fluency –Observation Survey – Word Test –Fox in a Box –PAT –Test of Word Reading Efficiency

36 Assessing Fluency  Contextual Reading Accuracy –DIBELS – Oral Reading Fluency –Observation Survey – Running Records –Fox in a Box –PALS –TPRI –WIAT II

37 Assessing Fluency  Contextual Reading Fluency –DIBELS – Oral Reading Fluency –Observation Survey – Running Records –Fox in a Box –Gray Oral Reading Test –PALS –TPRI –WIAT II
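The fluency measures listed above generally reduce to two pieces of arithmetic: accuracy (percent of words read correctly) and words correct per minute (WCPM). A minimal sketch with invented numbers:

```python
# Sketch: the arithmetic behind most oral reading fluency measures:
# accuracy (% of words read correctly) and words correct per minute (WCPM).
# The sample numbers are invented.

def fluency(words_attempted, errors, seconds):
    correct = words_attempted - errors
    accuracy = 100 * correct / words_attempted   # percent read correctly
    wcpm = correct / (seconds / 60)              # rate, normalized to one minute
    return accuracy, wcpm

acc, wcpm = fluency(words_attempted=118, errors=6, seconds=60)
print(f"{acc:.0f}% accuracy, {wcpm:.0f} WCPM")
```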

38 Assessing Vocabulary  Local Assessment  ITBS  Stanford 10  Stanford Reading First

39 Assessing Comprehension  Skills  Strategies  Content  Memory  Transfer

40 Assessing Writing – Expression  Fox in a Box  WIAT II

41 Assessing Motivation  Motivation for Reading Questionnaire, Revised Version  Reader/Writer Self Perception Scales  Elementary Reading Attitude Survey  Interest Inventories

42 Assessing Strategic Knowledge  Burke Reading Interview  Meta-comprehension Strategy Index  Reading Strategy Use Scale  Background Knowledge Assessment Procedure

43 Connecting Assessment to Instruction  Discrepancy/Deficit Model  Tiered Instructional Model  RTI Model (Response to Intervention)  Differentiated Instruction  Targeted Intervention

44 Evaluating Your Plan of Assessment  Are you getting the information you need in your system, school, or classroom to plan and provide effective instruction and other support?  Do you have individuals trained to provide assessment support?  Have you designed an assessment plan and schedule?

45 If you would like a copy of this PowerPoint: D. Ray Reutzel, Ph.D., Emma Eccles Jones Endowed Chair Professor, Utah State University, www.coe.usu.edu/ecc (Presentations button, left-hand side) or IRA Board of Directors, International Reading Association, rreutzel@reading.org

