
1 Indian Prairie District 204
Advanced Problem Analysis in Reading: Curriculum Based Evaluation & Other Functional Academic Assessments
Indian Prairie District 204 Problem Solving, October 2008

2 Acknowledgements
Kerry Bollman: NASP & Flex West CBE presentations
Sue Gallagher: Flex West CBE presentations
Madi Phillips: NSSED presentations
Heartland AEA 11, Des Moines, Iowa
Ken Howell & Victor Nolet: CBE book
Joe Witt: STEEP Model / 1-Minute Academic Assessment
Ed Shapiro: Academic Skills Problems
Ed Daly: Functional Analysis of Academics

3 Objectives
State the fundamental components of functional academic assessment, including CBE
Be introduced to some of the basic skills involved in CBE and advanced problem analysis in reading
Practice the thought process through training exercises

4 Agenda
Overview of CBE as a thought process, 12:00-12:45
How is this similar to or different from your previous thinking related to CBE?
Jigsaw Activity, 12:45-1:45
In letter group: read sections, create a summary with visuals
In number group: jigsaw through each section, teaching the other members of your group
The Flowcharts, 1:45-2:30
Examples Activity, 2:15-3:30

5 Taking a temperature
Medically, temperature is a general indicator of health.
Academically, CBM serves as a General Outcome Measure (GOM).

6 Sometimes you need more information . . .
Some Tier 2
Most Tier 3
More in-depth problem analysis

7 The Problem Solving Process
Define the Problem: What is the problem and why is it happening?
Develop a Plan: What are we going to do?
Implement Plan: Carry out the intervention.
Evaluate: Did our plan work?

8 Curriculum Based Evaluation
A process of evaluation and decision making that may use CBM or other derivations of CBA, with the goal of maximizing student learning
Core components are comparison, judgment, and problem solving, not measurement
A decision-making framework for thinking
A network of curriculum-driven if/then precepts
Howell, Hosp, & Kurns (2008), BPV Chapter 20

9 CBE and CBM: How they work together within problem solving
[Diagram: CBM identifies the problem as the gap between expected and actual performance; CBE problem analysis asks why; an intervention is selected; CBM monitors whether it works.]

10 CBE Within a Problem Solving Process
What is the problem and why is it happening?

11 Assessment Guidelines
Must be aligned with the curriculum
Must be easy to use
Must have clearly defined purposes
Should be standardized
Should sample clearly defined content domains
Should assess relevant types of knowledge
Some should collect rate data
Should collect an adequate sample
Should use appropriate scoring rules
Some should be complex and interactive
Howell & Nolet (2000), p. 148

12 Procedures for Assessing Academic Skills
Structured teacher interview & rating scales
Direct classroom observation
Student interview
Permanent product review
Curriculum-based assessment of academic skills
Ed Shapiro (2004)

13 Can't Do vs. Won't Do
1. Obtain 3 previously completed assignments, each one on which the student performed much below expectations.
2. Present the first assignment (answers removed) with an incentive.
3. If the student increases the score by 25% or scores 80% or above, move to the next step.
4. Have the student choose reinforcers (teacher approved) that he/she would like to work for in the future.
5. Test by presenting another assignment with a reinforcer. Evaluate outcomes. If the student markedly increases performance when offered incentives, it is likely a WON'T DO problem (see the sketch after this list).
6. Create an incentive plan.
Joe Witt & Ray Beck (2001); Ed Daly (1999)
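A minimal sketch of the 25%/80% screen above, assuming scores are percent correct; the function name and encoding are ours, not part of the cited Witt & Beck or Daly materials.

```python
# Hypothetical helper for the incentive screen above (steps 2-3 and 5).
# Assumes scores are percent correct (0-100); names are illustrative only.

def wont_do_indicated(baseline_pct: float, incentive_pct: float) -> bool:
    """True if performance under an incentive suggests a WON'T DO
    (motivation) problem rather than a CAN'T DO (skill) problem."""
    improved_25 = incentive_pct >= baseline_pct * 1.25  # score rose 25%+
    reached_80 = incentive_pct >= 80.0                  # or hit the 80% bar
    return improved_25 or reached_80

# Example: 52% without an incentive, 81% with one -> likely WON'T DO.
print(wont_do_indicated(52.0, 81.0))  # True
```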

14 Adding in Can't Do vs. Won't Do
Step 1: Can't Do. Assess through Review. Low performance? No: stop. Yes: go to Step 2.
Step 2: Won't Do. Interview; reassess with a motivator. Performance improved?
Step 3: CIE. Assess Curriculum, Instruction, Environment.
Step 4: SLAs. Survey-level & specific-level assessments (CBE). Needs identified? Provide the appropriate intervention.
C. Martin, 2005

15 The CBE Process of Inquiry
Step 1: Fact-Finding & Problem Validation (problem identification). Can you define the problem? If no, select & conduct screening assessments and summarize the results. Can you validate the problem?
Step 2: Develop Assumed Causes. Generate assumed causes (hypotheses).
Step 3: Validating. Plan & conduct assessments to test each hypothesis; summarize the results. Were the assumed causes validated (hypothesis true)? If no, generate new assumed causes.
Step 4: Summative Decision Making. Can you plan instruction? If yes, set the goal(s) and design instruction.
Step 5: Formative Decision Making. Design & implement instruction, progress monitor, plan evaluation, and make formative decisions.
Howell, Hosp, & Kurns (2008), BPV Chapter 20

16 Appropriate Development of Assumed Cause (F/AC/T/R)
Fact (F): the difference between what the student is doing and what is expected
Assumed Cause (AC): what we think the reason for the problem might be
Test (T): how we will confirm the assumed cause
Result (R): was the assumed cause confirmed?
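A data-structure sketch of the F/AC/T/R worksheet; the field names mirror the slide, while the class name and example content are our own illustration.

```python
from dataclasses import dataclass

@dataclass
class FactRecord:
    fact: str           # F: gap between what the student does and what's expected
    assumed_cause: str  # AC: what we think the reason for the problem might be
    test: str           # T: how we will confirm the assumed cause
    result: str = ""    # R: was the assumed cause confirmed?

# Hypothetical example, loosely modeled on the Bart case later in the deck.
record = FactRecord(
    fact="Reads 48 WRC in grade-2 text; 50th percentile is 77 WRC",
    assumed_cause="Slow rate despite adequate accuracy (fluency deficit)",
    test="Re-read strategy: compare first-read WRC to second-read WRC",
)
```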

17 Rules for Developing Assumed Causes
Clarify the purpose:
- If entitlement, then a student-to-group comparison is needed
- If what to teach, then a comparison of current performance to expected performance is needed
- If how to teach, then formative data are needed to determine effectiveness
Target relevant information: alterable variables; essential characteristics
Think about instruction: information/data can't focus exclusively on the student (ICEL); formative
Howell, Hosp, & Kurns (2008), BPV Chapter 20

18 Rules for Developing Assumed Causes
Think about the curriculum: skill sequences, proficiency, response type, conditions
Think about types of knowledge, beyond knowing how: when, why, & under what circumstances a skill should be used
Check the student's awareness of skills: self-monitoring, self-control, metacognition; ask the student to rate task difficulty before doing it
Test down / teach up: start at the expected level, then work backward
Howell, Hosp, & Kurns (2008), BPV Chapter 20

19 Rules for Developing Assumed Causes
Pick the most likely targets first: the most likely explanation for the student's lack of proficiency with a skill, or the most likely solution for the problem, should be checked before going on to those that are more complex or exotic.
Howell, Hosp, & Kurns (2008), BPV Chapter 20

20 READING: Early Reading / Advanced Reading

21 [Slide graphic] K. Bollman, 2006

22 Review R-CBM At Grade Level
Fluency + / Accuracy +: "easy button" (no fluency problem); check comprehension with Maze
Fluency - / Accuracy +: Reread
Fluency + / Accuracy -: Table Tap
Fluency - / Accuracy -: Survey Level Assessment

23 Activity
Beginning Reading: "Learning to Read"
Group 1: Read "Phonological Awareness" pp. & "Phoneme Segmentation Fluency" p. 385; review curriculum maps
Group 2: Read "Alphabetic Principle" pp. & "Letter Sound Fluency" & "Nonsense Word Fluency" p. 385; review curriculum maps
Group 3: Read "Accuracy & Fluency" p. 381 & "Word Identification Fluency" & "Oral Reading Fluency" pp. ; review curriculum maps
Advanced Reading: "Reading to Learn"
Group 1: Read "Content of the Reading-Decoding Strand" & "Content of the Prior/Background Knowledge Strand" pp. & "Cloze and Maze" & "ORF" pp.
Group 2: Read "Content of the Vocabulary Strand" pp. , "Review of Text-Dependent Grades and Assignments" pp. , "Vocabulary Matching" p. 406
Group 3: Read "Content of the Comprehension Strategy Strand" pp. , "Written and Oral Retell Measures" & "Think-Aloud Interview" pp.

24 Early Reading Flowchart
Are reading skills acceptable? Check via RIOT: R = curriculum & permanent products; I = teacher; O = student while reading; T = CBM.
Are oral reading skills acceptable? Yes: go to Comprehension.
Missing early literacy skills (K-2, or an older student who decodes few words)? Survey early literacy skills and evaluate phonics.
Is oral reading accurate but slow? Do the rereading assessment. Did rate increase? Yes: build fluency with rereading.
Is oral reading inaccurate? Do the pencil tap. Did accuracy improve? Yes: build self-monitoring.
If not: more errors on harder passages? No: provide balanced instruction. Yes: do an error sample & analysis and categorize the errors. Are there patterns (phonics patterns or whole-word)? Correct the patterns.

25 Primary Measures for Early Reading
Phoneme Segmentation Fluency
Letter Sound Fluency
Nonsense Word Fluency
Word Identification Fluency
R-CBM (ORF)

26 Basic Reading & Comprehension

27 Early Reading Risk Indicators

Critical Reading Element | High-Priority Skill | Assessment | Benchmark | Time of Benchmark | Risk Indicator in Fall
Phonemic awareness | Phoneme segmenting | PSF | 35 cppm | Spring of K | <10
Alphabetic principle | Letter sounds; decoding | LSF, NWF | 40 clspm | Winter of 1st | <30
Accuracy & fluency | Sight word reading | WIF | 60 cwpm | Spring of 1st | <15
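An illustrative lookup for the fall risk cut scores in the table. The numbers come from the slide, but the pairing of the <30 cut with NWF follows our reading of the flattened row, and all names are ours.

```python
# Fall risk cut scores from the table above. We read the "<30" cut as
# applying to NWF (winter of 1st); the flattened layout leaves the
# LSF/NWF pairing slightly ambiguous, so treat this as an assumption.
FALL_RISK_CUTS = {
    "PSF": 10,  # phoneme segmentation fluency
    "NWF": 30,  # nonsense word fluency
    "WIF": 15,  # word identification fluency
}

def at_risk(measure: str, fall_score: float) -> bool:
    """True if a fall score falls below the slide's risk indicator."""
    return fall_score < FALL_RISK_CUTS[measure]

print(at_risk("NWF", 1))   # True  (cf. Henry's fall NWF of 1, slide 76)
print(at_risk("PSF", 48))  # False (cf. Henry's fall PSF of 48)
```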

28 [Slide graphic]

29 Survey Level Assessment (- / -)
Basic Reading Survey Level Assessment: test down grade levels using R-CBM until the student scores at or above the 25th percentile (sketched below).
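A sketch of the test-down loop, assuming you can administer an R-CBM probe at a given grade and look up a local 25th-percentile norm; both callables are hypothetical stand-ins.

```python
# Survey Level Assessment: test down grade levels with R-CBM until the
# student scores at or above the 25th percentile. `administer_rcbm` and
# `percentile_25` are hypothetical stand-ins for probe administration
# and your local norm tables.

def survey_level(start_grade: int, administer_rcbm, percentile_25) -> int:
    """Return the first (highest) grade level, testing downward, at which
    the student meets the 25th-percentile criterion; grade 1 if none do."""
    for grade in range(start_grade, 0, -1):
        wrc = administer_rcbm(grade)     # words read correct at this level
        if wrc >= percentile_25(grade):  # at/above the 25th %ile norm
            return grade
    return 1
```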

30 Early Reading Decision Point (- / -)
Reading at least 40 WRC in 1st grade material?
YES: Provide instruction at the instructional level with emphasis on phonics & fluency.
NO: Provide intensive phonics and phonemic awareness instruction.

31 Intervention suggestions (- / -)
Direct Instruction
Corrective Reading
Horizons
Reading Mastery
REWARDS
Read Well

32 Early Literacy Skills
Phonemic awareness: blending, segmenting, manipulation, identifying sounds, rhyming
Concepts of print: page conventions; word/sentence/book length & boundaries; environmental print/logos

33 Phonological Awareness Assessment
From more complex to less complex:
Phoneme blending & segmentation
Onset-rime segmentation & blending
Syllable segmentation & blending
Sentence segmentation
Rhyming & song

34 Intervention suggestions
Earobics
Sounds & Letters
K-PALS
Great Leaps
Road to the Code
Scott Foresman Early Reading Intervention
Lindamood-Bell LiPS Program

35 Checking for Decoding Skill (- / -)
Have the student read a grade-level passage aloud.
Write down each incorrectly read word on a piece of paper.
Have the student attempt to read each incorrectly read word in isolation from your paper.
Can the student correctly decode words in isolation?

36 Analyzing Errors in Reading (- / -)
Select a passage you estimate the student will read with about 80-85% accuracy. Remember: 80% accuracy = 1 error every 5 words!
Try different levels of passage until you find the right fit.
You will need at least 50 errors for students in grades 2 and above (25 errors for grade 1).
The passage will need to be 250 words or more (the arithmetic is checked in the sketch below).
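A back-of-envelope check of the sampling guidance above, assuming errors occur at a constant rate of (1 - accuracy) across the passage; the function name is ours.

```python
def words_needed(target_errors: int, accuracy: float) -> int:
    """Passage length needed to collect `target_errors` at a given
    accuracy, assuming a constant error rate of (1 - accuracy)."""
    return round(target_errors / (1.0 - accuracy))

print(words_needed(50, 0.80))  # 250 -- matches the slide's 250-word minimum
print(words_needed(50, 0.85))  # 333 at the easier end of the 80-85% band
```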

37 Pattern of Error Types (- / -)
Compare each error in the passage with the Error Pattern Checklist.
Make a mark next to the category in which the error seems to fit.
Come up with a total of all errors.
Identify the categories in which most errors occur.

38 Basic Reading (- / +): slow rate / adequate accuracy
Re-read strategy:
The student reads for 2 minutes; note the number of WRC at the end of the 1st minute.
Say: NOW READ AGAIN AS QUICKLY AND ACCURATELY AS YOU CAN.
The student reads for 1 minute; determine WRC.
Compare the 1st-read score to the 2nd-read score (scored in the sketch below).
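A sketch of the comparison, applying the roughly 25% criterion from the next slide; the scores and names here are illustrative.

```python
def reread_gain(first_wrc: int, second_wrc: int) -> float:
    """Percent change in words read correct from first to second read."""
    return (second_wrc - first_wrc) / first_wrc * 100

gain = reread_gain(first_wrc=62, second_wrc=80)  # hypothetical scores
decision = "fluency-building (re-reading)" if gain >= 25 else "recheck phonics"
print(f"{gain:.0f}% gain -> {decision}")  # 29% gain -> fluency-building
```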

39 Basic Reading Decision Point (- / +)
Rate improved by approximately 25%?
YES: Use a fluency-building intervention (re-reading).
NO: Recheck phonics needs and Can't Do/Won't Do.

40 Intervention suggestions (- / +)
Re-reading techniques
Soar to Success
Read Naturally
PALS
Great Leaps
Six Minute Solution
Quick Reads
Choral reading
Cloze reading

41 Basic Reading (+ / -): adequate fluency / poor accuracy
Pencil Tap:
Using a passage where the student is approximately 85% accurate, tell the student to try to fix the word every time you tap the table.
Count the number of self-corrections the student makes.
Compare to the total number of errors.

42 Determine if the student has the skills to correct errors using the pencil tap test (assisted monitoring): "Whenever you make an error, I'm going to tap the table with my pen. When I tap the table, I want you to fix the error."
If the student can fix errors when you point them out, you know he/she has the decoding skills to read the passage but needs assistance learning to self-monitor for accuracy. Intervene with strategies for self-monitoring decoding.
If the student cannot fix errors when you point them out, a skill deficit in decoding may be indicated. Further analyze errors to isolate patterns of difficulty, and intervene with targeted decoding strategies. (A scoring sketch follows.)
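One way to score the check, applying the majority rule from the next slide; this scoring function is our sketch, not part of the CBE materials.

```python
def pencil_tap_decision(self_corrections: int, total_errors: int) -> str:
    """Route the result of the pencil-tap (assisted monitoring) check."""
    if total_errors == 0:
        return "no errors to analyze"
    if self_corrections / total_errors > 0.5:  # fixed the majority
        return "self-monitoring intervention"  # decoding skills present
    return "reassess Can't Do / Won't Do; analyze error patterns"

print(pencil_tap_decision(self_corrections=9, total_errors=12))
# -> self-monitoring intervention
```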

43 Basic Reading Decision Point (+ / -)
Self-corrected the majority of errors?
YES: Use a self-monitoring intervention.
NO: Reassess Can't Do/Won't Do.

44 Interventions (+ / -)
Design an intervention to increase attention to accuracy:
Does this make sense? Does it match what is on the page?
Reinforcement for accuracy
Goal setting & progress monitoring
If the student did not make more errors on more difficult passages, use intensified balanced instruction.
If the student did make more errors, categorize the errors, look for patterns, and correct.

45 Interventions, cont. (+ / -)
Spelling through Morphographs
Word sorting/word study
Great Leaps
REWARDS
Making sense of phonics
Phonics and Spelling through Grapheme Mapping
Soundabet

46 [Slide graphic]

47 Primary Measures for Advanced Reading
Expository Maze & Cloze. Survey-level purpose: quarterly benchmark measure; screening. Specific-level purpose: check decoding; check background knowledge; text leveling & selection; assessing vocabulary knowledge; assessing language skills relative to text demands. Domains sampled: comprehension, vocabulary, decoding, syntax.
R-CBM (ORF). Survey-level purpose: initial indication of understanding. Specific-level purpose: to rule out decoding problems.
Review of grades & feedback on text-dependent tests & classes. Survey-level purpose: pattern recognition by text type or assignment type. Specific-level purpose: to discriminate global from subject-specific problems. Domains sampled: application of advanced reading; metacognitive strategy use.
Vocabulary matching. Survey-level purpose: check vocabulary accuracy; find level of academic words; find level of subject-specific words. Specific-level purpose: assessing vocabulary knowledge; progress monitoring.
Retell. Survey-level purpose: check embedded strategy knowledge. Specific-level purpose: look for patterns in attention to and use of terminology. Domains sampled: prior/background knowledge; metacognitive strategy.
Think-aloud sessions. Survey-level purpose: check metacognitive knowledge. Specific-level purpose: strategy evaluation. Domains sampled: evaluation of metacognitive content; task-specific strategy.

48 Vocabulary Matching

49 Research Behind the Vocabulary-Matching Measures
Christine Espin, University of Minnesota: Our research team at the University of Minnesota has conducted a series of studies to examine the reliability and validity of vocabulary matching as an indicator of content-area learning. The results of this research show that the vocabulary-matching measure is a valid and reliable indicator of performance and progress in social studies and science. Performance on the vocabulary-matching measure is related to performance on other content-area tasks, including researcher-made content tests, content-area subtests of standardized achievement tests, and teacher-made content measures. In addition, students who grow more on the vocabulary-matching measures score higher on criterion measures of content-area performance. As an aside, our research also shows that students must read the measures themselves (as opposed to having the measures read to them by the examiner) to obtain reliable and valid growth rates.

50 Creating Vocabulary-Matching Probes
1. Create a pool of potential vocabulary terms. Develop a pool of important vocabulary terms from the content to be covered over the entire school year (or semester, if the class is offered on a semester basis). Terms can be selected from the classroom textbook, from teacher notes and lectures, or from both sources. Selected terms should be germane to the content being covered. If the textbook is fairly representative of the content being covered, the terms can be drawn from the glossary of the textbook or from terms in the text that are highlighted or italicized.
2. Develop definitions for each term. For each term, develop a short definition. The easiest method for developing definitions is to use the glossary of the textbook. Other methods are to rely on teachers' notes and lectures or to use a school-based dictionary. Limit the length of each definition to approximately 15 words. Make them clear and unambiguous.
3. Create weekly measures that are similar. For each measure, randomly select 20 terms and definitions from the pool created in steps 1 and 2. In addition, select two definitions that do not match any of the terms. Thus, each probe will have 20 terms and 22 definitions. One practical way to develop the measures is to write each vocabulary term on the front of an index card with its definition on the back. For each measure, shuffle all of the cards and randomly select terms and definitions. Place the terms on the left-hand side of the page and the definitions, in random order, on the right-hand side. Number the terms, leaving a blank space by each term; put letters by each definition. The students write the letter for the correct definition in the blank next to each term. (A generation sketch follows.)
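A minimal sketch of step 3, assuming the term pool is a dict mapping each term to its short definition; function and variable names are our own.

```python
import random
import string

def make_probe(term_pool: dict, n_terms: int = 20, n_extra: int = 2):
    """Build one weekly probe: numbered terms on the left, lettered
    definitions (including distractors) on the right."""
    picked = random.sample(list(term_pool), n_terms + n_extra)
    keep, extras = picked[:n_terms], picked[n_terms:]
    # 20 matching definitions plus 2 distractors: definitions whose
    # terms do not appear on the probe, all shuffled into one column.
    definitions = [term_pool[t] for t in keep + extras]
    random.shuffle(definitions)
    terms_col = [f"{i + 1}. ____ {t}" for i, t in enumerate(keep)]
    defs_col = [f"{string.ascii_uppercase[i]}. {d}"
                for i, d in enumerate(definitions)]
    return terms_col, defs_col
```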

51 Excerpts from an example of teacher-selected word lists
Grade Level: 6th

Department: Language Arts
Ambiguous: Having two or more meanings
Archenemy: A chief rival
Benevolence: An inclination to do good
Biopic: A film dramatizing the life of a famous person
Cyclical: Occurring in cycles
Deviate: To turn away from a course, path, or topic
Digress: To stray from the subject in speaking or writing
Aqueduct: A bridge-like structure that carries water

Department: Math
Circumference: Measure of the distance around a circle
Transversal: A line that crosses two or more other lines
Decagon: A closed figure with ten sides and ten angles
Perimeter: The measure of the distance around a figure
Diameter: A line segment that divides a circle into two equal parts

52 Academic Vocabulary

53 Probe Generator Template

54 Probe Generator Template
11. Go to: Data > Sort. Sort by '(2) Leave Blank!' and make sure Ascending is clicked. Hit OK.
12. Now your definitions are in random order. Don't worry, the correct word has been moved with it.

55 Retell
Reader retelling profile:
The student reads a complete passage at instructional level.
The student then, orally or in writing, retells the passage content.
The examiner uses a matrix for scoring the components retold.

56 Think-Aloud Metacognitive Status Sheet

57 Comprehension Status Sheet
An information-gathering tool to assist with identifying causes of a comprehension concern.
Anyone with direct knowledge of the student's skills may be invited to a meeting to discuss this status sheet.
The purpose of the meeting is to limit the field of inquiry by ruling out what we already know about the student's skills.
In interview/discussion format, go through each of the primary categories and mark the appropriate status (pass, unsure, no pass).
The indicators listed below each category exist only to help define the categories.
You can still mark pass for a category if not all indicators accurately describe the student; you can still mark no pass for a category even if some indicators accurately describe the student.

58 Metacognitive Strategies Status Sheet

59 Reading Comprehension Status Sheet Interpretation
Assume that categories marked Pass are skill areas in which problems do not exist. Monitoring for maintenance and generalization may be done, but no other action is indicated.
Categories marked No Pass are those where interventions should be developed.
Categories marked Unsure are those where additional information is required to evaluate the student's skill; proceed to CBE specific-level procedures.

60 Specific Level Assessments
Monitoring meaning
Awareness of reading
Retell
Prior knowledge
Vocabulary
Knowledge of text structure & grammar

61 Analyzing Errors in Meaning
Have the student read the passage out loud to you.
Keep careful notes of all errors made and exactly what the student said.
It may be helpful to tape-record so you can go back and fill in your notes.

62 Analyzing Errors in Reading
Three types of analyses:
What percentage of errors are meaning-violating?
What pattern of reading error types is made?
What pattern of decoding content errors is made?

63 % Meaning-Violating Errors
Review existing data to determine whether errors violate meaning.
Example text: "They are such smiling happy girls."
Meaning-preserving error: "They are such smiley happy girls."
Meaning-violating error: "They are such smelling happy girls."

64 % Meaning-Violating Errors
Tally each error as: meaning violating, meaning preserving, or not sure.
Circle the tally mark if the student self-corrected.
Use the pencil tap to determine self-corrects. (A scoring sketch follows.)
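A tallying sketch; the three labels mirror the slide, and excluding "not sure" errors from the percentage is our assumption.

```python
from collections import Counter

def meaning_violation_rate(labels) -> float:
    """Percent of scorable errors labeled meaning-violating; 'unsure'
    tallies are excluded from the denominator (our assumption)."""
    counts = Counter(labels)
    scorable = counts["violating"] + counts["preserving"]
    return 100 * counts["violating"] / scorable if scorable else 0.0

errors = ["violating", "preserving", "violating", "unsure", "preserving"]
print(f"{meaning_violation_rate(errors):.0f}% meaning-violating")  # 50%
```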

65 Intervention suggestions
Pencil tap as intervention
Repeated readings
Students ask: Does this make sense? Does it fit with what I have been reading? Does it fit with what is on the page?

66 Awareness of Reading Process
Metacomprehension Strategy Index:
The student completes the index.
Compare the answers to the key.

67 Intervention suggestions
Before, during, and after strategies
Teach reading as an active process

68 Prior Knowledge
Collect 4 reading maze or cloze passages.
Administer the first two using the standard format; average the scores.
Administer the second two after discussing the topic (do not teach the content; just try to prime the student's recall about the topic); average these scores.
If the student's scores improve by at least 50%, or if they meet criterion (85% maze; 50% cloze), then prior knowledge is likely impacting comprehension (see the sketch below).
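A sketch of the comparison, assuming scores are percent correct on maze or cloze passages; names and example numbers are ours.

```python
def prior_knowledge_implicated(standard, primed, task: str = "maze") -> bool:
    """Average the two standard-format scores and the two primed scores,
    then apply the slide's rule: 50%+ improvement, or criterion met
    (85% for maze, 50% for cloze)."""
    base = sum(standard) / len(standard)
    boosted = sum(primed) / len(primed)
    criterion = 85.0 if task == "maze" else 50.0
    return boosted >= base * 1.5 or boosted >= criterion

print(prior_knowledge_implicated((40, 44), (70, 66)))  # True: 68 >= 63
```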

69 Intervention suggestions
Teach active and reflective reading
Pre-reading questioning
Anticipation guides

70 Vocabulary
Ask the student to identify important vocabulary words prior to reading the passage.
Test the student's knowledge of academic vocabulary.

71 Intervention suggestions
Soar to Success
Be careful of decontextualized programs: not dictionaries, not decodable books
Start with basic words, then high-frequency words, then low-frequency words

72 Knowledge of Text Structure/Grammar
Check oral language skills.
Does the student accurately make predictions?
Does he/she know what words such as he, she, or their refer to?
Has the student been exposed to a variety of text structures?

73 Intervention suggestions
Highlight important features of text
Mark up passages to show what words pronouns refer to
Language interventions
Provide exposure to a variety of text structures

74 Reading Examples/Practice

75 Carly, Kindergarten
Review: Carly did not recognize any letters at her fall kindergarten screening.
Interview: The teacher notes that Carly's class has been working on various literacy activities, with one new letter per week for 30 minutes per day, since the beginning of the school year.
Observe:
Test: Winter kindergarten screening update: Carly does not recognize any letters.

76 Henry, Grade 1
(R) Previous records: Kindergarten screening results appeared normal. Teacher comments on the report card indicate difficulty with letter identification (16 letters mastered by the end of the year) but note good rhyming skills.
(I) Teacher: Henry currently receives 30 minutes/day of systematic phonics instruction based on word families.
(O) Henry in class: On-task time during teacher-led instruction and independent seat work is commensurate with peers.
(T) Fall DIBELS scores: LNF = 18, PSF = 48, NWF = 1. Criterion for fall: PSF > 35; for winter: NWF > 50. 50th percentile scores: LNF = 42, PSF = 36, NWF = 27.

77 Bart, Grade 2
Review: No previous reading services; no teacher comments on past report cards regarding reading.
Interview: Parents haven't noticed a concern, but note that Bart "never chooses" to read at home. The teacher states that Bart successfully uses decoding strategies.
Observe: Bart during a round-robin reading activity in class; Bart self-corrects all but 1 error across the 3 paragraphs he reads.
Test: Fall Reading CBM scores: 48 WRC, 1 error. 50th percentile = 77 WRC, 2 errors.

78 Rachel, Grade 3
Review: Old DIBELS data show she never met criteria for Nonsense Word Fluency. Report card comments: "star reader" in grade 1. Has received speech therapy since age 3.
Interview: Her 2nd grade teacher noted that Rachel likes to be a "good fast reader."
Observe:
Test: Fall CBM scores: 107 WRC, 11 errors. 50th percentile = 102 WRC, 3 errors.

79 Ellie, Grade 4
Review: Report card history indicates difficulty in reading. Participated in the "high flyers" reading support group in 1st and 2nd grades. Low scores on current reading comprehension class work.
Interview: Ellie reports she is an "ok reader" but doesn't remember what she reads.
Observe:
Test: Fall Reading CBM scores: Grade 4 text, 61 WRC, 10 errors. 50th percentile = 131 WRC, 3 errors.

80 Jack, Grade 6
Review: Past testing indicates adequate listening comprehension skills.
Interview: The teacher reports that Jack does not seem to remember anything that he reads.
Observe: When Jack reads in class, it sounds very mechanical and unnatural.
Test: Fall CBM scores: 160 WRC, 4 errors. 50th percentile = 142 WRC, 1 error.

81 Jeff, Grade 7
Review: Good attendance; has attended school in the same district since 2nd grade; average to below-average grades.
Interview: The teacher says that he reads at a 3rd grade level, with difficulties in all areas, but listening comprehension is fine. Jeff says he likes reading silently better than reading aloud.
Observe: Jeff got no time to read with feedback during a language arts class.
Test: Fall CBM scores: 58 WRC, 7 errors. 50th percentile = 136 WRC.

