4Sight Benchmark Assessments


1 4Sight Benchmark Assessments
A Data Tool in Pennsylvania’s Standards Aligned System Summer 2008 © 2006 Success for All Foundation ZZ4146

2 Initial Training Objectives
Understand the design and purpose of 4Sight Reading and Math Benchmark Assessments as part of the Standards-Aligned System.
Facilitate the administration of 4Sight Reading and Math Benchmark Assessments.
Facilitate the scoring of 4Sight Reading and Math Benchmark Assessments.
Begin to develop plans to prepare staff to administer, score, and utilize 4Sight.
Generate and read proficiency and subscale reports in the Member Center. © 2006 Success for All Foundation

3 Pennsylvania’s Standards-Aligned System
April 2008

4 Assessment Activity Brainstorm and list all of the assessments you administer (district, building, grade level, classroom). © 2006 Success for All Foundation

5 Fundamental to ensuring usefulness of an assessment is:
Understanding the purpose for which the assessment was developed; Understanding the purpose for which it is being administered; and Understanding how the data is used and fits in school structures. © 2006 Success for All Foundation

6 Types of Assessment in Standards Aligned System
The assessment was developed as: Summative/Outcome: “Reaching Our Goals” Formative/Progress Monitoring: “Growth Charts” Diagnostics: “In-depth View” Benchmark: “Progress Towards Standards” © 2006 Success for All Foundation

7 Summative/Outcome Purpose An integral part of an assessment plan
Purpose: Reports how a student compares to a certain standard; supports state or local accountability; helps to draw conclusions regarding groups of students; cannot inform current instructional practices.
An integral part of an assessment plan: occurs at the end of a school level, grade, or course, or at certain grades; provides outcome data; cannot directly inform instructional practice because it is administered after instruction, leaving no chance to respond to the data with that specific group of students.
Summative assessment seeks to make an overall judgment of progress at the end of a defined period of instruction. These assessments occur at the end of a school level, grade, or course, or are administered at certain grades for purposes of state or local accountability. They are considered high-stakes assessments, and the results are often used in conjunction with No Child Left Behind (NCLB) and Adequate Yearly Progress (AYP). They are designed to produce clear data on a student's accomplishments at key points in his or her academic career. Scores on these assessments usually become part of the student's permanent record and state whether the student has fallen short of, met, or exceeded the expected standards. Whereas the results of formative assessments are primarily of interest to students and teachers, the results of summative assessments are also of great interest to parents, the faculty as a whole, the central administration, the press, and the public at large. Public accountability systems are based on data from summative assessments. If the results of these assessments are reported with reference to standards and individual students, they can be used as diagnostic tools by teachers to plan instruction and can guide the leadership team in developing strategies that help improve student achievement.
Examples of summative assessment: PSSA and Terra Nova. School Talk Video: Data Driven Instructional Practices. School Improvement Process Tools: Pennsylvania Adequate Yearly Progress; Getting Results! (Gen. 4) Workbook; Pennsylvania Value-Added Assessment System (PVAAS), for evaluating growth and projecting performance; E-metric PSSA Data Interaction, which lets you create your own reports in tables, graphs, or external files, at the summary or individual student level, by selecting content, statistics, aggregation levels, disaggregated groups or subgroups, and/or score variables. © 2006 Success for All Foundation

8 Summative/Outcome Measures (continued)
Examples PSSA Unit exams Final exams End of year grades Terra Nova © 2006 Success for All Foundation

9 Formative/Progress Monitoring: Conducted at Student's Instructional Level
Purpose: Provides data regarding the impact of instruction on learning; data are used to adapt instructional practices to meet individual student needs.
An integral part of an assessment plan: given frequently to determine student growth; classroom-based assessments; may range from formal instruments to informal observations; results used to shape teaching and student learning (Black and Wiliam, 1998).
Formative assessment: in Pennsylvania we define formative assessment as classroom-based assessment that allows teachers to monitor and adjust their instructional practice in order to meet the individual needs of their students. Formative assessment can consist of formal instruments or informal observations. The key is how the results are used: results should be used to shape teaching and learning. Black and Wiliam (1998) define formative assessment broadly to include the instructional formats teachers use to gather information that, when used diagnostically, alters instructional practice and has a direct impact on student learning and achievement. Under this definition, formative assessment encompasses questioning strategies; active-engagement check-ins (such as response cards, white boards, random selection, think-pair-share, popsicle sticks for open-ended questions, and numbered heads); and analysis of student work, including homework and tests, against set rubrics and standards. Assessments are formative when the information is used to adapt instructional practices to meet individual student needs and to give individual students corrective feedback that allows them to reach set goals and targets. Ongoing formative assessment is an integral part of effective instructional routines, providing teachers with the information they need to differentiate and adjust instruction to meet the needs of individual students.
When teachers know how students are progressing and where they are having trouble, they can use this information to make necessary instructional adjustments, such as re-teaching, trying alternative instructional approaches, or offering more opportunities for practice. The use of ongoing formative classroom assessment data is an imperative. Effective teachers seamlessly integrate formative assessment strategies into their daily instructional routines. Dylan Wiliam: PDE School Talk-Smart Investments © 2006 Success for All Foundation

10 Formative/Progress Monitoring (continued)
Examples: Response Cards, White Boards, Homework; Curriculum-based measurement (Fuchs and Fuchs; Deno and Mirkin; etc.); Dynamic Indicators of Basic Early Literacy Skills (DIBELS); AIMSweb; Edcheckup
© 2006 Success for All Foundation

11 Diagnostic Purpose An integral part of an assessment plan
Purpose: Identifies a student's strengths, weaknesses, knowledge, and skills prior to instruction; provides additional information to plan effective instruction and targets for intervention.
An integral part of an assessment plan: administered infrequently; used to supplement information from screening, formative/progress monitoring, and summative/outcome measures.
Diagnostic information can be gained from diagnostic tests but also from other sources such as observations, response to instruction, etc. It might be used to fill in the gaps, determining which letter sounds, specific math skills, etc. are known and unknown, or to confirm or clarify the results of a screening or benchmark test. It may be helpful if a child is new to the school and longitudinal information is unavailable; if an intervention is not working; to identify specific skill areas that need work; or as follow-up to a screening or benchmark assessment to identify target areas.
Diagnostic assessments: the purpose of diagnostic assessment is to ascertain, prior to instruction, each student's strengths, weaknesses, knowledge, and skills. Establishing these permits the instructor to remediate students and adjust the curriculum to meet each pupil's unique needs. Examples of diagnostic assessments: DRAs, Running Records, GRADE, GMADE © 2006 Success for All Foundation

12 Diagnostic Tests (continued)
Examples: GMADE, GRADE DRAs Running Records Cognitive Ability Scales (WISC, Stanford-Binet, etc.) Achievement Tests (WIAT, Woodcock-Johnson, etc.) Brigance Diagnostic Inventory or Basic Skills Comprehensive Test of Phonological Processing © 2006 Success for All Foundation

13 Benchmark Assessment Purpose An integral part of an assessment plan
Purpose: Provides feedback on student progress towards proficiency on grade-level standards; measures concepts, skills, and applications (Anchors); data inform instructional decision-making.
An integral part of an assessment plan: administered periodically during the school year; standards-based assessments with strong reliability and validity.
Benchmark assessments are designed to provide feedback to both the teacher and the student about how the student is progressing towards demonstrating proficiency on grade-level standards. Well-designed benchmark and standards-based assessments: measure the degree to which students have mastered a given concept; measure concepts, skills, and/or applications; are reported by referencing the standards, not other students' performance; serve as a test to which teachers want to teach; and measure performance regularly, not only at a single moment in time. Examples of benchmark assessments: 4Sight, Riverside 9-12, DIBELS. Pennsylvania's benchmark assessment is 4Sight. © 2006 Success for All Foundation

14 Benchmark Assessments
Examples 4Sight Benchmark Assessments Riverside Publishing Dynamic Indicators of Basic Early Literacy Skills (DIBELS) © 2006 Success for All Foundation

15 School Structures for Data-Informed Decision Making
District-Level Support (Budgetary Support, Professional Development, Resources and Time)
Annual Building-wide Planning Process. Focus: All Students. Who: School-wide Team. How: Data Retreat, School Planning Process.
Demographic/Perceptual/Process Data (School Level): School Demographics, Discipline Data, Attendance Data, Mobility Rate.
Student Learning Data (School Level): PSSA & PVAAS, Standardized Assessments, District End-of-Year Tests, Final Benchmark Test.
Periodic Grade Level Planning Process. Focus: Groups of Students. Who: Teacher Teams. How: Regular 1-2 hour meetings.
Demographic/Perceptual/Process Data (Grade/Course Level): Class Demographics, Class Engagement Data, Satisfaction Data, Attendance Data.
Student Learning Data (Grade/Course Level): Initial: PSSA/PVAAS/final tests at class/subgroup levels. Cyclical: 4Sight benchmark data at grade level, district quarterly assessments, common classroom data.
Student Planning Process. Focus: Classroom of Students. Who: Teacher.
Student Learning Data (Classroom Level): Initial: PSSA/PVAAS/finals at student level. Cyclical: 4Sight benchmark data at student level. Continuous: individual classroom assessments, progress monitoring.
Demographic/Perceptual/Process Data (Classroom Level): Qualitative Data, Student Historical Information, Student Medical Information, Student Learning Information.
The Student Planning Process reflects the practices of excellent teachers, who continually monitor and adjust instruction on a daily basis with their children. In addition, teachers will have information from the Monthly Planning Process meetings that will help to provide a focus for the delivery of their individual instructional plans with their own students. At this level, individual teachers are encouraged to document impressions and collect data that can be presented at the Monthly Planning Meetings to help provide insight into the progress of the instructional plan and to assist in formulating new plans and emphases for the next instructional cycle. Emphasize the two-sided arrows and how the data and conclusions flow between the levels on a routine basis; this communication flow enhances the effectiveness and impact of each planning process. © 2006 Success for All Foundation

16 Assessment Activity Review list created earlier.
Review or revise the purpose of each assessment. © 2006 Success for All Foundation

17 What are 4Sight Benchmark Assessments?
Where do they fit? How do they fit? © 2006 Success for All Foundation

18 4Sight Benchmarks What do these items have in common?
The dipstick under the hood of a car A blood pressure cuff A weather thermometer © 2006 Success for All Foundation

19 4Sight Benchmarks
Another tool that can help direct our actions is 4Sight Benchmarks. © 2006 Success for All Foundation

20 Alignment of 4Sight with State Expectations
How do the 4Sight Benchmarks mirror the look and feel of the PSSA assessment? © 2006 Success for All Foundation

21 Alignment of 4Sight with State Expectations
Particularly note: Lengths of passages Types of passages Types of questions (what skill or strategy is being addressed) Types of question stems/or the format of the question Types of problems © 2006 Success for All Foundation

22 PSSA Student Reports for Reading Include:
Student reports include: a) Scale score b) Performance level: advanced, proficient, basic, below basic c) Reporting categories: Comprehension and Reading Skills (standards 1.1 and 1.2) Interpretation and Analysis of Fiction and Nonfiction Text (standards 1.1, 1.2, and 1.3) © 2006 Success for All Foundation

23 PSSA Reading Blueprint 2007
Reporting Category / Gr. 3 / Gr. 4 / Gr. 5 / Gr. 6 / Gr. 7 / Gr. 8 / Gr. 11
Comprehension and Reading Skills (1.1 Learning to Read Independently; 1.2 Reading Critically in All Content Areas): % / 60–80% / 60–80% / 50–70% / 40–60% / %
Interpretation and Analysis of Fiction and Nonfiction Text (1.3 Reading, Analyzing and Interpreting Literature): 15–35% / 20–40% / 30–50%
Source: © 2006 Success for All Foundation

24 PSSA Student Reports for Math Include:
Student reports include: a) Scale score b) Performance level: advanced, proficient, basic, below basic c) Reporting categories: Numbers and Operations (standards 2.1 and 2.2) Measurement (standard 2.3) Geometry (standards 2.9 and 2.10) Algebraic Concepts (standard 2.8) Data Analysis and Probability (standards 2.6 and 2.7) © 2006 Success for All Foundation

25 PSSA Math Blueprint 2007
Reporting Category / Gr. 3 / Gr. 4 / Gr. 5 / Gr. 6 / Gr. 7 / Gr. 8 / Gr. 11
Numbers and Operations: 40–50% / 43–47% / 41–45% / 28–32% / 20–24% / 18–22% / %
Measurement: 12–15%
Geometry: 15–20% / 12–18%
Algebraic Concepts: 13–17% / 20–27% / 25–30% / 38–42%
Data Analysis and Probability
Source: Blueprint © 2006 Success for All Foundation

26 Data Provided by 4Sight Estimate of student performance on the PSSA
Advanced, Proficient, Basic, Below Basic Scaled Scores Subscale data Reporting Categories © 2006 Success for All Foundation

27 Performance Level Report
© 2006 Success for All Foundation

28 Reading Subscale Data Subscale Information Provided on 4Sight:
Comprehension and Reading Skills Interpretation and Analysis of Fiction and Nonfiction Text 1.1 Reading Independently 1.2 Reading Critically in All Content Areas 1.3 Reading Analyzing and Interpreting Literature Open-Response Items © 2006 Success for All Foundation

29 Reading Subscale Report
© 2006 Success for All Foundation

30 Mathematics Subscale Data
Subscale Information Provided on 4Sight: © 2006 Success for All Foundation

31 Mathematics Subscale Report
ADD SCREEN SHOT OF SUBSCALE REPORT SHOWING REPORTING CATEGORIES © 2006 Success for All Foundation

32 Administering 4Sight Benchmarks
Who takes 4Sight Benchmarks? Students in grades 3–11 take an on-grade-level assessment. Use the same modifications/exemptions that are used for the PSSA. © 2006 Success for All Foundation

33 Administering 4Sight Benchmarks
Who, What, When, Where, How Long? 60-minute testing period per subject Homeroom or other grade-level groups Grades 3-8: 5 tests; Grades 9-11: 4 tests Baseline and quarterly as needed School sets common date and time(s) Directions in Administration and Scoring Guide © 2006 Success for All Foundation

34 Administering 4Sight Benchmarks
Why 60 minutes per test? Correlations; test validity; pilot. Follow the IEP for students with disabilities (differs from PSSA). © 2006 Success for All Foundation

35 4Sight To-Do List: Preparing Staff to Administer 4Sight
Overview: What is it? Why are we doing it? Who takes it? What does it mean for me? Schoolwide testing and scoring plan Materials: test booklets, answer sheets/teacher score sheets, scoring masks, administration and scoring guide © 2006 Success for All Foundation

36 4Sight To-Do List: Developing Your School’s 4Sight Administration Plan
Staff 4Sight orientation: set date and draft agenda. Select dates for first benchmark administration. Decide who will administer the test such as homeroom teacher, etc. Decide who will distribute 4Sight materials. © 2006 Success for All Foundation

37 Scoring the 4Sight Benchmarks
Options for Administering & Scoring 4Sight Benchmark Assessments Hand-Scoring Scanning On-line School must score open-response items for all options. © 2006 Success for All Foundation

38 Scoring the 4Sight Benchmarks
HAND-SCORING ACTIVITY: Scoring Multiple-Choice Items Understanding Source of Subscales Understanding Calculation of Proficiency © 2006 Success for All Foundation

39 Scoring Open-Response Items
Reading & Mathematics The 3-point rubric used for PSSA reading is also used with 4Sight. The 4-point rubric used for PSSA math open-response items is also used for 4Sight. Each open-response item has item-specific examples to assist scoring, just as the PSSA. Plan for establishing inter-rater reliability. Transfer open-response item scores to answer sheet. © 2006 Success for All Foundation

40 Inter-Rater Reliability
Inter-rater reliability is established when different judges or teachers rate the same performance or open-ended response items the same way. © 2006 Success for All Foundation

41 Establishing Inter-Rater Reliability
Inter-rater reliability increases when all raters have been trained in the use of a common rubric. Inter-rater reliability increases when two or more people score each response. At a minimum, if only one rater is available, teachers should not score their own students’ tests. © 2006 Success for All Foundation
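One common, simple way to check inter-rater reliability after a scoring session is percent agreement: the fraction of items on which two raters assigned the same rubric score. This sketch is illustrative only and is not part of the 4Sight materials; the function name and the sample scores are hypothetical.

```python
# Illustrative sketch: percent agreement between two raters scoring
# the same set of open-response items on a 0-3 reading rubric.

def percent_agreement(rater_a, rater_b):
    """Return the fraction of items on which two raters agree exactly."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same set of items.")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical example: two raters score the same ten student responses.
rater_a = [3, 2, 2, 1, 0, 3, 2, 1, 2, 3]
rater_b = [3, 2, 1, 1, 0, 3, 2, 2, 2, 3]
print(percent_agreement(rater_a, rater_b))  # 0.8
```

A result well below 1.0 on a sample of double-scored tests would signal the need to revisit the common rubric training described above.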

42 Scoring the 4Sight Benchmarks
ACTIVITY: Establishing inter-rater reliability WHY? © 2006 Success for All Foundation

43 4Sight To-Do List: Developing a Scoring Plan
Who will score the test items? Multiple choice Open-response (Is training needed? Will scoring be done in teams? Etc.) Timeline Who will collect and organize the data? - Scoring, Scanning & Entering © 2006 Success for All Foundation

44 How do we get the data, now that we have scored the assessments?
How do we analyze and interpret data in PA's Standards Aligned System? © 2006 Success for All Foundation

45 Steps for Data Analysis, Discovery, & Solution Identification
Data-Informed Decision-Making Process: DATA, ANALYSIS, DISCOVERY, SOLUTIONS
Steps for Data Analysis, Discovery, & Solution Identification
Data Analysis: Identify Proximity to Goal(s). Identify Areas of Concern. Develop Target(s).
Discovery: Brainstorm Root Causes. Prioritize and verify Root Causes with data. Select the Root Causes that are most important and doable; keep the number low.
Solutions: Design interventions to address identified Root Causes. Evaluate results and readjust. © 2006 Success for All Foundation

46 Data Analysis Identify Proximity to Goal(s)
Our Goals for AYP 2008 have been set: Reading: 63% Proficient Mathematics: 56% Proficient © 2006 Success for All Foundation

47 AYP Plan for Pennsylvania
Step one of the data-informed decision-making framework is identifying goals; our goals have already been set for us by AYP. Pennsylvania Consolidated State Application Accountability Workbook, May 1, 2003; revised May 30, 2003; revised May 8, 2004 (revised sections: 3.2, 5.4, 7.1, 7.2, 9.1, 9.2, and 10.1), for State Grants under Title IX, Part C, Section 9302 of the Elementary and Secondary Education Act (Public Law ). Available at: © 2006 Success for All Foundation

48 AYP Variables Which students count for AYP?
What are the identified subgroups? What is the minimum number of students for a subgroup to be included in AYP calculations? What is the date for determining which students are continuously enrolled? What is the participation rate? © 2006 Success for All Foundation

49 Calculating AYP in PA AYP Subgroups:
Number of students per subgroup for reporting purposes: 10. Number of students per subgroup for AYP calculation purposes: 40.
AYP subgroups: 1. White 2. Black 3. Hispanic/Latino 4. Asian/Pacific Islander 5. Native American 6. Multicultural 7. Low SES 8. ELL 9. Special Education
Continuous Enrollment Date: October 1. Participation Rate Required: 95%. © 2006 Success for All Foundation

50 Calculating AYP in PA Continuous Enrollment Date: October 1
Go into the MC to show the projected proficiency report and talk about it. Do not let participants go to this report.
Continuous Enrollment Date: October 1. Participation Rate Required: 95% overall and for each subgroup.
Academic Measure: % Proficient, 2006–07 → 2007–08; Safe Harbor
Reading PSSA: 54% → 63%; Safe Harbor: 10% less Basic and Below Basic
Math PSSA: 45% → 56% © 2006 Success for All Foundation
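The AYP variables summarized on the last few slides (minimum subgroup size of 40, required 95% participation, the proficiency target, and safe harbor) can be sketched as a simple decision rule. This is a hypothetical simplification for discussion, not PDE's actual AYP calculation; the function name and the 10%-reduction reading of safe harbor are assumptions drawn from the slide text.

```python
# Hypothetical sketch of the AYP rules summarized above; the real PDE
# calculation includes additional conditions not covered here.

MIN_SUBGROUP_N = 40        # minimum students for a subgroup to count
MIN_PARTICIPATION = 0.95   # required participation rate

def meets_ayp(n_students, participation, pct_proficient, target,
              prior_pct_nonproficient=None):
    """True if a subgroup meets the proficiency target, is too small
    to count, or qualifies via safe harbor."""
    if n_students < MIN_SUBGROUP_N:
        return True                      # subgroup not counted
    if participation < MIN_PARTICIPATION:
        return False                     # participation rate not met
    if pct_proficient >= target:
        return True
    # Safe harbor: the non-proficient (Basic and Below Basic) share
    # fell by at least 10 percent relative to the prior year.
    if prior_pct_nonproficient is not None:
        return (1.0 - pct_proficient) <= 0.90 * prior_pct_nonproficient
    return False

# 2007-08 reading target: 63% proficient. A subgroup at 58% proficient
# can still qualify via safe harbor if non-proficiency dropped enough.
print(meets_ayp(120, 0.97, 0.58, 0.63, prior_pct_nonproficient=0.50))  # True
```

Walking through a few such cases with staff can make the subgroup-size and participation rules concrete before looking at the Member Center reports.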

51 4Sight AYP Reports RP 3533: Preliminary AYP Counts
RP 3379: Projections by Grade RP 3434: 4Sight Proficiency Projections © 2006 Success for All Foundation

52 RP 3533: Preliminary AYP Projections
© 2006 Success for All Foundation

53 RP 3379: AYP Projections by Grade
© 2006 Success for All Foundation

54 4Sight Member Center
On-line tool to capture 4Sight data. © 2006 Success for All Foundation

55 To get a User ID & Password: https://members.successforall.org
© 2006 Success for All Foundation

56 Member Center Allows schools to: sort and analyze data by:
whole school; grade level; homeroom and/or class sections and periods; reading group; individual student; and track data over time to assess progress. © 2006 Success for All Foundation

57 Setting Up Your School in the Member Center
Member Center web address: Demo Site: Help Line: (800) , ext. 2563. Participants use the provided Demo Account information to get on the MC with the trainer leading. © 2006 Success for All Foundation

58 Member Center Home Page
Check the News section of the Member Center every time you log into the site – this is where we will communicate important information about the Member Center and 4Sight to our member schools. You can change your logon and password via Profile Settings. See QSG page 4. © 2006 Success for All Foundation

59 4Sight – Two Types of Data
Proficiency Levels Subscale Data © 2006 Success for All Foundation

60 Data Dialogues Tertiary Questions: Please note the addition of the term "Discovery" to match the PA Data-Informed Decision-Making Process. © 2006 Success for All Foundation

61 QuADS/Primary Questions
OH 40: QuADS/Primary Questions. Purpose: Explanation of Primary Questions. Key Point: What do we want to know from our data points? Content: Emphasize essential goals; easy questions; answers are easy. Primary questions are those that closely match your most essential goals. They tend to be very, very simple. © 2006 Success for All Foundation

62 Steps for Data Analysis, Discovery, and Solution Identification
OH 16: Steps for Selecting and Addressing Root Causes
Data Analysis: Identify Proximity to Goal(s). Identify Areas of Concern. Develop Target(s).
Discovery: Brainstorm Root Causes. Prioritize and verify Root Causes with data. Select the Root Causes that are most important and doable; keep the number low.
Solutions: Design interventions to address identified Root Causes. Evaluate results and readjust. © 2006 Success for All Foundation

63 Analyzing Proficiency Reports Primary Questions
How many students are proficient in reading/math? How many students in this subgroup are proficient? How many students do you need in that subgroup to make AYP? How many more quarters until the PSSA is given? Who are the students just below proficiency? How many points do these students need to move up to be proficient? How many points per quarter?
Trainer will review all Primary Questions answered with reports, model going to the MC demo, and model accessing these reports on the slide. Then, using the reports participants brought or retrieved in the MC, the trainer will have them answer the Primary Questions on the activity sheet "Data Analysis Questions." © 2006 Success for All Foundation

64 Member Center Primary Question Reports
Under Reports – School Reports: RP 3434: 4Sight Proficiency Projections RP 3869: Test Results Chart & Graph (performance levels) RP 3508: Predicted State Results Under Testing Center: Proficiency Report © 2006 Success for All Foundation

65 RP 3434: 4Sight Proficiency Projections
Begin Member Center practice with RP 3434. © 2006 Success for All Foundation

66 Primary Questions How many students are proficient school-wide in Reading? Who are the students performing at the Basic Level in Reading? © 2006 Success for All Foundation

67 Activity: Primary Questions
How many students are proficient school-wide in Mathematics? Who are the students performing at the Basic Level in Mathematics? © 2006 Success for All Foundation

68 RP 3869: Test Results Chart and Graph
Snag it from MC © 2006 Success for All Foundation

69 Proficiency Reports Demonstrate through Testing Center
© 2006 Success for All Foundation

70 RP 3508: Predicted State Results
© 2006 Success for All Foundation

71 Primary Question Tool: Administration & Scoring Guide
4Sight Pennsylvania Benchmark Answer and Alignment Guide: Estimation of Student Performance on PSSA chart © 2006 Success for All Foundation

72 Estimation of Student Performance on PSSA
PA Reading Grade 5, Numbers 1, 3, and 5: Estimation of Student Performance on PSSA
4Sight Total Score → PSSA Performance Level Score:
5 → 832; 6 → 865; 7 → 898; 8 → 931; 9 → 964; 10 → 996; 11 → 1029; 12 → 1062; 13 → 1095; 14 → 1128; 15 → 1161; 16 → 1194; 17 → 1226; 18 → 1259; 19 → 1292; 20 → 1325; 21 → 1358
Performance Level Score Ranges for PSSA Performance Standards, Grade 5:
Advanced: 1497 and above. Proficient. Basic. Below Basic: 1136 and below. © 2006 Success for All Foundation
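The conversion chart on this slide is a straightforward lookup from a 4Sight raw score to an estimated PSSA scaled score. As an illustrative sketch only (not an official tool), it could be encoded as follows; note that the Proficient and Basic cut scores are not shown on the slide, so the middle band is deliberately left undetermined, and the function name is hypothetical.

```python
# Illustrative lookup based on the Grade 5 reading conversion chart
# (tests 1, 3, and 5). Only the two cut scores shown on the slide
# (Advanced >= 1497, Below Basic <= 1136) are used.

RAW_TO_SCALED = {
    5: 832, 6: 865, 7: 898, 8: 931, 9: 964, 10: 996, 11: 1029,
    12: 1062, 13: 1095, 14: 1128, 15: 1161, 16: 1194, 17: 1226,
    18: 1259, 19: 1292, 20: 1325, 21: 1358,
}

def estimate_performance(raw_score):
    """Map a 4Sight raw score to an estimated PSSA scaled score and band."""
    scaled = RAW_TO_SCALED[raw_score]
    if scaled <= 1136:
        band = "Below Basic"
    elif scaled >= 1497:
        band = "Advanced"
    else:
        band = "Basic or Proficient (cut score not shown on slide)"
    return scaled, band

print(estimate_performance(14))  # (1128, 'Below Basic')
print(estimate_performance(15))  # raw 15 crosses above the Below Basic line
```

A chart like this is what lets teams identify "students just below proficiency" and estimate how many points per quarter they need to gain.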

73 Data Dialogues Tertiary Questions: Please note the addition of the term "Discovery" to match the PA Data-Informed Decision-Making Process. © 2006 Success for All Foundation

74 QuADs/Secondary Questions
OH 59: QuADs/Secondary Questions. Purpose: To introduce and discuss Secondary Questions. Content: Participants/schools will begin to review Subscale Data. Secondary questions uncover strengths and weaknesses in student learning. They tend to be more specific. © 2006 Success for All Foundation

75 Steps for Data Analysis, Discovery, and Solution Identification
OH 16: Steps for Selecting and Addressing Root Causes
Data Analysis: Identify Proximity to Goal(s). Identify Areas of Concern. Develop Target(s).
Discovery: Brainstorm Root Causes. Prioritize and verify Root Causes with data. Select the Root Causes that are most important and doable; keep the number low.
Solutions: Design interventions to address identified Root Causes. Evaluate results and readjust. © 2006 Success for All Foundation

76 Analyzing Subscale Data Secondary Questions
What are the strengths identified on the graph? What are the areas of concern? If the graph is flat, are the students proficient readers or close to nonreaders?
Trainer will review all Secondary Questions answered with reports and will model going to the MC demo and accessing these reports on the slide. Go up through developing a reader. © 2006 Success for All Foundation

77 Member Center Secondary Question Reports
Under Reports – School Reports: RP 3304: Subscale Test Results RP 3861: Subscale Averages Graph Under Testing Center: Subscale Test Results © 2006 Success for All Foundation

78 Subscale Reports © 2006 Success for All Foundation

79 4Sight Data Analysis © 2006 Success for All Foundation

80 RP 3861 Subscale Averages Graph
Snag it from MC © 2006 Success for All Foundation

81 RP 3304: Subscale Test Results
© 2006 Success for All Foundation

82 ACTIVITY ADD ACTIVITY – 3 QUESTIONS HERE…
© 2006 Success for All Foundation

83 Steps for Data Analysis, Discovery, and Solution Identification
OH 16: Steps for Selecting and Addressing Root Causes
Data Analysis: Identify Proximity to Goal(s). Identify Areas of Concern. Develop Target(s).
Discovery: Brainstorm Root Causes. Prioritize and verify Root Causes with data. Select the Root Causes that are most important and doable; keep the number low.
Solutions: Design interventions to address identified Root Causes. Evaluate results and readjust. © 2006 Success for All Foundation

84 Member Center Secondary Question Tools
4Sight Pennsylvania Benchmark Reporting Categories: (Admin. Guide – PA standards condensed version) 4Sight Pennsylvania Benchmark Answer and Alignment Guide: (Admin. Guide – Reporting Category, PA Standard, Assessment Anchor) RP 3746: 4Sight Item Analysis by Subscale: (Under Reports – School Reports) © 2006 Success for All Foundation

85 Reporting Categories © 2006 Success for All Foundation

86 4Sight Scale Reporting Category
Pennsylvania 4Sight Reading Benchmark: Grade 5, Number 1. Answer and Alignment Guide
Columns: Item No.; 4Sight Scale Reporting Category; PA Reporting Category; PA Standard; Assessment Anchor; Descriptor; Correct Answer (MC)
Item 1: Interpretation & Analysis; Reading Critically 1.2.5.A; R5.B.3.3.2; Headings; Correct Answer: C
Item 2: Comprehension & Reading Skills; R5.A.2.4.1; Supporting details; Correct Answer: B
Item 3: R5.B.3.1.1; Fact/opinion; Correct Answer: D
Item 4: Reading Independently 1.1.5.F; R5.A.2.1.1; Multiple meanings
Item 5: Correct Answer: A
Item 6: 1.1.5.A; R5.A.2.6.1; Author's purpose
Item 7: R5.A.2.5.1; Main idea © 2006 Success for All Foundation

87 RP 3746: Item Analysis by Subscale – Snag it from MC
© 2006 Success for All Foundation

88 4Sight Reading and Math Benchmark Assessment Initial Set-up Checklist
4Sight To-Do List: 4Sight Reading and Math Benchmark Assessment Initial Set-up Checklist, School Year
We placed an order for 4Sight on-line for each building using 4Sight.
We decided how 4Sight will be scored (hand-scored, scanned, on-line) and ordered the scanner, if necessary (having reviewed the tech requirements and ordered scanning software from the Member Center if scanning, or ext. 2563).
We identified a main school district and building level contact for 4Sight.
We have registered our district and schools in the 4Sight Member Center, including assigning teachers log-in information and appropriate security-level access.
We have input student demographic information in the 4Sight Member Center.
We have developed a training plan to introduce 4Sight to the staff and explain its design, purpose, administration, scoring, and how it fits in existing school structures.
We have completed the plan for receiving the shipped 4Sight assessments in our buildings throughout the school year (see sample tracking log).
We identified a testing plan and testing schedule for 4Sight (EAP districts refer to the testing schedule in the EAP Guidelines): Will the entire staff be involved? How many times per year will we administer 4Sight? When will 4Sight be administered? Will we set a common time across the building for 4Sight administration? How will we address the needs of students with accommodations and modifications in their IEPs or 504 Plans? Will students be placed in the same groups for 4Sight administration as they are for PSSA?
We have developed a training plan for analysis of 4Sight (and other relevant) data that follows the 4Sight testing schedule so staff has the opportunity to analyze and respond to the data.
We have a plan for scoring 4Sight, including the open-ended response items (and scoring training, including inter-rater reliability).
We have determined a plan for sharing these data with students, parents, etc., as necessary. © 2006 Success for All Foundation

89 III. What 4Sight is and is not…
4Sight is a low-stakes assessment; it is not a high-stakes assessment.
4Sight is administered on grade level; it is not administered on instructional level (unless the student is on grade level).
4Sight is an assessment that gives information on progress towards proficiency on the PSSA; it is not a diagnostic assessment that gives information on specific skills.
4Sight is administered 3-5 times a year as a snapshot; it is not administered regularly as a progress-monitoring tool for students performing below grade level.
Keep but move until later © 2006 Success for All Foundation

