4Sight Benchmark Assessments: A Data Tool in Pennsylvania's Standards Aligned System. Summer 2008. © 2006 Success for All Foundation. ZZ4146
Initial Training Objectives: Understand the design and purpose of 4Sight Reading and Math Benchmark Assessments as part of the Standards-Aligned System. Facilitate the administration of 4Sight Reading and Math Benchmark Assessments. Facilitate the scoring of 4Sight Reading and Math Benchmark Assessments. Begin to develop plans to prepare staff to administer, score, and utilize 4Sight. Generate and read proficiency and subscale reports in the Member Center.
Pennsylvania's Standards-Aligned System, April 2008
Assessment Activity: Brainstorm and list all of the assessments you administer (district, building, grade level, classroom).
Fundamental to ensuring the usefulness of an assessment is: I. understanding the purpose for which the assessment was developed; II. understanding the purpose for which it is being administered; and III. understanding how the data are used and fit into school structures.
Types of Assessment in the Standards Aligned System. The assessment was developed as: Summative/Outcome: "Reaching Our Goals"; Formative/Progress Monitoring: "Growth Charts"; Diagnostic: "In-depth View"; Benchmark: "Progress Towards Standards". http://www.portal.state.pa.us/portal/server.pt?open=512&objID=3188&&level=1&css=L1&mode=2
Summative/Outcome. Purpose: reports how a student compares to a certain standard; supports state or local accountability; helps draw conclusions regarding groups of students; cannot inform current instructional practices. An integral part of an assessment plan. Occurs at the end of a school level, grade, or course, or at certain grades. Provides outcome data.
Summative/Outcome Measures (continued). Examples: PSSA, unit exams, final exams, end-of-year grades, TerraNova.
Formative/Progress Monitoring (conducted at the student's instructional level). Purpose: provides data regarding the impact of instruction on learning; data are used to adapt instructional practices to meet individual student needs. An integral part of an assessment plan. Given frequently to determine student growth. Classroom-based assessments that may range from formal instruments to informal observations. Results are used to shape teaching and student learning (Black and Wiliam, 1998).
Formative/Progress Monitoring (continued). Examples: response cards, whiteboards, homework; curriculum-based measurement (Fuchs and Fuchs; Deno and Mirkin; etc.); Dynamic Indicators of Basic Early Literacy Skills (DIBELS); AIMSweb; Edcheckup.
Diagnostic. Purpose: identifies a student's strengths, weaknesses, knowledge, and skills prior to instruction; provides additional information to plan effective instruction and targets for intervention. An integral part of an assessment plan. Administered infrequently. Used to supplement information from screening, formative/progress monitoring, and summative/outcome measures.
Diagnostic Tests (continued). Examples: GMADE, GRADE; DRAs; running records; cognitive ability scales (WISC, Stanford-Binet, etc.); achievement tests (WIAT, Woodcock-Johnson, etc.); Brigance Diagnostic Inventory of Basic Skills; Comprehensive Test of Phonological Processing.
Benchmark Assessment. Purpose: provides feedback on student progress towards proficiency on grade-level standards; measures concepts, skills, and applications (Anchors); data inform instructional decision-making. An integral part of an assessment plan. Administered periodically during the school year. Standards-based assessments with strong reliability and validity.
Benchmark Assessments. Examples: 4Sight Benchmark Assessments; Riverside Publishing; Dynamic Indicators of Basic Early Literacy Skills (DIBELS).
School Structures for Data-Informed Decision Making.
Student Planning Process. Focus: classroom of students. Who: teacher.
Periodic Grade-Level Planning Process. Focus: groups of students. Who: teacher teams. How: regular 1–2 hour meetings.
Annual Building-wide Planning Process. Focus: all students. Who: school-wide team. How: data retreat, school planning process.
District-Level Support: budgetary support, professional development, resources and time.
Student Learning Data:
- School level: PSSA & PVAAS, standardized assessments, district end-of-year tests, final benchmark test.
- Classroom level: initial: PSSA/PVAAS/finals at the student level; cyclical: 4Sight benchmark data at the student level; continuous: individual classroom assessments, progress monitoring.
- Grade/course level: initial: PSSA/PVAAS/final tests at class/subgroup levels; cyclical: 4Sight benchmark data at the grade level, district quarterly assessments, common classroom data.
Demographic/Perceptual/Process Data:
- School level: school demographics, discipline data, attendance data, mobility rate.
- Grade/course level: class demographics, class engagement data, satisfaction data, attendance data.
- Classroom level: qualitative data, student historical information, student medical information, student learning information.
Assessment Activity: Review the list created earlier. Review or revise the purpose of each assessment.
What are 4Sight Benchmark Assessments? Where do they fit? How do they fit?
What do these items have in common? The dipstick under the hood of a car; a blood pressure cuff; a weather thermometer; 4Sight Benchmarks.
4Sight Benchmarks: another tool that can help direct our actions.
Alignment of 4Sight with State Expectations: How do the 4Sight Benchmarks mirror the look and feel of the PSSA assessment?
Alignment of 4Sight with State Expectations. Particularly note: lengths of passages; types of passages; types of questions (what skill or strategy is being addressed); types of question stems/format of the question; types of problems.
PSSA student reports for reading include: a) scale score; b) performance level: advanced, proficient, basic, below basic; c) reporting categories: Comprehension and Reading Skills (standards 1.1 and 1.2); Interpretation and Analysis of Fiction and Nonfiction Text (standards 1.1, 1.2, and 1.3).
PSSA Reading Blueprint 2007 (percentage of test by grade; grades sharing a range are grouped):
Comprehension and Reading Skills (1.1 Learning to Read Independently; 1.2 Reading Critically in All Content Areas): Gr. 3: 65–85%; Gr. 4–5: 60–80%; Gr. 6: 50–70%; Gr. 7, 8, 11: 40–60%.
Interpretation and Analysis of Fiction and Nonfiction Text (1.1 Learning to Read Independently; 1.2 Reading Critically in All Content Areas; 1.3 Reading, Analyzing and Interpreting Literature): Gr. 3: 15–35%; Gr. 4–5: 20–40%; Gr. 6: 30–50%; Gr. 7, 8, 11: 40–60%.
PSSA student reports for math include: a) scale score; b) performance level: advanced, proficient, basic, below basic; c) reporting categories: Numbers and Operations (standards 2.1 and 2.2); Measurement (standard 2.3); Geometry (standards 2.9 and 2.10); Algebraic Concepts (standard 2.8); Data Analysis and Probability (standards 2.6 and 2.7).
PSSA Math Blueprint 2007 (percentage of test; ranges listed in grade order, Gr. 3 through Gr. 11):
Numbers and Operations: Gr. 3: 40–50%; Gr. 4: 43–47%; Gr. 5: 41–45%; Gr. 6: 28–32%; Gr. 7: 20–24%; Gr. 8: 18–22%; Gr. 11: 12–15%.
Measurement: 12–15%.
Geometry: 12–15%; 15–20%; 12–18%.
Algebraic Concepts: 12–15%; 13–17%; 15–20%; 20–27%; 25–30%; 38–42%.
Data Analysis and Probability: 12–15%; 15–20%; 12–18%.
Data Provided by 4Sight: 1. Estimate of student performance on the PSSA (Advanced, Proficient, Basic, Below Basic; scaled scores). 2. Subscale data (reporting categories).
Performance Level Report.
Reading Subscale Data. Subscale information provided on 4Sight: Comprehension and Reading Skills; Interpretation and Analysis of Fiction and Nonfiction Text; 1.1 Reading Independently; 1.2 Reading Critically in All Content Areas; 1.3 Reading, Analyzing and Interpreting Literature; open-response items.
Reading Subscale Report.
Mathematics Subscale Data. Subscale information provided on 4Sight:
Mathematics Subscale Report. (Placeholder: add screenshot of subscale report showing reporting categories.)
Administering 4Sight Benchmarks. Who takes 4Sight Benchmarks? Students in grades 3–11 take an on-grade-level assessment. Use the same modifications/exemptions that are used for the PSSA.
Administering 4Sight Benchmarks: Who, What, When, Where, How Long? 60-minute testing period per subject; homeroom or other grade-level groups; grades 3–8: 5 tests; grades 9–11: 4 tests; baseline and quarterly as needed; school sets common date and time(s); directions are in the Administration and Scoring Guide.
Administering 4Sight Benchmarks: Why 60 minutes per test? Correlations; test validity; pilot. Follow the IEP for students with disabilities (this differs from the PSSA).
4Sight To-Do List: Preparing Staff to Administer 4Sight. 1. Overview: What is it? Why are we doing it? Who takes it? What does it mean for me? 2. Schoolwide testing and scoring plan. 3. Materials: test booklets, answer sheets/teacher score sheets, scoring masks, Administration and Scoring Guide.
4Sight To-Do List: Developing Your School's 4Sight Administration Plan. 1. Staff 4Sight orientation: set date and draft agenda. 2. Select dates for the first benchmark administration. 3. Decide who will administer the test (homeroom teacher, etc.). 4. Decide who will distribute 4Sight materials.
Scoring the 4Sight Benchmarks. Options for administering and scoring 4Sight Benchmark Assessments: 1. Hand-scoring; 2. Scanning; 3. Online. The school must score open-response items for all options.
Scoring the 4Sight Benchmarks. HAND-SCORING ACTIVITY: scoring multiple-choice items. 1. Understanding the source of subscales. 2. Understanding the calculation of proficiency.
Scoring Open-Response Items: Reading & Mathematics. 1. The 3-point rubric used for PSSA reading is also used with 4Sight. 2. The 4-point rubric used for PSSA math open-response items is also used for 4Sight. 3. Each open-response item has item-specific examples to assist scoring, just as on the PSSA. 4. Plan for establishing inter-rater reliability. 5. Transfer open-response item scores to the answer sheet.
Inter-Rater Reliability: inter-rater reliability is established when different judges/teachers rate the same performance or open-ended response the same way.
Establishing Inter-Rater Reliability. Inter-rater reliability increases when all raters have been trained in the use of a common rubric. Inter-rater reliability increases when two or more people score each response. At a minimum, if only one rater is available, teachers should not score their own students' tests.
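The agreement idea above can be checked with simple arithmetic. A minimal sketch in Python, using hypothetical scores; `exact_agreement` is an illustrative helper, not part of 4Sight or the Member Center:

```python
# Checking inter-rater agreement on open-response items scored
# with the 0-3 reading rubric (scores below are hypothetical).
def exact_agreement(rater_a, rater_b):
    """Fraction of responses where two raters assigned the same score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same set of responses")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Two teachers score the same ten student responses.
teacher_1 = [3, 2, 2, 1, 0, 3, 2, 1, 1, 2]
teacher_2 = [3, 2, 1, 1, 0, 3, 2, 2, 1, 2]
print(exact_agreement(teacher_1, teacher_2))  # 0.8
```

A school might set a threshold (for example, agreement below 0.8 triggers re-training on the common rubric) before teams begin scoring independently.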
Scoring the 4Sight Benchmarks. ACTIVITY: Establishing inter-rater reliability. Why?
4Sight To-Do List: Developing a Scoring Plan. Who will score the test items? - Multiple choice. - Open-response (Is training needed? Will scoring be done in teams? Etc.) - Timeline. Who will collect and organize the data? - Scoring, scanning, and entering.
Now that we have scored the assessments, how do we get the data? How do we analyze and interpret data in PA's Standards Aligned System?
Data-Informed Decision-Making Process: Steps for Data Analysis, Discovery, and Solution Identification.
Data Analysis: identify proximity to goal(s); identify areas of concern; develop target(s).
Discovery: brainstorm root causes; prioritize and verify root causes with data; select the root causes that are most important and doable; keep the number low.
Solutions: design interventions to address identified root causes; evaluate results and readjust.
Data Analysis: Identify Proximity to Goal(s). Our goals for AYP 2008 have been set: Reading: 63% proficient; Mathematics: 56% proficient.
AYP Plan for Pennsylvania. Pennsylvania Consolidated State Application Accountability Workbook, May 1, 2003; revised May 30, 2003; revised May 8, 2004 (revised sections: 3.2, 5.4, 7.1, 7.2, 9.1, 9.2, and 10.1), for state grants under Title IX, Part C, Section 9302 of the Elementary and Secondary Education Act (Public Law 107-110). Available at: http://www.ed.gov/admins/lead/account/stateplans03/index.html
AYP Variables: Which students count for AYP? What are the identified subgroups? What is the minimum number of students for a subgroup to be included in AYP calculations? What is the date for determining which students are continuously enrolled? What is the participation rate?
Calculating AYP in PA. AYP subgroups: 1. White; 2. Black; 3. Hispanic/Latino; 4. Asian/Pacific Islander; 5. Native American; 6. Multicultural; 7. Low SES; 8. ELL; 9. Special Education. Number of students per subgroup for reporting purposes: 10. Number of students per subgroup for AYP calculation purposes: 40. Continuous enrollment date: October 1. Participation rate required: 95%.
Calculating AYP in PA. Continuous enrollment date: October 1. Participation rate required: 95% overall and for each subgroup.
Reading PSSA percent-proficient targets: 2006–07: 54%; 2007–08: 63%; 2008–09: 63%. Safe harbor each year: 10% fewer students scoring Basic and Below Basic.
Math PSSA percent-proficient targets: 2006–07: 45%; 2007–08: 56%.
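The target-or-safe-harbor logic above can be sketched as a simple check. This is an illustrative simplification with invented percentages; actual PA AYP determinations also involve participation rates, subgroup minimums, and other provisions not modeled here:

```python
# Simplified AYP check: a group meets AYP if it hits the proficiency
# target, or via safe harbor (a 10% reduction in the share of students
# scoring Basic and Below Basic compared to the prior year).
def makes_ayp(pct_proficient, target, pct_not_prof_now, pct_not_prof_last):
    if pct_proficient >= target:
        return True
    # Safe harbor: non-proficient share shrinks by at least 10%.
    return pct_not_prof_now <= 0.9 * pct_not_prof_last

# 2007-08 reading target is 63% proficient. A subgroup at 58% proficient
# can still make AYP if Basic/Below Basic fell from 42% to 36%.
print(makes_ayp(58, 63, pct_not_prof_now=36, pct_not_prof_last=42))  # True
```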
4Sight AYP Reports: RP 3533: Preliminary AYP Counts; RP 3379: Projections by Grade; RP 3434: 4Sight Proficiency Projections.
RP 3533: Preliminary AYP Projections.
RP 3379: AYP Projections by Grade.
4Sight Member Center: an online tool to capture 4Sight data.
To get a user ID and password: https://members.successforall.org
The Member Center allows schools to sort and analyze data by whole school, grade level, homeroom and/or class sections and periods, reading group, or individual student, and to track data over time to assess progress.
Setting Up Your School in the Member Center. Member Center web address: https://members.successforall.net. Demo site: http://demo.successforall.net. Help line: (800) 548-4998, ext. 2563; firstname.lastname@example.org.
Member Center Home Page: check the News section of the Member Center every time you log into the site; this is where we communicate important information about the Member Center and 4Sight to our member schools. You can change your logon and password via Profile Settings (see QSG page 4).
4Sight provides two types of data: 1. Proficiency levels. 2. Subscale data.
Data Dialogues.
Primary questions are those that closely match your most essential goals. They tend to be very simple.
Steps for Data Analysis, Discovery, and Solution Identification: Data Analysis (identify proximity to goals, areas of concern, targets); Discovery (brainstorm, prioritize, and verify root causes); Solutions (design interventions, evaluate, readjust).
Analyzing Proficiency Reports: Primary Questions.
- How many students are proficient in reading/math?
- How many students in this subgroup are proficient?
- How many students do you need in that subgroup to make AYP?
- How many more quarters until the PSSA is given?
- Who are the students just below proficiency?
- How many points do these students need to move up to be proficient? How many points per quarter?
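The last two questions are simple arithmetic. A minimal sketch, assuming the Grade 5 reading cut score of 1275 for Proficient (from the estimation chart in the Administration & Scoring Guide); the function name is illustrative:

```python
# Growth per benchmark window needed for a student to reach Proficient.
PROFICIENT_CUT = 1275  # Grade 5 reading: Proficient begins at 1275

def points_needed_per_quarter(current_score, quarters_left):
    """Scaled-score growth per quarter needed to reach Proficient by the PSSA."""
    gap = max(0, PROFICIENT_CUT - current_score)
    return gap / quarters_left

# A student scoring 1215 with two benchmark windows left before the PSSA:
print(points_needed_per_quarter(1215, 2))  # 30.0
```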
Member Center Primary Question Reports. Under Reports – School Reports: RP 3434: 4Sight Proficiency Projections; RP 3869: Test Results Chart & Graph (performance levels); RP 3508: Predicted State Results. Under Testing Center: Proficiency Report.
RP 3434: 4Sight Proficiency Projections.
Primary Questions: How many students are proficient school-wide in reading? Who are the students performing at the Basic level in reading?
Activity: Primary Questions. How many students are proficient school-wide in mathematics? Who are the students performing at the Basic level in mathematics?
RP 3869: Test Results Chart and Graph. (Placeholder: screenshot from the Member Center.)
Proficiency Reports.
RP 3508: Predicted State Results.
Primary Question Tool: Administration & Scoring Guide. 4Sight Pennsylvania Benchmark Answer and Alignment Guide: Estimation of Student Performance on PSSA chart.
Estimation of Student Performance on PSSA: PA Reading Grade 5, Numbers 1, 3, and 5.
4Sight total score → PSSA performance level score: 5 → 832; 6 → 865; 7 → 898; 8 → 931; 9 → 964; 10 → 996; 11 → 1029; 12 → 1062; 13 → 1095; 14 → 1128; 15 → 1161; 16 → 1194; 17 → 1226; 18 → 1259; 19 → 1292; 20 → 1325; 21 → 1358.
Performance level score ranges for PSSA performance standards, Grade 5: Advanced: 1497 and above; Proficient: 1275–1496; Basic: 1137–1274; Below Basic: 1136 and below.
Data Dialogues.
Secondary questions uncover strengths and weaknesses in student learning. They tend to be more specific.
Steps for Data Analysis, Discovery, and Solution Identification: Data Analysis (identify proximity to goals, areas of concern, targets); Discovery (brainstorm, prioritize, and verify root causes); Solutions (design interventions, evaluate, readjust).
Analyzing Subscale Data: Secondary Questions.
- What are the strengths identified on the graph?
- What are the areas for concern?
- If the graph is flat, are the students proficient readers or close to nonreaders?
Member Center Secondary Question Reports. Under Reports – School Reports: RP 3304: Subscale Test Results; RP 3861: Subscale Averages Graph. Under Testing Center: Subscale Test Results.
Subscale Reports.
4Sight Data Analysis.
RP 3861: Subscale Averages Graph. (Placeholder: screenshot from the Member Center.)
RP 3304: Subscale Test Results.
ACTIVITY. (Placeholder: add activity with 3 questions here.)
Steps for Data Analysis, Discovery, and Solution Identification: Data Analysis (identify proximity to goals, areas of concern, targets); Discovery (brainstorm, prioritize, and verify root causes); Solutions (design interventions, evaluate, readjust).
Member Center Secondary Question Tools.
- 4Sight Pennsylvania Benchmark Reporting Categories (Admin. Guide; PA standards condensed version).
- 4Sight Pennsylvania Benchmark Answer and Alignment Guide (Admin. Guide; reporting category, PA standard, assessment anchor).
- RP 3746: 4Sight Item Analysis by Subscale (under Reports – School Reports).
Reporting Categories.
Pennsylvania 4Sight Reading Benchmark: Grade 5, Number 1. Answer and Alignment Guide (item number; 4Sight scale reporting category; PA reporting category; PA standard; assessment anchor; descriptor; correct answer):
Item 1: Interpretation & Analysis; Reading Critically; 1.2.5.A; R5.B.3.3.2; Headings; C.
Item 2: Comprehension & Reading Skills; Reading Critically; 1.2.5.A; R5.A.2.4.1; Supporting details; B.
Item 3: Interpretation & Analysis; Reading Critically; 1.2.5.A; R5.B.3.1.1; Fact/opinion; D.
Item 4: Comprehension & Reading Skills; Reading Independently; 1.1.5.F; R5.A.2.1.1; Multiple meanings; C.
Item 5: Comprehension & Reading Skills; Reading Critically; 1.2.5.A; R5.A.2.4.1; Supporting details; A.
Item 6: Comprehension & Reading Skills; Reading Independently; 1.1.5.A; R5.A.2.6.1; Author's purpose; D.
Item 7: Comprehension & Reading Skills; Reading Critically; 1.2.5.A; R5.A.2.5.1; Main idea; A.
RP 3746: Item Analysis by Subscale. (Placeholder: screenshot from the Member Center.)
4Sight To-Do List: 4Sight Reading and Math Benchmark Assessment Initial Set-up Checklist, 2008–2009 School Year.
- We placed an order for 4Sight online for each building using 4Sight.
- We decided how 4Sight will be scored (hand-scored, scanned, online) and, if scanning, reviewed the tech requirements and ordered the scanner and scanning software from the Member Center (or 800.548.4998 ext. 2563).
- We identified a main school district and building-level contact for 4Sight.
- We registered our district and schools in the 4Sight Member Center, including assigning teachers log-in information and appropriate security-level access.
- We input student demographic information in the 4Sight Member Center.
- We developed a training plan to introduce 4Sight to the staff and explain its design, purpose, administration, scoring, and how it fits into existing school structures.
- We completed the plan for receiving the shipped 4Sight assessments in our buildings throughout the school year (see sample tracking log).
- We identified a testing plan and testing schedule for 4Sight (EAP districts: refer to the testing schedule in the EAP Guidelines). Will the entire staff be involved? How many times per year will we administer 4Sight? When will 4Sight be administered? Will we set a common time across the building for 4Sight administration? How will we address the needs of students with accommodations and modifications in their IEPs or 504 Plans? Will students be placed in the same groups for 4Sight administration as they are for the PSSA?
- We developed a training plan for analysis of 4Sight (and other relevant) data that follows the 4Sight testing schedule so staff have the opportunity to analyze and respond to the data.
- We have a plan for scoring 4Sight, including the open-ended response items (and for scoring training, including inter-rater reliability).
- We determined a plan for sharing this data with students, parents, etc. (as necessary).
III. What 4Sight is and is not…
4Sight is a low-stakes assessment; it is not a high-stakes assessment.
4Sight is administered on grade level; it is not administered on instructional level* (*unless the student is on grade level).
4Sight is an assessment that gives information on progress towards proficiency on the PSSA; it is not a diagnostic assessment that gives information on specific skills.
4Sight is administered 3–5 times a year as a snapshot; it is not administered regularly as a progress-monitoring tool for students performing below grade level.