2006-2010 Performance Standards for Community Adult Learning Centers
Donna Cornellier, Massachusetts Department of Education

Performance Measure: Attendance

Definition: Total number of rate-based attended student hours divided by the total number of planned student hours
Exclusions: Non-rate-based classes
Standard: Programs ensure that students attend between 66% and 76% of total planned student hours.
Purpose:
- Promotes effective outreach, screening, orientation, assessment, and placement procedures
- Promotes instruction that is relevant to students' needs
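As a rough illustration of the definition above, a minimal sketch in Python (the record layout and field names such as attended_hours and planned_hours are assumptions, not the actual SMARTT schema):

```python
# A minimal sketch of the Attendance measure: rate-based attended hours
# divided by planned hours across a program's students. Field names are
# illustrative, not the actual SMARTT schema.

def attendance_rate(students):
    """Total attended hours / total planned hours, as a percentage."""
    attended = sum(s["attended_hours"] for s in students)
    planned = sum(s["planned_hours"] for s in students)
    return 100.0 * attended / planned if planned else 0.0

students = [
    {"attended_hours": 90, "planned_hours": 120},
    {"attended_hours": 100, "planned_hours": 130},
]
print(f"Attendance: {attendance_rate(students):.1f}%")  # Attendance: 76.0%
```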

Performance Measure: Attendance Benchmarks

Rating            | Points | Cut Points for Attendance
Advanced          | 3      | 77% and above
Meets Standard    | 2      | 66%-76%
Needs Improvement | 1      | 55%-65%
Remedial Action   | 0      | 54% and below

State Averages in Community Adult Learning Centers: FY %; FY %; FY %; FY %; FY %; FY %

Find this data at SMARTT Cognos ReportNet: Adhoc Reports >> Performance Standards Reports for Community ALC
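The cut points translate directly into a point lookup. A minimal sketch, with thresholds copied from the table above (the function name and the treatment of values between bands are our own):

```python
def attendance_points(rate_pct):
    """Map an attendance percentage to performance points using the
    cut points above."""
    if rate_pct >= 77:
        return 3  # Advanced
    if rate_pct >= 66:
        return 2  # Meets Standard
    if rate_pct >= 55:
        return 1  # Needs Improvement
    return 0      # Remedial Action (54% and below)

print(attendance_points(76.0))  # 2 (Meets Standard)
```

The same 3/2/1/0 pattern appears in the average attended hours and pre/post benchmark tables that follow; only the Learner Gains measure uses a different point scale.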

Performance Measure: Average Attended Hours

Definition: Total number of actual attended student hours divided by the total number of students (attended hours include rate-based, non-rate-based, and volunteer match hours)
Standard: Programs ensure that students attend between 117 and 131 average attended hours per year.
Purpose:
- Promotes student persistence and learner gain
- Measures intensity of students' instructional time
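A sketch of this measure under the same assumed record layout, splitting the hour types into hypothetical fields so the inclusion of non-rate-based and volunteer match hours is explicit:

```python
# A sketch of Average Attended Hours: all attended hours (rate-based,
# non-rate-based, and volunteer match) divided by the student count.
# Field names are illustrative.

def average_attended_hours(students):
    total = sum(
        s["rate_based_hours"] + s["non_rate_based_hours"] + s["volunteer_match_hours"]
        for s in students
    )
    return total / len(students) if students else 0.0

students = [
    {"rate_based_hours": 100, "non_rate_based_hours": 10, "volunteer_match_hours": 5},
    {"rate_based_hours": 120, "non_rate_based_hours": 0, "volunteer_match_hours": 15},
]
print(f"{average_attended_hours(students):.0f} hours")  # 125 hours (in the 117-131 band)
```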

Performance Measure: Average Attended Hours Benchmarks

Rating            | Points | Cut Points
Advanced          | 3      | 77% and above
Meets Standard    | 2      | 66%-76%
Needs Improvement | 1      | 55%-65%
Remedial Action   | 0      | 54% and below

State Averages in Community Adult Learning Centers: FY; FY; FY; FY; FY; FY

Find this data at SMARTT Cognos ReportNet: Adhoc Reports >> Performance Standards Reports for Community ALC

Performance Measure: Pre- and Post-Test Percentage

Definition: Total number of students who are pre- and post-tested divided by the total number of students enrolled in a program for the year (based on the student's primary assessment)
Exclusions:
- Students with fewer than 12 hours of attendance
- Students with an initial TABE Language scale score > 584
- Students with an initial MAPT scale score > 599
- Students with an intake date after 4/1
Standard: Programs ensure that between 66% and 76% of eligible students are pre- and post-tested.
Note: If a program post-tests less than 50% of eligible students, performance points for the Learner Gains performance standard are reduced by 50%.
Purpose: Promotes retention of students so that students remain in programs long enough to achieve learning gains and goals
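The exclusions amount to a four-way eligibility filter. A minimal sketch, assuming hypothetical field names, assessment labels, and an illustrative fiscal-year cutoff date:

```python
from datetime import date

# A sketch of the pre/post eligibility exclusions above. Field names,
# assessment labels, and the fiscal-year handling are assumptions, not
# the actual SMARTT data layout.

CUTOFF = date(2006, 4, 1)  # "intake after 4/1"; year chosen for illustration

def eligible(s):
    """False if any of the four exclusions applies."""
    if s["attended_hours"] < 12:
        return False
    if s["assessment"] == "TABE Language" and s["initial_score"] > 584:
        return False
    if s["assessment"] == "MAPT" and s["initial_score"] > 599:
        return False
    if s["intake_date"] > CUTOFF:
        return False
    return True

def pre_post_percentage(students):
    """Share of eligible students with both a pre- and a post-test."""
    pool = [s for s in students if eligible(s)]
    tested = [s for s in pool if s["pre_tested"] and s["post_tested"]]
    return 100.0 * len(tested) / len(pool) if pool else 0.0

students = [
    {"attended_hours": 40, "assessment": "MAPT", "initial_score": 450,
     "intake_date": date(2005, 9, 15), "pre_tested": True, "post_tested": True},
    {"attended_hours": 8, "assessment": "MAPT", "initial_score": 450,
     "intake_date": date(2005, 9, 15), "pre_tested": True, "post_tested": False},
]
print(pre_post_percentage(students))  # 100.0 (the 8-hour student is excluded)
```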

Performance Measure: Pre- and Post-Test Percentage Benchmarks

Rating            | Points | Cut Points
Advanced          | 3      | 77% and above
Meets Standard    | 2      | 66%-76%
Needs Improvement | 1      | 55%-65%
Remedial Action   | 0      | 54% and below

State Averages in Community Adult Learning Centers: FY %; FY %

Find this data at SMARTT Cognos ReportNet: Adhoc Reports >> Performance Standards Reports for Community ALC

Performance Measure: Learner Gains

Definition: Total percent of students who demonstrate learner gain on TABE, MAPT, REEP, or BEST Plus assessments (based on the student's primary assessment)

Significant learner gain for each assessment is defined as:
TABE: 27 scale score points
REEP: 0.4 scale score points
BEST Plus: 33 scale score points
Other: 1 GLE (grade level equivalent) for Beginning Literacy

Notes:
- Beginning in FY 2007, the TABE for ABE reading and math was replaced by the MAPT.
- MAPT and TABE Level L (literacy) data will be reviewed by UMass to determine significant learner gain.
- If a program post-tests less than 50% of eligible students, performance points awarded for Learner Gains are reduced by 50%.

Standard: Programs ensure that between 47% and 56% of students demonstrate significant gain as defined in the table above.
Purpose: To capture the learning gains that students achieve
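A sketch of the gain check using the thresholds above (the dictionary layout and function are ours). MAPT is deliberately absent because, per the slide, UMass reviews MAPT and TABE Level L data to determine significant gain:

```python
# A sketch of the significant-gain thresholds listed above. MAPT has no
# fixed threshold here (UMass reviews MAPT and TABE Level L data), and
# Beginning Literacy (1 GLE) is handled outside this sketch.

GAIN_THRESHOLDS = {
    "TABE": 27,       # scale score points
    "REEP": 0.4,      # scale score points
    "BEST Plus": 33,  # scale score points
}

def significant_gain(assessment, pre_score, post_score):
    """True if pre-to-post growth on the student's primary assessment
    meets the threshold."""
    threshold = GAIN_THRESHOLDS.get(assessment)
    if threshold is None:
        raise ValueError(f"no fixed threshold for {assessment!r}")
    return (post_score - pre_score) >= threshold

print(significant_gain("TABE", 500, 530))       # True: a 30-point gain >= 27
print(significant_gain("BEST Plus", 400, 420))  # False: 20 < 33
```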

Performance Measure: Learner Gains Benchmarks

Rating            | Points | Cut Points
Advanced          | 9      | 57% and above
Meets Standard    | 6      | 47%-56%
Needs Improvement | 3      | 36%-46%
Remedial Action   | 0      | 35% and below

State Averages in Community Adult Learning Centers: FY %; FY %

Find this data at SMARTT Cognos ReportNet: Adhoc Reports >> Performance Standards Reports for Community ALC
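Combining this table with the 50% post-testing penalty noted on the earlier slides gives a small scoring function (a sketch; the names are ours):

```python
def learner_gains_points(gain_pct, post_test_pct):
    """Points per the cut points above, halved when fewer than 50% of
    eligible students were post-tested (per the earlier slides)."""
    if gain_pct >= 57:
        points = 9        # Advanced
    elif gain_pct >= 47:
        points = 6        # Meets Standard
    elif gain_pct >= 36:
        points = 3        # Needs Improvement
    else:
        points = 0        # Remedial Action
    return points * 0.5 if post_test_pct < 50 else points

print(learner_gains_points(48, 72))  # 6   (Meets Standard)
print(learner_gains_points(48, 45))  # 3.0 (award halved)
```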

Using Your Performance Standards Data for Program Improvement

In this workshop, we will review the data for small, medium, and large programs that have met or exceeded the performance standards for learning gains, pre/post percentage, average attended hours, and attendance. The workshop will focus on ways to analyze current performance standards data and on ways to use those data for program improvement. Participants will break into small groups, focus first on retention (e.g., average attended hours), and look at the programs that have met or exceeded the performance target in that area. Participants will then review data for the other performance areas for the same programs and fill in the chart provided at the workshop. Each group will be asked to answer the following questions:

Using Your Performance Standards Data for Program Improvement

1) When reviewing all the performance data, what conclusions can be drawn? How do the performance standards data relate to each other, or not? What differences do you see between small, medium, and large programs?

For this exercise, use these definitions of program size (sketched in code below):
Small Size Programs: 125 or fewer students
Medium Size Programs: 126-300 students
Large Size Programs: more than 300 students
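A minimal sketch of these size bands (the 126-300 medium band is inferred from the stated small and large bounds):

```python
def program_size(num_students):
    """Classify a program using the workshop's size bands."""
    if num_students <= 125:
        return "small"
    if num_students <= 300:
        return "medium"   # inferred 126-300 band
    return "large"

print(program_size(90), program_size(200), program_size(450))
# small medium large
```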

Using Your Performance Standards Data for Program Improvement

2) For the programs that met or exceeded most of the performance target areas, what might those programs be doing well that can be shared with others? What good practices do you think need to be in place?

3) What recommendations can be given for program improvement in each area?
- Learning Gains
- Pre/Post Testing
- Average Attended Hours
- Attendance
- Student Goals

LET'S LOOK AT RETENTION

Project Name | # Students | Average Attended Hours | Pre/Post % | Percent Learning Gains | Percent Attendance | Goal Attainment | NRS Level Completion

Large Size Programs (more than 300 students): Program; Program; Program
Medium Size Programs (126-300 students): Program; Program; Program
Small Size Programs (125 or fewer students): Program; Program; Program

FY06 Performance Standard Report for Program Specialist: Implications for FY2007 Continuous Improvement Plans

Program Name | Attendance %          | Average Attendance (hrs/student) | Pre/Post %          | Learning Gains %
Program 1    | 80% Advanced          | Advanced                         | 86% Advanced        | 48% Meets
Program 2    | 83% Advanced          | Advanced                         | 91% Advanced        | 49% Meets
Program 3    | 58% Needs Improvement | Advanced                         | 67% Meets           | 47% Meets
Program 4    | 84% Advanced          | Advanced                         | 80% Advanced        | 62% Advanced
Program 5    | 73% Meets             | Meets                            | 73% Meets           | 31% REMEDIAL ACTION
Program 6    | 57% Needs Improvement | Needs Improvement                | 36% REMEDIAL ACTION | 36% Needs Improvement
Program 7    | 78% Advanced          | Advanced                         | 75% Meets           | 53% Meets
Program 8    | 52% REMEDIAL ACTION   | REMEDIAL ACTION                  | 52% REMEDIAL ACTION | 41% Needs Improvement
Program 9    | 70% Meets             | REMEDIAL ACTION                  | 62% Needs Improvement | 58% Advanced
Program 10   | 70% Meets             | Meets                            | 78% Advanced        | 57% Advanced
Program 11   | 77% Advanced          | Advanced                         | 70% Meets           | 42% Needs Improvement
Program 12   | 69% Meets             | Needs Improvement                | 76% Meets           | 60% Advanced