Race To The Top (RttT)
MSDE Division of Accountability and Assessment Data Systems (DAADS)
Maryland Public Schools: #1 in the Nation Three Years in a Row

Projects Overview
- Develop the Overall Technology Infrastructure to Support RTTT Initiatives
- Data Dashboards
- Multimedia Platform
- Local Education Agency (LEA) Information Technology (IT) Infrastructure Upgrades
- Developing Test Item Bank System
- Acquiring Computer Adaptive Test System (CATS)
- Setting Up and Loading Test Item Bank
- MSA Growth Modeling
- Student Growth
- Educator Effectiveness System
- Formative Assessments
- Data Exchange
- Develop P20 and Workforce Data System
- Electronic Transcript System
- Computer Adaptive Systems – High School

Develop the Overall Technology Infrastructure to Support RTTT Initiatives (P8/11)
Project Overview: To build an infrastructure of hardware and software that expands the existing business intelligence reporting and analysis system for both teachers and students.
Major Deliverables:
- Register and monitor over 80,000 students, teachers, and school administrators in a secure environment.
- Expand existing business intelligence servers and licenses to accommodate 800,000 students and 60,000 teachers.
- Provide end-user support and training for RttT-related projects.
Impact:
- Provides a teacher toolkit framework.
- Provides single sign-on access to instructional tools to be used by teachers and students.
- Promotes shared use of infrastructure tools that will result in economy-of-scale savings.
- Creates uniform technology solutions that will improve interoperability of systems within both MSDE and LEA computer systems.

Data Dashboards (P9/27)
Project Overview: To build thirty-six Effectiveness, Accountability and Performance (EAP) dashboards to provide visibility into student performance across the state.
Major Deliverables:
- Dashboards: 36 dashboards over 3 years, 12 per year.
- Reports: various associated reports that will provide aggregate and detailed student performance data.
- System modifications, such as to the early childhood system, to present data using the existing business intelligence platform that is operational as part of Maryland's LDS.
Impact:
- Each dashboard will provide information for data-driven decision making.
- The dashboards will provide a single version of the truth to policy makers and educational administrators.
- The dashboards will provide data to teachers and students to improve student learning in the classroom.
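
As a rough illustration of the difference between the detailed and aggregate views such dashboards present, the sketch below rolls hypothetical student-level assessment records up to a school-level proficiency tile. The column names and cut score are assumptions for illustration, not the MLDS/EAP schema.

```python
# Minimal sketch (not the MLDS implementation): rolling student-level
# assessment records up to the school-level aggregate a dashboard tile
# might display. Column names and the proficiency cut are illustrative.
import pandas as pd

students = pd.DataFrame({
    "school_id":   ["0101", "0101", "0102", "0102"],
    "grade":       [3, 3, 3, 3],
    "subject":     ["math", "math", "math", "math"],
    "scale_score": [412, 378, 455, 390],
})

PROFICIENT_CUT = 400  # illustrative cut score, not an actual MSA value
students["proficient"] = students["scale_score"] >= PROFICIENT_CUT

# Aggregate view: percent proficient by school, grade, and subject.
summary = (students
           .groupby(["school_id", "grade", "subject"])["proficient"]
           .mean()
           .mul(100)
           .rename("pct_proficient")
           .reset_index())
print(summary)
```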

Multimedia Platform (P10/28)
Project Overview: To design multimedia projects that provide a cost-effective, web-based multimedia training vehicle that trains educators to use data and the MLDS and MLDS-EAP systems for educational improvement.
Major Deliverables:
- The multimedia platform will host 40 multimedia data-usage training sessions within the RTTT production infrastructure.
Impact:
- The dashboard and report tutorials will explain to users how to interpret the reports and help identify and promote evidence-based solutions.

Local Education Agency (LEA) Information Technology (IT) Infrastructure Upgrades (P11/29)
Support application and technical infrastructure.
Project Overview: To provide the LEAs with the infrastructure necessary for RttT data collection, processing, and reporting.
Major Deliverables:
- System upgrades
- System integration
- System replacement
- Ensuring data integrity across systems
Impact:
- Increased capability to perform education reform data collection, data transfer, and reporting.
- Interoperability between the LEA systems and the MSDE systems.

Developing Test Item Bank System (P17/32)
Project Overview: To implement a Test Item Bank System with a set of test item bank questions for reading/English language arts and mathematics in grades 3-11.
Major Deliverables:
- Provide teachers with a test item bank system to create interim, formative, and benchmark assessments.
- Provide students with options for taking assessments online or with paper and pencil.
Impact:
- Teachers will be able to make instructional changes based on results.
- School leaders will be able to make decisions based on results.
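
To make the item-bank idea concrete, the sketch below models bank items as records tagged by grade and subject and assembles a short interim test from them. The field names, difficulty index, and selection rule are hypothetical, not the actual Test Item Bank System design.

```python
# Hypothetical sketch of an item bank record and a simple form-assembly
# step; field names and the selection rule are illustrative only.
from dataclasses import dataclass
import random

@dataclass
class Item:
    item_id: str
    grade: int          # 3-11, per the project scope
    subject: str        # "reading_ela" or "math"
    difficulty: float   # placeholder difficulty index

def assemble_assessment(bank, grade, subject, n_items, seed=0):
    """Pull n_items matching grade and subject to build an interim test."""
    pool = [it for it in bank if it.grade == grade and it.subject == subject]
    rng = random.Random(seed)
    return rng.sample(pool, min(n_items, len(pool)))

# Tiny fabricated bank: ten math items per grade, spread over difficulties.
bank = [Item(f"M{g}{i:03d}", g, "math", (i % 5) * 0.5 - 1.0)
        for g in range(3, 12) for i in range(10)]
form = assemble_assessment(bank, grade=5, subject="math", n_items=5)
print([it.item_id for it in form])
```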

Acquiring Computer Adaptive Test System (CATS) (P18/33)
Acquiring and implementing CATS.
Project Overview: To provide an online computer adaptive test system for reading/English language arts and mathematics.
Major Deliverables:
- A computer adaptive test system that will be interoperable with the test item bank system.
Impact:
- Teachers will have diagnostic information about the abilities of their students.
- Teachers will be able to see growth from the pre-test to the post-test.
- Students will be able to take these assessments online.
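
The slide does not specify the adaptive algorithm, so the following is only a generic sketch of how computer adaptive testing typically works: under a simple Rasch (one-parameter) model, the system re-estimates a student's ability after each response and administers the unused item whose difficulty is closest to that estimate. All numbers and function names are illustrative, not the procured CATS.

```python
# Generic computer-adaptive-testing sketch (Rasch model, grid-search
# ability estimate); illustrative only, not the procured CATS algorithm.
import math

def rasch_p(theta, b):
    """Probability of a correct response given ability theta, difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses):
    """Maximum-likelihood ability estimate over a coarse grid from -4 to 4."""
    grid = [g / 10 for g in range(-40, 41)]
    def loglik(theta):
        return sum(math.log(rasch_p(theta, b)) if correct
                   else math.log(1 - rasch_p(theta, b))
                   for b, correct in responses)
    return max(grid, key=loglik)

def next_item(theta, remaining_difficulties):
    """Pick the unused item whose difficulty is closest to theta."""
    return min(remaining_difficulties, key=lambda b: abs(b - theta))

# Tiny simulated session: a small difficulty pool and a fixed response pattern.
pool = [-1.5, -0.5, 0.0, 0.5, 1.5]
responses = []            # list of (difficulty, correct) pairs
theta = 0.0               # start at an average ability estimate
for correct in [True, True, False]:
    used = [b for b, _ in responses]
    b = next_item(theta, [d for d in pool if d not in used])
    responses.append((b, correct))
    theta = estimate_theta(responses)
print("final ability estimate:", theta)
```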

Setting Up and Loading Test Item Bank (P19/34)
Implement an item load and set up the item bank and CATS system.
Project Overview: To load test item banks into the system so that teachers can use them as tools to assess student learning and growth.
Major Deliverables:
- The item bank system will be loaded with items for grades 3-11 in reading/English language arts and mathematics.
- A process for accepting items from LEAs will be developed.
- Each LEA will have the ability to incorporate local items into the system.
Impact:
- A large repository of test items will be available to teachers.
- Teachers will use these tools to assess student learning and growth.

Computer Adaptive Systems – High School (P20/35)
Identification, acquisition, and implementation of the testing system.
Project Overview: To provide sets of WiFi computer adaptive testing units to high schools.
Major Deliverables:
- Computer adaptive units, interoperable with the test item bank system and the computer adaptive test system, will be purchased for distribution to high schools.
Impact:
- High school students will use these units to take the online assessments.

MSA Growth Modeling (P27/46)
Equating the MSA for use in a growth model.
Project Overview: To provide a recommended growth model (or models) that can be used with MSA test score data.
Major Deliverables:
- The National Psychometric Council (NPC) will conduct research on the various growth measurement models currently being used nationally.
- The Maryland Assessment Research Center for Education Success (MARCES) will analyze the various models using MSA test score data.
- The NPC will recommend a growth model (or models) that can be used with MSA test score data.
Impact:
- The recommended model(s) will be shared with the seven Local Education Agencies participating in the pilot educator evaluation system.

Growth Model (P28/47)
Develop and implement a statistical model to measure student growth.
Project Overview: To develop, test, and implement a new growth calculation predicated on student growth and educator effectiveness.
Major Deliverables:
- Two statistical approaches (student growth percentiles and value matrices) developed and vetted by the National Psychometric Council.
- A year-long, no-fault pilot with 7 LEAs underway.
Impact:
- The intended outcome is to maximize student achievement through data-driven knowledge of what skills students are learning, and to improve the statewide cadre of educators by retaining effective educators and increasing their effectiveness through professional development.
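
For readers unfamiliar with the first approach, a student growth percentile expresses a student's current score as a percentile rank among students who had similar prior scores. The sketch below is a deliberately simplified version, matching exactly on prior score rather than using the quantile-regression machinery real SGP models rely on; the data and column names are fabricated for illustration.

```python
# Simplified student-growth-percentile sketch: percentile rank of a
# student's current score among peers with the same prior-year score.
# Real SGP implementations use quantile regression; this is illustrative.
import pandas as pd

scores = pd.DataFrame({
    "student_id":    ["a", "b", "c", "d", "e", "f"],
    "prior_score":   [400, 400, 400, 450, 450, 450],
    "current_score": [410, 430, 390, 455, 470, 440],
})

def simple_sgp(df):
    # Percentile rank (0-100) of current_score within each prior-score cohort.
    return (df.groupby("prior_score")["current_score"]
              .rank(pct=True)
              .mul(100)
              .round(0))

scores["sgp_approx"] = simple_sgp(scores)
print(scores)
```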

Educator Effectiveness System (P29/48)
Develop and implement an Educator Effectiveness System.
Project Overview: To develop, procure, and implement an Educator Evaluation System that will provide an effectiveness rating for each educator.
Major Deliverables:
- Procurement and implementation of evaluation systems.
- Evaluation tools and applications that can be used by LEAs to evaluate educators (teachers, principals, etc.).
- A mechanism to transfer teacher ratings on the evaluations to MSDE for a total educator rating.
- The 50% educator evaluation data necessary to link with the other 50% provided by the student growth model.
Impact:
- The model will provide an effectiveness rating for each educator.
- Addresses core educational reforms focused on implementing improved, performance-based educator evaluations.
- Better use of student performance data in the evaluation process.
- Use of data for professional development and to support educator incentives.
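
The slide describes a 50/50 link between professional-practice data and the student growth measure. Below is a minimal sketch of that kind of weighted combination; the 0-100 scales, the exact 0.5/0.5 weights as code defaults, and the rating bands are assumptions for illustration, not Maryland's actual scoring rules.

```python
# Illustrative 50/50 combination of a professional-practice score and a
# student-growth score into one effectiveness rating. Scales, weights,
# and rating bands are hypothetical, not MSDE's actual rules.
def combined_rating(practice_score, growth_score,
                    practice_weight=0.5, growth_weight=0.5):
    """Both inputs assumed on a 0-100 scale; returns (score, label)."""
    total = practice_weight * practice_score + growth_weight * growth_score
    if total >= 80:
        label = "highly effective"
    elif total >= 60:
        label = "effective"
    else:
        label = "developing"
    return round(total, 1), label

print(combined_rating(practice_score=85, growth_score=62))  # (73.5, 'effective')
```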

Formative Assessments (P3/2)
Project Overview: To implement a formative assessment system that enables teachers to assess students' progress and take appropriate steps to improve it.
Major Deliverables:
- A comprehensive online formative assessment resource system that provides resources, tools, strategies, professional learning opportunities, and guidance for sustainable implementation at the classroom, school, and district levels.
Impact:
- Accelerated teacher learning and student achievement.
- Increased capacity of teachers and school leaders to understand and effectively use curriculum-embedded formative assessment practices.
- Expanded implementation of formative assessment practices within each local school system.

Data Exchange (P12/60)
Expansion of the Longitudinal Data System (LDS) for data exchange.
Project Overview: To provide a system for collecting and distributing data from the LEAs, the Maryland State Department of Education, and Maryland higher education institutions for consolidation and distribution.
Major Deliverables:
- The MLDS Data Exchange will replace and reduce the duplicate and costly data transfer and translation programming that would otherwise be required if individual educational organizations wrote their own send/receive data transfer programs.
Impact:
- This project provides an efficient way for the Maryland State Department of Education to share data with its 24 LEAs, the Maryland Statewide Longitudinal Data Center, and the Maryland Higher Education Commission's college data collection systems.
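
The efficiency the slide points to comes from each organization maintaining one mapping to a shared format instead of a separate transfer program for every partner. The sketch below shows that idea with a hypothetical mapping of locally named fields to a common exchange format; the field names are illustrative, not the MLDS Data Exchange schema.

```python
# Hypothetical sketch of normalizing a local LEA record into a shared
# exchange format before submission to a central data exchange.
# Field names and the mapping are illustrative, not the MLDS schema.
def to_exchange_record(local_record, field_map):
    """Rename locally named fields to the shared exchange field names."""
    missing = [src for src in field_map if src not in local_record]
    if missing:
        raise ValueError(f"local record is missing fields: {missing}")
    return {shared: local_record[src] for src, shared in field_map.items()}

lea_field_map = {            # one mapping per LEA, maintained once
    "pupil_id": "student_id",
    "sch_code": "school_id",
    "gr": "grade_level",
}
local = {"pupil_id": "123456", "sch_code": "0101", "gr": 5}
print(to_exchange_record(local, lea_field_map))
```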

Develop P20 and Workforce Data System (P13/61)
Expand the P-12 LDS to encompass P-20 and workforce data.
Project Overview: To create a new higher education data warehouse that is integrated with the existing P-12 data warehouse and the labor workforce data warehouse of Maryland's Department of Labor, Licensing and Regulation.
Major Deliverables:
- Create a higher education data warehouse database.
- Develop a multi-agency student crosswalk table to enable linking data.
- Design a process to load selected data into the LDS data warehouse.
Impact:
- Alignment of K-12 curriculum and student readiness skills with post-secondary education institution expectations.
- Identification of programs and policies that improve transition success between K-12, higher education, and the workforce.
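
To illustrate what a multi-agency student crosswalk table does, the sketch below keeps one row per person linking agency-specific identifiers so K-12 and higher-education records can be joined. The identifiers, column names, and sample data are hypothetical, not the actual MLDS crosswalk design.

```python
# Illustrative multi-agency crosswalk: one row per person linking
# agency-specific IDs so records from different systems can be joined.
# IDs and column names are hypothetical.
import pandas as pd

crosswalk = pd.DataFrame({
    "mlds_id":     ["P001", "P002"],
    "k12_id":      ["123456", "234567"],
    "highered_id": ["UM-88", "TU-45"],
})

k12 = pd.DataFrame({"k12_id": ["123456", "234567"],
                    "hs_diploma_year": [2011, 2012]})
highered = pd.DataFrame({"highered_id": ["UM-88", "TU-45"],
                         "enrolled_fall": [True, False]})

linked = (crosswalk
          .merge(k12, on="k12_id", how="left")
          .merge(highered, on="highered_id", how="left"))
print(linked[["mlds_id", "hs_diploma_year", "enrolled_fall"]])
```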

Electronic Transcript System (P54/79)
Implement a statewide, centralized electronic student transcript system.
Project Overview: To provide resources to the LEAs by implementing the University System of Maryland's (USM) electronic transcript system.
Major Deliverables:
- Creation of the LEA transcript collaboration group.
- Resources to implement the electronic system and integrate it into existing Student Information Systems (SIS).

Project Completion