District Determined Measures
Dr. Deborah A. Brady, Ribas Associates, Inc.


Dr. Deborah A. Brady, Ribas Associates, Inc.

First Hour
 Overview of District Determined Measures
 The Timeline
 Quality Assessments
 Tools from DESE
 Resources
 Rubrics
 Core Content Objectives
Second Hour
 Job-alike groups and departments work together
 Beth Pratt and Deb Brady will go from group to group
 Product: facilitators hand in any unanswered questions

 By the end of the workshop, participants will:
1. Understand the quality expectations and assessment criteria for DDM assessments
2. Begin to draft a schedule for this year for your team or department
3. Begin the process of developing DDMs (if there is time) by
 Using the Quality Tracking Tool on at least one possible DDM
 Using the Educator Alignment Tool to consider local assessment needs
4. Submit (or send a hard copy of) your group's meeting minutes
 Include progress
 Remaining questions
 What you will need to be successful

 DESE Tools
 Quality Tracking Tool (Excel file)
 Educator Assessment Tool (Excel file)
 Core Curriculum Objectives (CCOs)
 Example assessments (mainly commercial; some local)
 Model Curriculum Units with rubrics (Curriculum Embedded Performance Assessments)
 Rubrics: Cognitive Rigor Matrices for Reading, Writing, Math, and Science
 Research
 NY and NYC
 Achieve.org, PARCC, and many others

SY 2014
 September: Pilot plan for at least 5 DDMs
 December: Implementation Extension Request Form
 Pilot 5 DDMs (at least); the scores do not count
 June: Final plan for assessing all teachers with at least 2 DDMs
SY 2015
 Collect first year's data on DDMs for all educators, except waivered areas
SY 2016
 Collect second year of data for all educators
 Issue Student Impact Ratings for all except waived grades/courses/subjects

Pilot Year SY2014
SEPTEMBER: DESE received B-R's plan for
 Early grade literacy (K-3)
 Early grade math (K-3)
 Middle grade math (5-8)
 High school "writing to text" (PARCC multiple texts)
 PLUS one more non-tested course, for example:
 Fine Arts
 Music
 PE/Health
 Technology
 Media/Library
 Or other non-MCAS growth courses, including grade 10 Math and ELA, and Science
DECEMBER: Implementation Extension Request Form for specific courses in the June plan
BY JUNE: The plan for all other DDMs must be ready for implementation in year 2 (SY2015), with at least one "local" (non-MCAS) measure and two measures per educator
The scores will not count for those who pilot DDMs in 2014.

SY 2015 (Year 2)
 All professional personnel will be assessed with 2 DDMs, at least one local:
 Guidance
 Principals, assistant principals
 Speech therapists
 School psychologists
 Nurses
 All teachers not yet assessed, general and special education
The scores will count as the first half of the "impact score," with the waivered courses as the only exception.

SY2016 (Year 3)
"Impact Ratings" will be given to all licensed educational personnel and sent to DESE
 Two measures for each educator
 At least one local measure for everyone
 Some educators will have two local measures
 Locally determined measures can include Galileo, DRA, and MCAS-Alt
 The MCAS growth score can be one measure
 The average of two years of scores
 And a two-year trend
"Impact Ratings" are based upon two years' growth scores for two different assessments, at least one of them local.

DESE is still rolling out the evaluation process and District Determined Measures
 SY2013: Teacher evaluation; MA Model for all RTTT districts
 SY2014: Pilot DDMs for some courses (5); plan for all teachers due to DESE in June
 SY2015: Assess all educators (administrators, specialists, all teachers, guidance, school psychologists, nurses), except waivered grades/subjects/courses
 SY2016: Use 2 years of data from 2 assessments (at least one local) as part of the evaluation system ("Impact Ratings" and trends), except year 1 of waivered grades/subjects/courses

From the Commissioner: “Finally, let common sense prevail when considering the scope of your pilots. “I recommend that to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught. “There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”

Everyone earns two ratings (Massachusetts Department of Elementary and Secondary Education):
Summative Performance Rating
 Exemplary
 Proficient
 Needs Improvement
 Unsatisfactory
Impact Rating on Student Performance
 High
 Moderate
 Low
*Most districts will not begin issuing Impact Ratings before the school year.

The Summative Rating and the Rating of Impact on Student Learning (Low / Moderate / High) together determine the educator's plan:
 Exemplary or Proficient, with Moderate or High impact: 2-yr Self-Directed Growth Plan
 Exemplary or Proficient, with Low impact: 1-yr Self-Directed Growth Plan
 Needs Improvement: Directed Growth Plan
 Unsatisfactory: Improvement Plan

[Slide: sample MCAS score reports pairing scaled scores with student growth percentiles, e.g. 230/35 SGP and 225/92 SGP, showing that similar scaled scores can carry very different growth percentiles]

Types
 On demand (timed and standardized)
 Mid-year and end-of-year exams
 Projects
 Portfolios
 Capstone courses
 Unit tests
Formats
 Multiple choice
 Constructed response
 Performance (oral, written, acted out)

 MCAS growth scores can serve as one score (ELA and Math, grades 4-8; not grade 3, not HS)
 MCAS growth scores must be used when available, but all educators will have 2 different measures
 The MA Model Units rubrics can be used (online for you)
 Galileo
 BERS-2 (behavioral rating scales)
 DRA (reading)
 Fountas and Pinnell Benchmark
 DIBELS (fluency)
 MCAS-Alt
 MAP

Why (beyond evaluation impact) determining these measures is important to every educator

Assessment Quality
 Validity
 Reliability
 Rigor
 Scoring guides
 Inter-rater reliability
You will receive tools for these areas today.

 Calibration of scorers
 Developing assessment protocols
 Are all assessments of equally appropriate rigor K-12?
 Integrity of scores ("assessment creep")
 Training assessors
 Time
 Tabulating growth scores from student scores
 Organizing and storing scores
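The last two logistics items, tabulating and organizing growth scores, become routine data work once a growth model is chosen. A minimal sketch, assuming a simple pre/post gain model; the student data and the median-split cut-offs for low/moderate/high are illustrative assumptions, not DESE rules:

```python
# Sketch: tabulate gain scores from pre/post results and bucket them.
# Scores and the median-split cut-offs are invented for illustration.
from statistics import median

students = [
    {"name": "A", "pre": 40, "post": 55},
    {"name": "B", "pre": 62, "post": 70},
    {"name": "C", "pre": 50, "post": 78},
]

# Gain = post-test minus pre-test.
for s in students:
    s["gain"] = s["post"] - s["pre"]

# Bucket each gain relative to the class median (illustrative rule).
med = median(s["gain"] for s in students)
for s in students:
    s["growth"] = "high" if s["gain"] > med else ("moderate" if s["gain"] == med else "low")

print([(s["name"], s["gain"], s["growth"]) for s in students])
```

A real DDM would use whatever cut-offs the district adopts; the point is only that the tabulation itself is mechanical once the rule is fixed.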

Capitalize on what you are already doing
 Writing to text: 9-12? K-12?
 Research: K-12? Including specialists?
 Art, Music, PE, Health: present practices
 Math: one focus K-12?
 "Buy, borrow, or build your own" (DESE)

Tools to assess alignment
Tools to assess rigor
Tools to assess the quality of student work

Alignment and Rigor
 Alignment to the Common Core, PARCC, and the district curriculum
 Shifts for the Common Core have been made:
 Complex texts
 Multiple texts
 Argument, informational, and narrative writing
 Math practices
 Depth over breadth

Reliability
 Internal consistency
 Test-retest
 Alternate forms/split half
 Inter-rater reliability
 Rated from 0 to 1 (none to 100%)
Validity
 Are you measuring what you intend to assess?
 Content (= curriculum)
 Consequential validity: good or bad impact (Does this assessment narrow the curriculum?)
 Relationships (to SAT, to grades): correlation measurement, rated -1 to +1
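Both statistics are straightforward to compute directly. A minimal sketch, with invented rater scores and an invented local-score/SAT comparison: percent agreement (0 to 1) as a simple inter-rater reliability check, and a Pearson correlation (-1 to +1) for a validity relationship:

```python
# Sketch: inter-rater agreement (0 to 1) and Pearson correlation (-1 to +1).
# All scores below are invented examples, not real assessment data.
from math import sqrt

# Two raters scoring the same six student papers on a 1-4 rubric.
rater1 = [3, 4, 2, 4, 3, 1]
rater2 = [3, 4, 3, 4, 3, 1]
agreement = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)  # 5 of 6 match

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Relationship of a local assessment to SAT scores for five students.
local_scores = [55, 60, 72, 80, 90]
sat_scores = [480, 500, 560, 600, 680]
r = pearson(local_scores, sat_scores)
print(round(agreement, 2), round(r, 2))
```

Exact agreement is the crudest inter-rater statistic; districts that want to go further typically use a chance-corrected measure such as Cohen's kappa.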

Last        First  Grade   Course      DDM 1         DDM 2           DDM 3
Smith       Abby   1       ELA         DRA           F&P Benchmark
Smith       Abby   1       Math        Unit Test     Galileo
Jones       Bob    4       ELA         MCAS Growth   Unit Benchmark  Galileo
Jones       Bob    4       Math        MCAS Growth   Unit Benchmark  Galileo
Adams       John   9       ELA         WTT           Unit
Adams       John   10      ELA         WTT           Unit
Adams       John   11      Humanities  WTT           Unit
Cambridge   Anne   Alg 1   Math        WTT           Unit
Cambridge   Anne   Geom    Math        WTT           Unit
Washington  Greg   Mixed   Art 1       WTT           Unit            Portfolio

"Borrow, Buy, or Build"
 PRIORITY: Use the Quality Tracking Tool to assess each potential DDM to pilot this year for your school (one district final copy on a computer)
 CCOs will help if this is a district-developed tool
 If there is additional time, use the Educator Assessment Tool to begin looking at developing 2 assessments for all educators for next year

 Is the measure aligned to content?
 Does it assess what is most important for students to learn and be able to do?
 Does it assess what the educators intend to teach? (VALIDITY)

ELA-Literacy, Grade 9 English Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies
Designed to be a measure of student growth over time in high school ELA and social science courses. Students select work samples to include and upload them to an electronic site. Includes guiding questions for students and scoring criteria. The scoring rubric for the portfolio can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on specification of assignments.

 Buy, Borrow, Build
 Each sample DDM is evaluated. Hudson's evaluation: designed to be a measure of student growth over time in high school ELA and social science courses; students select work samples and upload them to an electronic site; includes guiding questions for students and scoring criteria; the scoring rubric can be adapted for use in all high school ELA and social science courses; generalized grading criteria for a portfolio; could be aligned to a number of CCOs, depending on specification of assignments.
 Many are standardized assessments

 Is the measure informative?
 Do the results of the measure inform educators about curriculum, instruction, and practice?
 Does it provide valuable information to educators about their students?
 Does it provide valuable information to schools and districts about their educators?

 Pre-test/post-test
 Repeated measures (e.g., running records)
 Holistic evaluation (e.g., portfolio)
 Post-test only (only when the assessment lacks a pre-test; a norm, as with AP exams, serves as the baseline)
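Each model above implies a different computation. For the repeated-measures model, for instance, growth can be summarized as the least-squares slope through successive scores; the running-record weeks and fluency scores below are invented for illustration:

```python
# Sketch: repeated-measures growth as the least-squares slope of
# successive running-record scores. All data points are invented.
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

weeks = [0, 6, 12, 18, 24]           # week of each running record
wcpm = [45, 52, 60, 66, 75]          # words correct per minute at each check
print(round(slope(weeks, wcpm), 2))  # average gain per week
```

A slope per student gives a single growth number that can then be bucketed low/moderate/high, just as a pre/post gain would be.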

For assessing rigor and alignment:
1. Daggett's Rigor/Relevance Scale
2. DESE's Model Curriculum (Understanding by Design)
3. Curriculum Embedded Performance Assessments from the MA Model Curriculum
4. PARCC's task descriptions
5. PARCC's rubrics for writing

Topic development (the writing and artwork identify the habitat and provide details):
1. Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2. Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3. Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4. Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5. Full topic/idea development; logical organization; strong details; appropriate use of language
6. Rich topic/idea development; careful and/or subtle organization; effective/rich use of language

Evidence and content accuracy (writing includes academic vocabulary and characteristics of the animal or habitat, with details):
1. Little or no evidence is included and/or content is inaccurate
2. Use of evidence and content is limited or weak
3. Use of evidence and content is included but is basic and simplistic
4. Use of evidence and accurate content is relevant and adequate
5. Use of evidence and accurate content is logical and appropriate
6. A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission

Artwork (identifies special characteristics of the animal or habitat, to an appropriate level of detail):
1. Artwork does not contribute to the content of the exhibit
2. Artwork demonstrates a limited connection to the content (describing a habitat)
3. Artwork is basically connected to the content and contributes to the overall understanding
4. Artwork is connected to the content of the exhibit and contributes to its quality
5. Artwork contributes to the overall content of the exhibit and provides details
6. Artwork adds greatly to the content of the exhibit, providing new insights or understandings

 New York State and New York City examples
 Portfolio (DESE-approved, from Hudson PS)
 Connecticut: specific tasks (excellent for the Arts and Music)
 PARCC question and task prototypes

Purpose
 Discuss possible assessments
 Consider what you need to accomplish this year using the Schedule and Checklist
 Use the Quality Tracking Tool on one assessment to understand how it supports your district, school, or department
 Look at the Educator Alignment Tool to consider the "singletons" that may need to be addressed in your district, school, or department
Product
 Hand in (or send a hard copy to Beth Pratt) the minutes of your group's meeting, which may include:
 Assessments that you are working on
 Next steps
 What you need to be successful

1. Measure growth
2. Employ a common administration procedure
3. Use a common scoring process
4. Translate these assessments to an Impact Rating
5. Assure comparability of assessments (rigor, validity)