
1 Central Mass. Readiness Center, Part II: Maintaining Quality, Creating Cut Scores, Preparing for the June Report, Preparing for 2015 Full Implementation. Dr. Deborah Brady, Ribas Associates

2 Good Morning! DO NOW  Review today’s Agenda, Handout, and PowerPoint  Note cards are for questions and/or exit slips  Have a cup of coffee  The wiki page has today’s materials as well as the Day 1 materials

3 Agenda March 25, 2014  Where are you on your journey now? (Carousel, Living Likert)  Preparing for the June report and for next year  Local curriculum quality: checklist for DDM indirect and direct measures  Good, bad, and questionable examples from local districts  Calibration protocols and options  Juried resources: assessments and more elaborate lessons with performance assessments  Documenting 2 DDMs per educator (documentation form for 2 DDMs)  Scoring assessments and developing local cut scores  Pre-test, post-test cut score example  Rubric scoring in a 100% system  Introduction to protocols for assessment: samples  Time to research, work with your team, plan for June, plan for next year

4 The Steps Necessary to Get Ready for the June Report and After
 Adapting present assessments; creating new assessments; writing to text for HS
 Developing and Piloting Assessments: alignment of content; rigorous and appropriate expectations; approval of assessments
 Assessing Quality and Rigor: security; calibration of standards and of assessors; rubric quality; analysis of results (High-Moderate-Low growth); piloting 2 DDMs per educator
 JUNE REPORT: directions for teachers; directions for students; organizing for the actual assessments; storing and tracking the information
 2015 Full Implementation: data storage; data analysis (L-M-H growth); roster verification; data team time; interpreting the results; student impact

5 Carousel Walk and Living Likert Scale Carousel Walk  Take a marker for your team (or yourself if you are on your own) 1. Put a check to the right of each area that you have addressed (even partially) 2. Put an ! next to each step that you have some concerns about 3. Put a ? next to any area that seems problematic or is unfamiliar to you  Visit each of the 5 “stages” of DDM development  After viewing each stage, return to the stage that most represents you and/or your district Living Likert  Go to your “stage”  What are the barriers? What are the strengths that your district has?  Discuss the barriers and strengths with the others at your stage.  Be prepared to report out as a whole group on a major concern

6 Preparing for the June Report Curriculum Quality Checklist Documenting 2 DDMs per educator

7 (Screenshot of the reporting form.) Columns: Name, Subject, Type, # not names, Comment (drop-down), Source, Grades

8 (slide image only)

9 (slide image only)

10 (slide image only)

11 Glossary (slide image only)

12 (slide image only)

13 Checklist for DDM Direct and Indirect Measures (District Determines Expectations for Quality). Check all items that are completed; please fill in the definitions column as completely as you can.
 Course: this information is necessary for the June Report.
 Grade(s) of DDM: grade level(s) that this assessment will cover.
 Check one: Direct Measure or Indirect Measure. A direct measure is an assessment of student growth; an indirect measure assesses something that connects indirectly to student growth (attendance, for example).
 Content Area: list the content area. For indirect measures, include the job category/categories of those involved, for example, SLPs or guidance counselors.
 Alignment to State and/or District Standards: for direct measures, list at least 2 standards that will be assessed, using standards language, so that this is a “substantial” assessment. For indirect measures: 1) What are the substantial, important, essential areas that you are assessing? 2) How does this indirect measure connect with student growth?
 Rigor: check the levels of Bloom’s taxonomy that are assessed. What are your district’s criteria? (One district requires that all DDMs include writing to text.) The original Bloom’s term is the first word on each line; the revised all-verb term is the second. Note that in the revised taxonomy, Creating is the highest level, above Evaluating. Indirect measures: N/A; skip Types of Questions and Duration (below), but fill in the remaining categories. Check all that apply:  Knowledge, Remembering  Comprehension, Understanding  Application, Applying  Analysis, Analyzing  Synthesis, Creating  Evaluation, Evaluating

14 (Checklist, continued)
 Type(s) of questions (check all that apply):  Multiple choice, fill-in, short answer (recall items from content area)  Multiple choice, fill-in, short answer (text-dependent questions)  Open response (short answer)  Essay (long response), type:  Narrative  Informational text  Argument with claims and proof  One text is read  Two texts are read  Performance assessment (CEPA)  Other (fill in at right). Indicate the percentage of the assessment for each question type, for example, multiple choice = 50%; 2 open responses = 50% (25% each): Multiple choice ____%, Open response ____%, Essay ____%, Performance assessment (describe briefly).
 Duration of assessment: assessments can take place in a class period or over a period of days. Indirect measures: please fill this section out; it is necessary for planning for full implementation.
 When assessment(s) will take place: provide an approximate month or window for the assessment(s), for example, end of first trimester, September. Provide multiple dates if the assessment is a pre-post or is administered more than once. Indirect measures: please fill this out.
 Components of the assessment that are completed so far:  Directions to teacher for administering  Directions to students  Graphic organizers (optional)  The assessment  Scoring guide  Rubric  Security  Calibration protocol if this assessment has a rubric. Indirect measures: describe your proposed process; you will need to create many of these components.
 Rubric:  Not yet  Does not apply. How was the rubric created? For example, adapted from DESE’s CEPA rubric, or developed by the department. Please attach the rubric along with the assessment as far as you have gone.

15 Quality Tools (District Determined)  MA: Core Curriculum Objectives, standardized assessments plus essential standards  MA: Model Curriculum Units and Curriculum Embedded Performance Assessment rubrics (DESE)  Achieve the Core has Common Core-aligned and annotated student sample essays at http://www.achievethecore.org/page/507/in-common-effective-writing-for-all-students  National: Cognitive Complexity and the Common Core: Daggett and Hess, aligned to CC/PARCC  New York State resources: EngageNY http://www.engageny.org/common-core-curriculum-assessments  New York City Common Core performance tasks, all content areas: http://schools.nyc.gov/Academics/CommonCoreLibrary/TasksUnitsStudentWork/  Juried units/assessments: www.achieve.org; MA MCU at www.doe.mass.edu  Juried rubrics: CEPA; Hess (ELA, math, history, science); Delaware’s K-12 rubrics for the 3 Common Core text types: argument, narrative, informational text  Calibration and Looking at Student Work (LASW) protocols (on handout), plus http://www.nsrfharmony.org/protocol/a_z.html  Calibration, on the wiki: Rhode Island protocol http://writingtotextbrady.wikispaces.com/DDM+Central+Mass

16 Achieve the Core: 1) Tri-State Rubric/EQuIP 2) Assessment standards  Mathematics K-12  Focus: focus strongly where the Standards focus  Coherence: think across grades and link to major topics within grade  Rigor: in major topics, pursue conceptual understanding, procedural skill and fluency, and application with equal intensity  ELA 3-12  Complexity: regular practice with complex text and its academic language  Evidence: reading, writing, and speaking grounded in evidence from text, both literary and informational  Knowledge: building knowledge through content-rich nonfiction  The full documents are on the wiki  http://www.engageny.org/resource/tri-state-quality-review-rubric-and-rating-process

17 (slide image only)

18 (slide image only)

19 Determining the Quality for Your District  Samples  Good  Bad  Mediocre  “Substantial”—2 standards (DESE minimum)  Rigorous—District Determined  Writing to text? PARCC-like?

20 Demonstrating Growth MATH (when accuracy of computation may be a concern)

21 Math Practices: Communicating Mathematical Ideas  Clearly constructs and communicates a complete response based on:  a response to a given equation or system of equations  a chain of reasoning to justify or refute algebraic, function, or number system propositions or conjectures  a response based on data  a response based on the graph of an equation in two variables, the principle that a graph is a solution set, or the relationship between zeros and factors of polynomials

22 (slide image only)

23 Essay Prompt from a Textbook Read a primary source about Muhammad based on Muhammad’s wife’s memories of her husband. Essay: Identify and describe Muhammad’s most admirable quality based on this excerpt. Select someone from your life who has this quality. Identify who they are and describe how they demonstrate this trait. What’s wrong with this prompt? Is it a text-based question?

24 Science Open Response from a Textbook Again, from a textbook: is this acceptable? Is this recall?

25 Scoring Guides from a Textbook A scoring guide from a textbook for building a Lou Vee Air Car. Is it good enough to ensure inter-rater reliability?  Lou Vee Air Car built to specs (50 points)  Propeller spins freely (60 points)  Distance car travels: 1 m = 70, 2 m = 80, 3 m = 90, 4 m = 100  Best distance (10, 8, 5)  Best car (10, 8, 5)  Best all-time distance, all classes (+5)  235 points total

26 Technology/Media Rubric A multi-criteria rubric for technology. What is good, bad, problematic?

27 PE Rubric in Progress: Grade 2 overhand throw. Looks good?

28 Music: Teacher and Student Instructions

29 (slide image only)

30 World Language Scoring Guide and Rubric

31 Table Talk: Curriculum Quality; Local Standards; Edit/discuss the DDM Proposal Form

32 2 DDMs per Educator
MEPID | Last Name | First Name | Grade/Dept. | DDM1 | DDM2 | DDM3 (optional)
| Jones | Brigit | ELA 6 | MCAS 6 Growth Score ELA (SGP) | ELA 6 DDM (writing to text) |
| Smith | Marion | 9-12 library | Library Search Tools DDM | Indirect: increase teachers who do research in the library |
| Watson | Elsbeth | 5 ELA team | Fountas and Pinnell DDM | MCAS 5 Growth Score | History Unit Exam DDM
| Holmes | Sharon | Grade 2 | Fountas and Pinnell DDM | Galileo DDM |
Ask for reports (accompanying the drafts of the assessments) using this form, grouped (for example) by: elementary; MS as departments; HS as departments; special education; district and K-12 departments (arts, music, PE, nurses, guidance, principals, APs, directors, coordinators). You will be able to identify confusions, and your leadership will be able to begin to plan for scheduling DDMs next year. A sketch of grouping this roster programmatically follows.
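Districts collecting this form electronically may want to tabulate it. Below is a minimal Python sketch, not part of the deck, that groups a roster like the one above by grade/department and flags educators who have not yet documented two DDMs; all field names and rows are illustrative.

```python
from collections import defaultdict

# Rows mirror the documentation form's columns (slide 32); values are samples.
roster = [
    {"last": "Jones", "first": "Brigit", "dept": "ELA 6",
     "ddms": ["MCAS 6 Growth Score ELA (SGP)", "ELA 6 DDM (writing to text)"]},
    {"last": "Smith", "first": "Marion", "dept": "9-12 library",
     "ddms": ["Library Search Tools DDM",
              "Indirect: increase teachers who do research in the library"]},
]

# Group educators by department for the leadership report.
by_dept = defaultdict(list)
for row in roster:
    by_dept[row["dept"]].append(row)

# Flag anyone who has not yet documented two DDMs.
for dept, rows in sorted(by_dept.items()):
    for row in rows:
        status = "OK" if len(row["ddms"]) >= 2 else "NEEDS 2nd DDM"
        print(dept, row["last"], status)
```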

33 Audit Assessments; Consider All Educators Audit assessments:  K-3 (no MCAS allowed)  4-8 (MCAS growth score, SGP)  HS: no MCAS for direct assessments (unless the teacher taught both grade 9 and 10 students)  Self-contained classrooms  IEP measures are acceptable only if they measure growth  Indirect measures: consider SMART goals or team goals as a start for ONE DDM Consider all educators:  K-3 teachers  4-5 or 6 elementary  Middle school hist/SS, world language, science, technology  High school singletons, changing populations of classes over semesters  Arts, music, PE, technology, media, library  Nurses, school psychologists, administrators, coordinators  Special education often has to be decided case by case because of the uniqueness of the programs and models used

34 Table Talk: June Reporting; Scheduling for Full Implementation

35 Calculating Cut Scores How piloting can help clarify standards  Pre-/post-test calculation  Rubric inaccuracy when translated to a percentage

36 What do I need to determine high, moderate, or low student growth? 1. Clear directions for scoring individual student work. 2. Clear directions for determining a student’s growth. 3. Parameters for high, moderate, and low student growth. These components may be done independently, or combined.

37 Who Scores the DDM? External:  Outside organizations  Commercial assessments  Automated methods  Paid raters (e.g., college students, retired teachers) Internal:  Teams of teachers (e.g., all 5th grade teachers)  Team members rate each other’s students’ responses  Multiple raters score each response  Individual teachers  Random auditing (rechecking) by a department head, administrator, or coach?

38 Scoring Guides Are Important  They make explicit the aspects of student work that are essential.  They can be shared with students (and parents) for the sake of transparency. Continuous improvement tip: plan a review of your scoring guides after the first year of implementation.

39 Scoring Guides Must be Clear (DESE)  Scoring Guide Example:  2 points for a correct answer with student work shown correctly  1 point for an incorrect answer with student work shown correctly  Issue:  Not clear about how to score a student with a correct answer with no student work shown or with student work shown incorrectly.  Not clear what “shown correctly” means.

40 Scoring Guides Must Be Clear  Scoring guide example, improved:  2 points for a correct answer with either a chart or table showing how the student set up the problem.  1 point for an incorrect answer where the work demonstrates setting up the problem with a table or picture; supporting work may include incorrect numbers or other mistakes.  1 point for a correct answer with no supporting work, or with work that is not organized in a table or chart.  0 points for an incorrect answer with work that is not organized in a table or chart.  The scoring guide could be further improved by incorporating anchor examples.
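To check that the four improved rules cover every combination of answer and work, they can be encoded directly; the sketch below is illustrative (the function and argument names are not from the deck):

```python
def score_response(answer_correct: bool, work_in_table_or_chart: bool) -> int:
    """Score one item under the improved guide on slide 40.

    2 = correct answer with supporting work in a chart or table
    1 = incorrect answer, but the work sets up the problem in a table/picture
    1 = correct answer with no supporting work (or work not in a table/chart)
    0 = incorrect answer and no organized work
    """
    if answer_correct and work_in_table_or_chart:
        return 2
    if answer_correct or work_in_table_or_chart:
        return 1
    return 0
```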

41 Cut Scores for L-M-H Growth The teacher’s score is based on the MEDIAN score of her class for each DDM. Left columns: each student’s pre-test, post-test, difference, and percentage growth (difference/pre). Right columns: the same raw-score differences and %diff/pre values sorted low to high, with candidate cut scores.
Pre | Post | Diff | % growth || sorted Diff | sorted %diff/pre | notes
20 | 35 | 15 | 75% || 5 | 20% | cut score? LOW growth
25 | 30 | 5 | 20% || 15 | 42% | bottom 20%
30 | 50 | 20 | 67% || 20 | 42% |
35 | 60 | 25 | 42% || 25 | 50% | moderate growth
35 | 60 | 25 | 42% || 25 | 60% | median teacher score
40 | 70 | 35 | 87% || 25 | 62% | median teacher score
40 | 65 | 25 | 62% || 25 | 67% |
50 | 75 | 25 | 50% || 30 | 70% |
50 | 80 | 30 | 60% || 35 | 75% | top 20%?
50 | 85 | 35 | 70% || 35 | 87% | cut score? HIGH growth
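A minimal Python sketch of this pipeline, using the pre/post columns from the table; the 20% cut points are illustrative, since the deck leaves the exact percentile (10%? 15%? 20%?) as a district decision, and the names are mine:

```python
from statistics import median

def growth_labels(pre, post, low_pct=0.20, high_pct=0.80):
    """Label each student's growth L/M/H from pre/post raw scores.

    Growth here is percent gain (difference / pre-test), as in the
    slide-41 table. Cuts at the bottom and top 20% are illustrative.
    """
    gains = [(p2 - p1) / p1 for p1, p2 in zip(pre, post)]
    ranked = sorted(gains)
    low_cut = ranked[int(low_pct * len(ranked))]
    high_cut = ranked[int(high_pct * len(ranked))]
    labels = ["L" if g < low_cut else "H" if g >= high_cut else "M"
              for g in gains]
    return gains, labels

pre  = [20, 25, 30, 35, 35, 40, 40, 50, 50, 50]
post = [35, 30, 50, 60, 60, 70, 65, 75, 80, 85]
gains, labels = growth_labels(pre, post)
print(labels)
# The teacher's rating uses the MEDIAN student gain, not the mean:
print(f"teacher score (median gain): {median(gains):.0%}")
```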

42 Pre-Test or Post-Test Rubrics Can Be Converted to % (Roobrix.com) Roobrix asks for: 1. the number of levels in the rubric; 2. the number of criteria in the rubric; 3. the minimum passing grade if the student did the work (very poorly); 4. the student’s score. In this example:  4 levels  2 criteria, Topic and Conventions, weighted 2:1 (so Topic counts as 2 of 3 criteria)  minimum passing grade of 50  student score: Proficient (3) on Topic, counted twice, and Advanced (4) on Conventions  Result = 89%
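The deck does not show Roobrix’s formula. A linear rescaling in which the lowest attempted score maps to the minimum passing grade and a perfect score maps to 100 reproduces the 89% result for this example; the sketch below assumes that mapping, and all names are illustrative:

```python
def rubric_to_percent(points, n_slots, levels=4, floor=50):
    """Map a rubric total onto a percentage (assumed Roobrix-style).

    n_slots = number of weighted criterion "slots": Topic counts twice
    because of the 2:1 weighting, plus Conventions, so 3 slots here.
    The minimum for attempted work (1 on every slot) maps to `floor`;
    the maximum (top level on every slot) maps to 100.
    """
    lo, hi = n_slots * 1, n_slots * levels
    return floor + (points - lo) / (hi - lo) * (100 - floor)

# Topic = Proficient (3) counted twice, Conventions = Advanced (4): 3+3+4 = 10
print(round(rubric_to_percent(10, n_slots=3)))  # -> 89
```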

43 Post-Test Minus Pre-Test  Can be raw scores or percentages of 100  Can be a percentage of improvement  Example parameters: +20 points is moderate (all students can be moderate), or +20% is moderate  Rank scores low to high  Take the top and bottom 10%? 15%? 20%?  Look at the student work  Decide on cut scores

44 Using Rubrics: Rubric Style Analytic:  Easier to achieve high levels of inter-rater reliability  Provides formative feedback to students about specific expectations Holistic:  The use of professional judgment supports conclusions

45 Create a Local “Growth” Rubric or Parameters Analytic (count per criterion): number of writing mechanics, such as punctuation, capitalization, or misspelled words, where the student has corrected the mistake in future writing: Low growth = 0, Moderate = 1, High = 2 or more; number of examples of improvement in language usage and sentence formation, such as word order, subject-verb agreement, or run-on sentences, where the student has corrected the mistake in future writing: Low = 0, Moderate = 1, High = 2 or more. Holistic (total corrected mistakes): Low growth = 0 to 3 (little to no improvement in following writing conventions), Moderate = 4-7 (average improvement), High = 8 or more (high improvement).

46 Pre-Test/Post-Test  Issue: floor and ceiling effects  A two-tiered test can address the ceiling: all students take Qs 1-20 for the pre-test; students who get the first 10 Qs correct take Qs 11-30 for the post-test  Adding additional easier items addresses the floor  Issue: is “gain” synonymous with “growth”?  Adding additional moderately challenging items to the pre- and post-test.
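A small sketch of the two-tier administration rule described above; the fallback form for students who miss any of the first ten items is an assumption, since the slide only says who moves up:

```python
def posttest_form(pretest_answers):
    """Choose the post-test form under the two-tier rule on slide 46.

    Everyone takes Qs 1-20 as the pre-test. Students who answer the
    first 10 questions correctly move up to Qs 11-30 for the post-test,
    which raises the ceiling; everyone else retakes Qs 1-20 (assumed).
    """
    mastered_easy_items = all(pretest_answers[:10])
    return range(11, 31) if mastered_easy_items else range(1, 21)
```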

47 Another Option: Progress Along a Scale Keep the descriptors, not the numbers. Scale progress on leveled readers or local descriptors (portfolio, stages of writing) along a 1-9 scale: an at-risk student moves from X to X+3 (high growth?); a grade-level student from X to X+2 (moderate?); a high achiever from X to X+1 (low?)  Describe what progress looks like  Is it meeting benchmark? Only for average students?  What then is moderate progress for struggling students?  What is moderate progress for high-performing students?  Use a “holistic” rubric or use a “scale”  DRA or Fountas and Pinnell example

48 Standard 4-Level Rubric: Not “Granular Enough”? Columns: Warning, NI, Proficient, Advanced; same prompt pre and post. An at-risk student moves pre → post (low growth?), an average student moves pre → post (moderate?), a high achiever moves pre → post (low?). With only four levels, real growth can register as no change in level at all.

49 AP Rubric of Rubrics: Prose Analysis (9 levels, holistic)
9-8: Answers all parts of the question completely, using specific evidence from the work and showing how that evidence is relevant to the point being made. Fashions a convincing thesis and guides the reader through the intricacies of argument with sophisticated transitions. Demonstrates clear understanding of the work and recognizes complexities of attitude/tone. Demonstrates stylistic maturity by an effective command of sentence structure, diction, and organization. Need not be without flaws, but must reveal an ability to choose from and control a wide range of the elements of effective writing.
7-6: Also accurately answers all parts of the question, but does so less fully or effectively than essays in the top range. Fashions a sound thesis. Discussion will be less thorough and less specific, not so responsive to the rich suggestiveness of the passage or precise in discussing its impact. Well written in an appropriate style, but with less maturity than the top papers. Some lapses in diction or syntax may appear, but demonstrates sufficient control over the elements of composition to present the writer’s ideas clearly. Confirms the writer’s ability to read literary texts with comprehension and to write with organization and control.
5: Discusses the question, but may be simplistic or imprecise. Constructs a reasonable if reductive thesis. May attempt to discuss techniques or evidence in the passage, but may be overly general or vague. Adequately written, but may demonstrate inconsistent control over the elements of composition. Organization is attempted, but may not be fully realized or particularly effective.
4-3: Attempts to answer the question, but does so either inaccurately or without the support of specific evidence. May confuse the attitude/tone of the passage or may overlook tone shift(s) or otherwise misrepresent the passage. Discussion of illustrations/techniques/necessary parts of the prompt may be omitted or inaccurate. Writing may convey the writer’s ideas, but reveals weak control over diction, syntax, or organization. May contain many spelling or grammatical errors. Essays scored 3 are even less able and may not refer to illustrations/techniques at all.
2-1: Fails to respond adequately to the question. May misunderstand the question or the passage. May fail to discuss techniques/evidence used or otherwise fail to respond adequately to the question. Unacceptably brief or poorly written on several counts. Writing reveals consistent weakness in grammar or other basic elements of composition. Although the response may make some attempt to answer the question, it has little clarity and only slight, if any, evidence in its support; significant problems with reading comprehension seem evident. Essays that are especially inexact, vacuous, and/or mechanically unsound should be scored 1.
0: A blank paper or one that makes no attempt to deal with the question receives no credit.
Rubric from Sharon Kingston

50 Weighted rubric arithmetic: 4 × 25 = 100; 4 × 22 = 88; 4 × 18 = 72; 4 × 15 = 60; criterion weights 25 + 22 + 18 + 15.
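This slide is image-only; the visible numbers suggest a 4-level rubric whose four criteria carry weights of 25, 22, 18, and 15. On that assumed reading, the arithmetic is:

```python
# Assumed reading of slide 50: a 4-level rubric with four criteria
# weighted 25, 22, 18, and 15, so a top score of 4 on each criterion
# is worth 100, 88, 72, and 60 points respectively.
weights = [25, 22, 18, 15]
scores  = [4, 4, 4, 4]          # a student scoring 4 on every criterion
points  = [w * s for w, s in zip(weights, scores)]
print(points)                              # [100, 88, 72, 60]
print(sum(points), "of", 4 * sum(weights))  # 320 of 320
```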

51 You Add a New Level to the Rubric: Keep L-M-H-Higher
First sort: Low = did not answer the multi-text question; Moderate = began to grapple with multiple texts; High = answered the question proficiently.
End-of-year sort: Low = did not answer at all; Moderate = began to grapple; High = answered the question; Higher = answered with insight and control.

52 Post-Test Only (local determination): use of the Precalculus grade to make predictions about the AP Calculus score.
Previous grade C: low growth = 1, moderate = 2-3, high = 4-5
Previous grade B: low growth = 1-2, moderate = 3, high = 4-5
Previous grade A: low growth = 1-3, moderate = 4, high = 5
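Encoded as a lookup table, the rule above can be applied mechanically at scoring time. A short sketch (names illustrative):

```python
# The slide-52 table as a lookup: previous Precalculus grade -> which
# AP Calculus scores count as low, moderate, or high growth.
BANDS = {
    "C": {"low": {1},       "moderate": {2, 3}, "high": {4, 5}},
    "B": {"low": {1, 2},    "moderate": {3},    "high": {4, 5}},
    "A": {"low": {1, 2, 3}, "moderate": {4},    "high": {5}},
}

def growth_rating(prev_grade: str, ap_score: int) -> str:
    """Return 'low', 'moderate', or 'high' for one student."""
    bands = BANDS[prev_grade]
    return next(label for label, scores in bands.items() if ap_score in scores)

print(growth_rating("B", 4))  # -> "high"
```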

53 Parameters for high, moderate, and low student growth.

54 Setting Parameters  Qualitative Approach  Engage Teachers  How much growth is high, moderate, or low?  “The Body of the Work” to Validate—real student work  Quantitative Approach  Historical data

55 Music Example: Quantitative Approach  Assume we agree to cut scores for high, moderate, and low growth.  How would these results from across the district inform future parameter setting? Pre-test: Low 16, Moderate 87, High 44. Post-test: Low 0, Moderate 12, High 135.
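One way such district-wide results inform parameter setting is to compare the share of students in each band before and after. The counts below follow the reading above, which splits the slide’s run-together digits so that both distributions total the same 147 students; that split is an assumption:

```python
pre  = {"low": 16, "moderate": 87, "high": 44}   # assumed reading of "168744"
post = {"low": 0, "moderate": 12, "high": 135}   # assumed reading of "012135"

for name, dist in (("pre", pre), ("post", post)):
    total = sum(dist.values())
    shares = {band: f"{count / total:.0%}" for band, count in dist.items()}
    print(name, shares)
# If nearly every student lands in "high" on the post-test, the cut
# scores are probably too lenient and should be moved up next cycle.
```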

56 Music Example: Validation  Tracey is a student who was rated as having high growth.  Janey had moderate growth.  Linda had low growth.  Investigate each student’s work:  effort  teachers’ perception of musical growth  other evidence of musical growth This is a psychometric process called “body of the work” validation.

57 Table Talk: Local Cut Scores High-Moderate-Low student growth. The teacher’s growth score is the median score for each class included in the DDM (not the average). Begin with the top and bottom 10%, then go to the “body of the work,” i.e., the student work. Rubrics and percentages.

58 Calibration Protocols Inter-rater reliability. The singleton teacher and fairness. Rhode Island: http://www.ride.ri.gov/Portals/0/Uploads/Documents/Teachers-and-Administrators-Excellent-Educators/Educator-Evaluation/Online-Modules/Calibration_Protocol_for_Scoring_Student_Work.pdf

59 Protocols for Calibrating Inter-Rater Reliability (see handout) START HERE: 1. Select a facilitator. 2. Individually sort your students’ work into High, Medium, and Low piles. 3. Individually describe the characteristics of each pile with a rubric or list of criteria. 4. As a group, share characteristics. 5. As a group, develop a rubric or list of criteria and (possibly) exemplars. 6. Score all assessments. 7. As a group, develop an action plan (next step) for each level and for all students. 8. Reflect on the process; share reflections. Modification for the DDM process: develop local cut scores; assign L-M-H growth to each student; sort student scores to each teacher; determine the median score as the teacher’s growth score.

60 MCAS Rubrics and Examples at All Levels: Grade 6 Math Brenda is making tree costumes for a play. The list below shows the amounts of the different colors of cloth Brenda will use to make one tree costume: __ yards brown cloth, __ yards orange cloth, __ yard yellow cloth. 1. What is the difference, in yards, between the amount of orange cloth and the amount of brown cloth that Brenda will use to make one tree costume? Show or explain how you got your answer. 2. Brenda plans to use brown cloth for the trunk and branches of the tree, and orange and yellow cloth for the leaves. What is the total amount of cloth, in yards, Brenda will use to make the leaves of one tree costume? Show or explain how you got your answer. 3. Brenda wants to make two tree costumes. What is the total amount of cloth, in yards, Brenda will use to make two tree costumes? Show or explain how you got your answer.

61 MA DESE Student Work at http://www.doe.mass.edu/mcas/student/2013/ Score of 2: The student response demonstrates a fair understanding of the Number and Operations–Fractions concepts involved in adding and subtracting fractions with unlike denominators, including mixed numbers, by replacing given fractions with equivalent fractions in such a way as to produce an equivalent sum or difference of fractions with like denominators. While some aspects of the task are completed correctly, others are not. The mixed evidence provided by the student merits 2 points.

62 K-5 Same Reading, Same Prompt; 6-12 Same Prompt http://www.achievethecore.org/page/507/in-common-effective-writing-for-all-students Grade 4 On-Demand Writing, Uniform Prompt: Which is Better as a Pet, a Dog or a Cat? (Student exemplar; errors as in the original.) Many people have a dog for a pet. Some people have cats. Wich is better? I say dog. Maybe you say cat. I just might be able to persaude you in the following. Dogs are great companions for lonely people. They can go for a rousing walk in the park, or a good long nap. Playing games of catch or fetch every day makes good fun. Even a jog on the hottest day could even be enjoyable too. Dogs don’t just provide fun though. They can also provide protection. Dogs are very intelligent. They can be trained to find people or save them. Some don’t even need to be trained. For instance, if someone is trying to break in, your dog might bark and scare them off. Dogs are great for many different reasons. Overall, dogs are awesome pets to have. Have I convinced you though? If you are, then great! If your not then thats okay. It's really up to you. So which one is it going to be? Annotations:  Introduces a topic clearly  States an opinion  Provides a concluding section related to the opinion presented  Creates an organizational structure in which related ideas are grouped to support the writer’s purpose  Provides reasons that are supported by facts and details  Links opinion and reasons using words and phrases

63 Annotated Exemplar ORQ: How does the author create the mood in the poem? Annotations mark the answer and explanation in the student’s words and the specific substantiation from the text.

64 Protocols for Looking at Student Work Developing descriptors, selecting exemplars, and the L-M-H protocol with student samples for training; or use the chocolate chip cookie exercise.

65 Looking at Student Work Protocol: Chocolate Chip Cookie Rubric Path 1: sort the work H-M-L, then create descriptors from the work (Low, Moderate, High). Path 2: start with a rubric (Advanced, Proficient, NI, Warning), sort the work, then decide what L-M-H is.

66 “Metacognitive Moment” What did the exercise demonstrate about:  inter-rater reliability  collaboration  changing rubrics  the same word meaning different things to different people  general words that make sense to you when you assess student work on your own, but may not mean the same thing to others (“insightful,” “strong argument”)  specifics (numbers): are they good? bad?  the need for exemplars

67 Chocolate Chip Cookie Holistic Rubric (copyright 2013 Ribas Associates)
4: Texture: crunchy outside; moist inside with slightly melting chocolate. Appearance: glossy from the use of real butter; fits in the palm of your hand; variation in shapes and modulation of surface, perhaps of gourmet or homemade origin; substantial chocolate chunks visible. Taste: cookie melts in your mouth and the chocolate is high quality, in chunks, possibly imported dark chocolate. Temperature: warm, just out of the oven and cooling on a rack.
3: Texture: cake-like; chocolate is at room temperature. Appearance: flat surface, slightly smaller than the palm of your hand; some variation in size and surface modulation indicates it may be homemade; some possible bites without chocolate; looks like commercial chocolate bits. Taste: cookie is mostly crunchy; the chocolate tastes like commercial chocolate chips. Temperature: room temperature.
2: Texture: somewhat dry. Appearance: flat surface; small, almost skimpy size; maybe from a cafeteria; many bites without chips. Taste: somewhat dry; chocolate is not of high quality and may be old. Temperature: cool, maybe from a cafeteria refrigerator or freezer.
1: Texture: dry, cold, crumbling. Appearance: porous, skimpy portion, uniform shapes, perhaps commercially made. Taste: dry, and the chocolate has a white coating as if it is old or of poor quality. Temperature: cold.

68 “Don’t let perfection get in the way of good.”

69 Potential as a Transformative Process When curriculum, instruction, or assessment is changed… Elmore, Instructional Rounds, and the principle that “the task predicts performance.” (Diagram: the Curriculum / Instruction / Assessment triangle.)

70 Team Time to Plan And to consult about or share solutions to unique district questions

71 Protocols for Calibration and for Looking at Student Work (LASW)  H-M-L in handout and on wiki  Print materials online:  Rhode Island student work: http://tinyurl.com/mbcppnm  Rhode Island holistic vs. criterion-referenced: http://tinyurl.com/nkqtdo5  Web site with detailed steps: http://www.lasw.org/protocols.html  Quality Performance Assessments: http://tinyurl.com/ls9jwo7

72 The Steps Necessary to Get Ready for the June Report and After (repeated from slide 4)
 Adapting present assessments; creating new assessments; writing to text for HS
 Developing and Piloting Assessments: alignment of content; rigorous and appropriate expectations; approval of assessments
 Assessing Quality and Rigor: security; calibration of standards and of assessors; rubric quality; analysis of results (High-Moderate-Low growth); piloting 2 DDMs per educator
 JUNE REPORT: directions for teachers; directions for students; organizing for the actual assessments; storing and tracking the information
 2015 Full Implementation: data storage; data analysis (L-M-H growth); roster verification; data team time; interpreting the results; student impact

73 Additional Juried Resources (MA DESE) on Wiki  New York State resources: EngageNY http://www.engageny.org/common-core-curriculum-assessments  New York City Common Core performance tasks, all content areas: http://schools.nyc.gov/Academics/CommonCoreLibrary/TasksUnitsStudentWork/  Music/Arts: Connecticut Student Performance Task Database at www.CTcurriculum.org, a tool that allows users to search a database of student performance tasks by grade level and content area; teachers create and upload student performance tasks aligned to state standards along with accompanying scoring rubrics and exemplars (http://www.ctcurriculum.org/search.asp)  Minneapolis (MN): literacy and numeracy assessments K-1 http://rea.mpls.k12.mn.us/uploads/kind_summary_fall_2010_with_2009_bench_2.pdf  Illinois: student growth portion of principal evaluation, all grades http://www.ilprincipals.org/resources/resource-documents/principal-evaluation/peac_prin_eval_model.pdf (pp. 5, 26)  Districts may want to look at the performance assessment work underway in several Massachusetts districts through the Quality Performance Assessment Initiative  Teaching Channel, a great source of videos of lessons aligned with the Common Core: https://www.teachingchannel.org/search?utf8=%E2%9C%93&q=common+core&commit=Search  Georgia Department of Education: math Common Core units https://www.georgiastandards.org/Common-Core/Pages/Math-K-5.aspx  Science: interactive version of the Next Generation Science Standards http://www.nextgenscience.org/next-generation-science-standards  Tri-State Rubric/EQuIP for quality curriculum units and materials, all major content areas: http://www.achieve.org/EQuIP  Protocols on wiki: http://www.ride.ri.gov/Portals/0/Uploads/Documents/Teachers-and-Administrators-Excellent-Educators/Educator-Evaluation/Online-Modules/Calibration_Protocol_for_Scoring_Student_Work.pdf  http://www.ride.ri.gov/Portals/0/Uploads/Documents/Common-Core/RIDE_Calibration_Process.pdf

