The New MCA-III Science Benchmark Reports for 2015 Minnesota Department of Education Science Assessment Specialists Jim Wood, Dawn Cameron

Similar presentations
Professional Learning at Thornlea February – June 2009.
PVAAS School Consultation Fall 2011 PVAAS Statewide Core Team
ReadiStep Summary of Answers and Skills (SOAS) Tutorial.
Elementary school teachers receive the least training in history content and instructional methods specific to social studies. Experienced teachers may.
SMART Goal # 1 – By June 2009, HPEDSB students will independently use the skill of making meaningful connections between information and ideas in a reading.
Five Fundamentals for School Success Consortium on Chicago School Research 2007 Individual Survey Reports Presented by Holly Hart.
Communicating through Data Displays October 10, 2006 © 2006 Public Consulting Group, Inc.
VERTICAL SCALING H. Jane Rogers Neag School of Education University of Connecticut Presentation to the TNE Assessment Committee, October 30, 2006.
Welcome to the TAYLOR ELEMENTARY SCHOOL Introduction to MCAS.
The New England Common Assessment Program (NECAP) Alignment Study December 5, 2006.
Review Planning Faribault Public Schools DATA DAY.
Use of Reading Data to Drive Instruction in District 200 April 13, 2012.
Building Effective Assessments. Agenda  Brief overview of Assess2Know content development  Assessment building pre-planning  Cognitive factors  Building.
Interpreting Assessment Results using Benchmarks Program Information & Improvement Service Mohawk Regional Information Center Madison-Oneida BOCES.
Out with the Old, In with the New: NYS Assessments “Primer” Basics to Keep in Mind & Strategies to Enhance Student Achievement Maria Fallacaro, MORIC
Science MCA-III Updates Minnesota Department of Education Science Assessment Specialists Jim Wood, Dawn Cameron
Jasmine Carey CDE Psychometrician Interpreting Science and Social Studies Assessment Results September 2014.
U-32 Spring 2013 NECAP Presentation March 27,
Read the Standards! Read the Standards! How do you teach the standards? Accessing and Using the MCA-III Math Data on the AIR Website January
Review and Validation of ISAT Performance Levels for 2006 and Beyond MetriTech, Inc. Champaign, IL MetriTech, Inc. Champaign, IL.
Mathematics Initiative Office of Superintendent of Public Instruction WERA OSPI Mathematics: Mathematics is a language and science of patterns.
1 There is no “quick fix” So, think crock pot, not microwave Strategies… Before that, we must say something very important!
Katie Bowler, Science and Tech/Eng Assessment Manager, Department of Elementary and Secondary Education.
1 Watertown Public Schools Assessment Reports 2010 Ann Koufman-Frederick and Administrative Council School Committee Meetings Oct, Nov, Dec, 2010 Part.
Melrose High School MCAS Presentation October 22, 2013.
Assessment Literacy Interim Assessment Kansas State Department of Education ASSESSMENT LITERACY PROJECT1.
Woodman PAT scores, Language Arts: Acceptable Standard / Excellence (Enrolled, Writers, Province writers, CBE writers).
Spring 2009 MCAS Results. Dedham Elementary Schools.
Scale Scoring A New Format for Provincial Assessment Reports.
End of Grade Test Understanding the Score Report.
USING GRAPHICAL DISPLAY by John Froelich A Picture is Worth a Thousand Words:
8th Grade Criterion-Referenced Math Test By Becky Brandl, Shirley Mills and Deb Romanek.
DEA/FCAT is YOUR game! You are the greatest teachers!
Strategies… Before that, we must say something very important! There is no “quick fix”  So, think crock pot, not microwave.
Using UDL to Set Clear Goals and Support Every Student’s Learning.
DVAS Training Find out how Battelle for Kids can help Presentation Outcomes Learn rationale for value-added progress measures Receive conceptual.
Writing Effective Recommendation Letters
Welcome to MMS MAP DATA INFO NIGHT 2015.
MAP: Measured Academic Progress© Parent Coffee February 10, 2010.
Understanding the 2015 Smarter Balanced Assessment Results Assessment Services.
2009 Report Card and TVAAS Update Recalibration 2009 October 26, 2009.
Understanding AzMERIT Results and Score Reporting An Overview.
NECAP Presentation for School Year March 26,
Using Data to Promote Student Success Analyzing and Interpreting EQAO Results.
The Normal Distribution and Norm-Referenced Testing Norm-referenced tests compare students with their age or grade peers. Scores on these tests are compared.
A Closer Look at CRCT Data Comparing LaBelle, Cobb County School District, and State Data LaBelle Elementary (544 students enrolled) Intended use for.
ReadiStep and PSAT/NMSQT Summary of Answers and Skills & Advantage: SAT PSAT/NMSQT.
Minnesota Assessments June 25, 2012 Presenter: Jennifer Dugan.
Minnesota Academic Standards and Graduation Requirements Assessment and Accountability Task Force Dr. Beth Aune July 31, 2012 “Leading for educational.
Developmental Reading Data Source: Local Banner Data (Open Close Report)
Granby Public Schools Annual Continuous Progress Review Presented by Diane Dugas Director of Curriculum September CMT Review.
Materials FCAT Achievement Levels Test Content and Format, pg. 12 (FCAT Skills Assessed) Examples of Reading Activities Across Cognitive Complexity Levels,
Lecturer - Dr Justin Rami Assessment & Feedback Lecturer – Dr Justin Rami.
NAEP What is it? What can I do with it? Kate Beattie MN NAEP State Coordinator MN Dept of Education This session will describe what the National Assessment.
About My Fitness Test Results UNIT 1 By _______________________ P5– Interpret their test results and personal level of fitness. M2 – Explain their test.
ACCESS for ELLs Score Changes
Examining Achievement Gaps
It Begins With How The CAP Tests Were Designed
Measures of Academic Progress (MAP) – Review, Reports, and Resources
Measures of Academic Progress (MAP) – Overview
2013 Wisconsin Health Trends: Progress Report
SAT/ACT Which test should you take?
Interpreting Science and Social Studies Assessment Results
Cardinal Convo April &
EPAS Educational Planning and Assessment System By: Cindy Beals
Scoring Open-Ended Items
Proposal for changes to KS3 Monitoring and Reporting
“Reviewing Achievement and Value-Added Data”
Presentation transcript:

The New MCA-III Science Benchmark Reports for 2015 Minnesota Department of Education Science Assessment Specialists Jim Wood, Dawn Cameron “Leading for educational excellence and equity. Every day for every one.”

What we want from the MCAs: “How are students doing on specific benchmarks?”

What we get from the MCAs: “Strand-Level Data”

Common Language Effect Size (CLES)
Effect size is a statistical method of summarizing group differences.
CLES in these graphs expresses the probability that a student selected at random from a school will receive a higher score on the item relative to the overall performance expected for that school.
For this comparison, the overall performance expectation for the school is placed at 0.50 on the CLES scale.
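To make the probability interpretation concrete, here is a minimal sketch of a common language effect size computation using the standard pairwise definition (ties counted as half). The sample scores and function name are invented for illustration; this is not MDE's scoring code.

```python
# Minimal CLES sketch: probability that a random draw from one group
# beats a random draw from another (hypothetical data, not MDE code).
from itertools import product

def cles(scores_a, scores_b):
    """P(a > b) for random draws a from scores_a and b from scores_b,
    with ties counted as half a 'win'."""
    pairs = list(product(scores_a, scores_b))
    wins = sum(a > b for a, b in pairs)
    ties = sum(a == b for a, b in pairs)
    return (wins + 0.5 * ties) / len(pairs)

# Hypothetical example: a school scoring above a reference group
school = [3, 4, 4, 5, 5]
reference = [2, 3, 3, 4, 4]
print(cles(school, reference))  # 0.8: the school "wins" 80% of pairings
```

On this scale, 0.50 means the two groups are indistinguishable, which is why the reports anchor the school's overall expectation there.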

Benchmark Reports

What's Different From Past Years
Focuses on variation within a school.
The dashed vertical line fixed at 0.50 CLES represents the overall performance expectation for the school.
The solid gray line represents the performance expected from students at the state mean in overall science ability.
Benchmark information should only be compared to the school expectation line fixed at 0.50 on the CLES axis.
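As a visual aid, the sketch below mocks up the layout just described: items plotted on a CLES axis, a dashed school expectation line at 0.50, and a solid gray state-mean line. All benchmark codes, CLES values, and band widths are invented placeholders; the real reports are produced by MDE, not by this code.

```python
# Mock-up of the benchmark report layout (all data invented for illustration).
import matplotlib.pyplot as plt

items = ["7.1.1.1.1", "7.1.3.4.1", "7.2.1.2.2", "7.4.2.1.3"]  # placeholder benchmark codes
cles_values = [0.44, 0.52, 0.58, 0.47]                        # placeholder item CLES
half_widths = [0.05, 0.06, 0.04, 0.05]                        # placeholder uncertainty bands
state_expectation = 0.55                                      # placeholder state-mean line

fig, ax = plt.subplots()
ax.errorbar(cles_values, range(len(items)), xerr=half_widths, fmt="o")  # items with bands
ax.axvline(0.50, linestyle="--", color="black", label="School expectation (0.50)")
ax.axvline(state_expectation, color="gray", label="State-mean expectation")
ax.set_yticks(range(len(items)))
ax.set_yticklabels(items)
ax.set_xlabel("CLES")
ax.set_title("Benchmark report (mock-up)")
ax.legend()
plt.show()
```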

Your School's Data
Initial impression
Identify strengths and weaknesses on the report
Questions

Interpreting the Benchmark Reports
It is important to frame any interpretation in the context of the school's environment. Experience with the science curriculum, instruction, and data from other classroom assessments is critical to making meaningful inferences from this report.

CAUTION! What we can say and cannot say about the data
All data are relative to the expected school average.
The location of items on the scale does not provide information about the difficulty of the individual test questions.
There may be more than one item assessing a particular benchmark.
The number of items on each report corresponds to the number of items on the assessment for each grade.
Color codes and the position of items in the graphs do not correspond to achievement levels.

Things to watch out for
Watch the dotted lines: different colors may overlap. Where there is more than 1/2 overlap, the items may be considered statistically equivalent.
Keep an eye on the CLES metric when comparing schools: the horizontal axis is adjusted to fit each individual school's data.
The new format does not allow for easy comparison across schools, but it supports deeper understanding within a school.
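One reading of the half-overlap rule of thumb above can be sketched as a quick check: compute what fraction of the narrower interval is covered by the overlap region. The interval endpoints and the threshold interpretation here are hypothetical illustrations, not an official MDE rule.

```python
# Sketch of the "more than 1/2 overlap" heuristic (hypothetical intervals).

def overlap_fraction(lo1, hi1, lo2, hi2):
    """Fraction of the narrower interval covered by the overlap region."""
    overlap = max(0.0, min(hi1, hi2) - max(lo1, lo2))
    narrower = min(hi1 - lo1, hi2 - lo2)
    return overlap / narrower

# Two items' CLES intervals (the dotted lines) from a hypothetical report
frac = overlap_fraction(0.42, 0.56, 0.48, 0.62)
print(frac)        # ~0.57
print(frac > 0.5)  # True: treat the items as statistically equivalent
```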

Digging into the Reports
Test Specifications: Test Design, Content Specifications
Look for patterns in the data: Do certain standards show a pattern within the year and over time?
Start asking questions: Classroom assessments. Does the data match what is happening in the classroom?

Why the Change?
Consistent with other content areas
Allows staff to quickly identify strengths and weaknesses within a school
Allows for more flexibility in test design

Old Format Benchmark Reports
Transition year:
Reports are available upon request from MDE.
Use caution when interpreting between the two report formats.

Test Development Process
Test Specs → Storyboard Writing → Storyboard Review → Item Writing → Item Review → Bias Review → Field Test Items → Data Review → Bias Review → Operational Item Pool → Operational Test → Achievement Levels

Get involved!
Assessment Advisory Panels are ongoing.
Register to be invited to panels at education.state.mn.us: Educator Excellence > Testing Resources > Register for Advisory Panels
