Using the 2011 NECAP Science Results – New England Common Assessment Program
Welcome and RIDE Introductions

– Dr. Kevon Tucker-Seeley, Assessment Specialist – NECAP Manager, Office of Instruction, Assessment & Accountability
– Peter McLaren, Science and Technology Specialist, Office of Instruction, Assessment & Accountability
– Jennifer Golenia, Science and Technology Specialist, Office of Instruction, Assessment & Accountability
– Lindsay Wepman, Assessment Specialist, Office of Instruction, Assessment & Accountability

Measured Progress Introductions

NECAP Service Center:
– Elliot Dunn, NECAP Science Program Manager
– Harold Stephens, NECAP Program Director
– Carole Soule, NECAP Program Manager

Measured Progress Introductions

NECAP Service Center:
– Kellie Beaulieu, NECAP Program Assistant
– Mellicent Friddell, NECAP Program Assistant
– Alison Cady, NECAP Program Assistant

Goals of the Workshop

– Review the different types of NECAP Science reports and share state results
– Demonstrate a simple data analysis protocol using NECAP Science data
– Share some observations from the NECAP benchmarking process

Workshop Reference Materials

Types of NECAP Science Reports

– Student Report (Confidential): information for parents
– Item Analysis Report (Confidential): school level, by student
– Results Report (Public): school and district level
– Summary Report (Public): district/state level
– Student Level Data Files (Confidential): Excel/CSV files by grade on the district and school confidential site
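Because the student-level data files are delivered as Excel/CSV exports, a short script can summarize them locally. Below is a minimal sketch in Python's standard `csv` module; the column names (`StudentID`, `Grade`, `AchievementLevel`) and the sample rows are hypothetical, not the actual NECAP file layout.

```python
import csv
import io

# Hypothetical sample of a student-level CSV export; the column names
# below are illustrative only, not the actual NECAP file layout.
sample = io.StringIO(
    "StudentID,Grade,AchievementLevel\n"
    "001,4,Proficient\n"
    "002,4,Partially Proficient\n"
    "003,4,Proficient with Distinction\n"
    "004,4,Substantially Below Proficient\n"
)

rows = list(csv.DictReader(sample))

# Count students at or above Proficient, then express as a percentage.
at_or_above = sum(
    1 for r in rows
    if r["AchievementLevel"] in ("Proficient", "Proficient with Distinction")
)
pct = 100.0 * at_or_above / len(rows)
print(f"{pct:.0f}% of students at or above Proficient")
```

For a real file, replace the in-memory sample with `open(path, newline="")` and adjust the column names to match the downloaded export.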

Accessing Your Confidential Reports

Accessing Your Confidential Reports

– Select “Interactive” to view Interactive Reports
– Select “Reports” to view Static Reports

NECAP Analysis and Reporting System (NARS) Account Creation Hierarchy

– District Administrator (Superintendent): this account is for the district-level user and allows access to all reports.
– School Level (Principal): this account is intended for the school principal. One principal account exists for each school, and the principal assigns all accounts for teachers within the school.
– Classroom (Teacher): these accounts are intended for the school's teachers, who will see only the students to whom they have been assigned by the principal.

Password assistance – use the following list to determine whom to contact for help with your user name and password:
– Superintendents: contact the NECAP Service Center
– Principals: contact the NECAP Service Center
– Teachers: contact the school principal

Using NECAP Science Data

Focus is improving student learning
– The goal is to increase student achievement
Engage in collaborative discourse about data
– Using relevant, timely data to influence educational decisions is considered a best practice
Discussion of data requires a structured approach
– Protocols, ground rules, and shared talk time
Remember, NECAP Science data is for assessing school-level achievement, NOT individual student achievement
– Look for trends and patterns

NECAP Science Data

Today we’ll take a look at:
– Demo District results
– Demo School results
– Test items and Item Analysis Reports
– State results

Using NECAP Science Data

Phases of Collaborative Inquiry (adapted from N. Love):
– Framing the Question
– Collecting Data
– Analyzing Data
– Organizing Data-Driven Dialogue
– Drawing Conclusions – Taking Action
– Monitoring Results

Using Data―A Simple Approach

1. Observe the data
– Look at the data with your table.
– Make objective statements about the data. Avoid statements like, “It seems…” or “I think…”
– What do you notice?

Using Data―A Simple Approach

2. Discuss
– Talk with your colleagues at your table.
– Why do you think the data might look this way?

Using Data―A Simple Approach

3. Action steps
– What are some other sources that could confirm or refute this data?
– What does the data mean for my school regarding curriculum and instruction?
– How can I dig deeper?

Using Data―A Simple Approach

4. Reflection (time permitting)
– What did you learn from this activity?
– Are there new things that you’d like to bring back to your school?

Tying it Together

Limitations:
– This is one source of data; you can’t make major programmatic changes from this alone.
– Look to other sources of data to confirm or refute.
– Think about curricular and instructional action steps.

Look for professional development around data use in the 2012–13 and 2013–14 school years.

Percentage of Rhode Island Students by Achievement Level

[Table: science results for grades 4, 8, and 11 across the 2008-09, 2009-10, and 2010-11 administrations. Columns: Grade, Year, Subject, Substantially Below Proficient (SB), Partially Proficient (PP), Proficient (P), Proficient with Distinction (PwD), P+PwD, Change (CHG), Average Scaled Score. The numeric values are not recoverable from this transcript.]

10/11 Comparison of NECAP States by Achievement Level

[Table: 2010-11 science results for RI, NH, and VT at grades 4, 8, and 11, using the same achievement-level columns as the previous slide. The numeric values are not recoverable from this transcript.]

Note: Vermont’s data is embargoed until 9/28. RI will do a public release on 9/27.

Benchmarking Short Answer and Constructed Response Items

Short-answer items receive a score from zero to two. Constructed-response items receive a score from zero to three or zero to four. A score of zero is assigned when a student produces some work but it is entirely wrong or irrelevant, or when the student leaves the item blank. For purposes of aggregating item results, blanks and zeros both count as zero points toward a student’s score.
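The aggregation rule above (blanks and zeros both contribute zero points) can be sketched in a few lines of Python. This is an illustrative helper, not code from the NECAP program; `None` is used here as a stand-in for a blank response.

```python
def aggregate_item_scores(responses):
    """Mean item score where blank responses (represented as None)
    count as zero points, per the aggregation rule described above."""
    scores = [0 if r is None else r for r in responses]
    return sum(scores) / len(scores)

# Constructed-response item scored 0-4; None marks a blank response.
mean_score = aggregate_item_scores([4, 3, None, 0, 2])
print(mean_score)  # 1.8
```

Note that a blank and a scored zero are indistinguishable in the aggregate, even though they may be coded differently at the response level.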

Preparation for Benchmarking

The work in preparation for scoring student responses included:

Development of scoring guides (rubrics)
– Content specialists from the NH, RI, and VT Departments of Education
– Measured Progress’s test developers

Selection of “benchmark” responses
– Examples of student work at different score points for each item
– Used in training and in continuous monitoring of scorer accuracy

Scorer Training

Scorer training consisted of:
– Review of each item and its related content and performance standard
– Review and discussion of the scoring guide and multiple sets of benchmark responses for each score point
– Qualifying rounds of scoring in which scorers needed to demonstrate a prescribed level of accuracy

NECAP Benchmarking Process

Sample responses pulled
– The lead scorer chooses samples of responses to open-response items (constructed-response or short-answer).
– Sample responses are chosen to be representative of all responses.
– Sample responses are initially scored based on the rubric for the item.

Science specialists
– Science specialists from the NECAP states review the sample items and come to consensus on a score based on the rubric.
– “Anchor” papers are selected for scorers to use as a guide when assessments are officially scored.
– Science specialists also review rubrics to help provide clarity for scorers.

Scorers
– All scorers are trained and tested (a 10-response qualifying round).
– Scorers have to pass the test in order to move on to officially scoring items.
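The qualifying step above amounts to checking a scorer's agreement with the consensus ("anchor") scores over a 10-response round. A minimal sketch, assuming exact-match agreement and an illustrative 80% pass threshold (the actual prescribed accuracy level is set by the program and is not stated in this presentation):

```python
def passes_qualifying(scorer_scores, anchor_scores, threshold=0.8):
    """Return True if the scorer's exact-agreement rate with the
    anchor (consensus) scores meets the threshold. The 0.8 default
    is illustrative, not the program's actual requirement."""
    matches = sum(s == a for s, a in zip(scorer_scores, anchor_scores))
    return matches / len(anchor_scores) >= threshold

anchor = [2, 1, 0, 3, 2, 1, 2, 0, 3, 1]   # consensus scores, 10-response round
scorer = [2, 1, 0, 3, 2, 1, 2, 1, 3, 1]   # one disagreement (90% agreement)
print(passes_qualifying(scorer, anchor))  # True
```

Real scoring programs often track adjacent agreement (scores within one point) separately from exact agreement; only exact matching is shown here.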

Lessons Learned from NECAP Benchmarking

What does it mean
– to identify?
– to describe?
– to explain?
How does this look across grade levels?
What does it mean to compare and contrast?

Lessons Learned – Grade 4

– What is a characteristic?
– What observations or information can be gained from charts and diagrams?
– What patterns can be distinguished from graphs?
– How can students better use evidence from data in their explanations?

Lessons Learned – Grade 8

– When should line graphs be used? When should bar graphs be used?
– How can students better support or refute their predictions/hypotheses using evidence from data?
– Why is it important to use multiple trials in an investigation?

Lessons Learned – Grade 11

– What are typical sources of error? How can they affect the outcomes of an investigation?
– How can skills in the analysis of data be demonstrated more deeply?
– By 11th grade, students should know about the use of proper units of measurement.

Involvement of Local Educators in NECAP Science

– Development of assessment targets
– Participation in the annual test item review committee and the bias and sensitivity review committee
– Use of classroom teacher judgment data
– Participation in standard-setting panels
– Technical Advisory Committee work

Preparing for 2012 NECAP Science

Use the data
– Share progress with parents
– Identify gaps in curriculum, instruction, and assessment

Anticipate types of accommodations
– Identify students in need of accommodations
– NimbleTools will be an option in spring 2012

Plan for administration now
– Review what went well and what did not go well during the 2011 administration
– Begin preparing your students
– Share the reference sheets, the Released Items documents, and “Preparing Students for NECAP: Tips for Teachers to Share with Students”

Questions?

Peter McLaren, Science and Technology Specialist, Office of Instruction, Assessment & Accountability
Jennifer Golenia, Science and Technology Specialist, Office of Instruction, Assessment & Accountability