Using Data to Promote Student Success: Analyzing and Interpreting EQAO Results

Guiding Principles:
- Large-scale assessment provides comparable year-to-year data on student achievement that can be helpful to schools and boards in improvement planning
- Results should be considered in conjunction with other school information related to student performance
- Each school, and each school community, is unique; results need to be interpreted in context
- It is important to use caution when interpreting percentage results where numbers are small

4 Cs of Interpreting Data
Are the data:
- Complete?
- Consistent?
- Comparative?
- Concealing?
Source: Assessment Training Consortium / Ottawa-Carleton District School Board

Complete
- Large-scale assessment is a snapshot
- Consider other information:
  - Classroom assessments and report cards
  - District assessments
  - Surveys
  - Demographic data
  - School characteristics

Consistent: Are there any surprises?
- Are results from different sources consistent (e.g., EQAO and report cards)?
- Are results consistent with expectations?
- Consider consistency with respect to:
  - Individual students
  - School results
  - Past experience

Comparisons
- Norms (school vs. provincial data)
- Similar schools
- Curriculum standard
- Sub-groups (e.g., boys/girls, different programs)
- Trends over time
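The comparisons above all reduce to looking at differences between a school's results and some reference point. A minimal sketch, using invented percentages rather than real EQAO data, of comparing school results against provincial norms and across sub-groups:

```python
# Hypothetical percentages of students at or above the provincial
# standard. Group names and all numbers are invented for illustration.
school = {"overall": 58, "boys": 52, "girls": 64}
province = {"overall": 61, "boys": 57, "girls": 65}

for group in school:
    gap = school[group] - province[group]
    print(f"{group}: school {school[group]}%, province {province[group]}%, gap {gap:+d}")
```

The same gap calculation applies to comparisons with similar schools or with the school's own prior years (trends over time); only the reference dictionary changes.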

Concealing
- Look at the distribution of scores
- Averages may be misleading, especially if numbers are small
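The point about averages concealing the distribution can be shown with a short sketch. The scores and the level cut-offs below are invented for illustration, not actual EQAO levels:

```python
from collections import Counter
from statistics import mean

# Two hypothetical classes with the same average but very different
# distributions: the average alone conceals the spread.
class_a = [70, 70, 70, 70, 70]   # everyone near the middle
class_b = [50, 50, 70, 90, 90]   # wide spread, same average

print(mean(class_a), mean(class_b))  # both 70

def level(score):
    # Illustrative cut-offs only, not real EQAO scoring rules.
    if score >= 80: return "Level 4"
    if score >= 70: return "Level 3"
    if score >= 60: return "Level 2"
    return "Level 1"

print(Counter(level(s) for s in class_a))  # all Level 3
print(Counter(level(s) for s in class_b))  # mix of Levels 1, 3 and 4
```

With only five students per group, a single student moving one level shifts the percentages by 20 points, which is why the slide warns against over-interpreting percentages when numbers are small.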

Contextual Data
- EQAO sources:
  - Demographics of eligible/participating students
  - Participation rates
  - Student Questionnaire responses
  - Principal Questionnaire responses
- Other sources:
  - Socio-economic data
  - Enrollment patterns
  - Mobility rates
  - Absenteeism rates

Contextual Data: Key Questions
- How do the current contextual and demographic data for your school compare with previous years?
- How does your school's demographic information compare with that of the board and the province?
- How do your school's exemption, deferral and absentee rates compare with the board and the province?
- Does your school have significant numbers of ESL/ELD or special education students?
- What do students report about computer access and use, and about reading and writing outside of school?

Achievement Data
- EQAO sources:
  - Detailed School Results
  - Item Information Reports
  - Student Questionnaire responses
  - Teacher Questionnaire responses
- Other sources:
  - Report card data
  - District assessments
  - Classroom assessments
  - Reading assessments
  - Diagnostic assessments

Achievement Data: Key Questions
- How do overall results compare over time?
- Is the pattern of results similar to that of the board or province?
- How do the overall results compare to those of similar schools?
- Are there differences in the performance of males and females?
- Is the performance of ESL/ELD students significantly different from that of other students?

Item Information Reports: Ideas for Reviewing Reports
- Identify the strong performance areas in the school: on which items did your school perform well?
- Identify the performance areas that require attention: which items presented a challenge for your students?
- Where do you expect the school to be in relation to the board and the province?
- Relate other data to these observations
- Consider what improvement strategies are indicated for the areas needing attention
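The item-review steps above can be sketched as a simple screen over item-level results. The item names, percentages and the 5-point threshold are all illustrative assumptions, not EQAO reporting rules:

```python
# Flag items where the school is noticeably above or below the
# province. All data and the threshold are invented for illustration.
items = [
    # (item, % correct at school, % correct provincially)
    ("Number Sense Q1", 82, 74),
    ("Measurement Q4", 55, 68),
    ("Geometry Q7", 71, 70),
]

THRESHOLD = 5  # percentage-point difference worth a closer look

for name, school_pct, prov_pct in items:
    diff = school_pct - prov_pct
    if diff >= THRESHOLD:
        print(f"Strength: {name} ({diff:+d} points vs. province)")
    elif diff <= -THRESHOLD:
        print(f"Needs attention: {name} ({diff:+d} points vs. province)")
```

Items flagged "needs attention" are the natural place to relate other data (report cards, classroom assessments) and to target improvement strategies.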

Examining your data:
- Involves a process of asking questions and searching for meaning
- Provides insight into strengths and weaknesses
- Is only one step in the process of school improvement