CEM (NZ) Centre for Evaluation & Monitoring, College of Education. Dr John Boereboom, Director, Centre for Evaluation & Monitoring (CEM), University of Canterbury.


Using MidYIS9 to inform Teaching & Learning. Dr John Boereboom, Director, Centre for Evaluation & Monitoring (CEM), University of Canterbury, Christchurch.

The Centre for Evaluation and Monitoring
Established in 1999 and part of the University of Canterbury. Each year about 300 NZ schools, and their students, use our services.
Evaluates and monitors the progress of students, subjects and schools to enable evidence-based decision making.
Provides entrance tests assessing mathematics, English and reasoning skills, for class placement and for identifying strengths and areas for development.
Administers student attitude and engagement surveys to investigate factors that can affect progress.

Why use MidYIS9?
League tables compare schools unfairly and do not recognise the impact a school has on student progress.
Unique value-added analysis.
Comprehensive online feedback that is easy to understand and share, e.g. in subject, HOD and staff meetings and with the BOT.
Sound data for evidence-based educational decision making at the student, subject and school levels.
An independent measure of school effectiveness.
Based on international research, adapted for NZ.

MidYIS Yr 9: Middle Years Information System
Provides value-added measurement from the beginning of Year 9 to the end of Year 11 in about 20 different subject areas.
Students sit a baseline MidYIS test at the beginning of Year 9. The first diagnostic feedback is then provided to schools, with targets based on national data from the previous year.
At the end of Year 11, CEM receives NCEA results from NZQA. The predicted scores are compared to the actual NCEA scores and the value added is calculated.
Comprehensive online feedback is provided to analyse the performance of: the individual student, the subject cohort, and the school.

What does MidYIS9 measure?
Literacy: important to the prediction of all subjects, especially English, foreign languages and language-related subjects such as history.
Numeracy: important for predicting mathematics and mathematics-related subjects such as design, technology, science and economics.
Reasoning skills: how well students undertake tasks involving 3-D visualisation, spatial aptitude and pattern recognition. Non-verbal skills are particularly important for predicting a range of subjects including mathematics, visual arts, drama and design.
Processing speed and accuracy: proof-reading and perceptual speed and accuracy.

The first feedback
MidYIS scores are standardised on a nationally representative sample to have a mean of 100 and a standard deviation of 15. The scores for each sub-test and the overall score are normally distributed.
The average pupil in the nationally representative sample will score 100. About 95% of pupils score within two standard deviations of the mean, which means 95% of pupils will score between 70 and 130.
Pupils scoring over 130 are in the top 2.5 percent of the national sample and meet the traditional definition of being gifted.
Pupils scoring less than 70 are in the lowest 2.5 percent of the national sample and may have special educational needs.
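As a quick sanity check on these bands, the mapping from a standardised score to a national percentile can be sketched with Python's standard library. The banding interpretation is the slide's; the code is just a normal-distribution lookup for a mean of 100 and a standard deviation of 15:

```python
from statistics import NormalDist

# MidYIS standardised score distribution: mean 100, SD 15
midyis = NormalDist(mu=100, sigma=15)

def percentile(score: float) -> float:
    """Percentage of the national sample scoring below this score."""
    return 100 * midyis.cdf(score)

print(round(percentile(130), 1))  # ~97.7: roughly the top 2.5% score above 130
print(round(percentile(70), 1))   # ~2.3: roughly the bottom 2.5% score below 70
print(round(percentile(100), 1))  # 50.0: the average pupil
```

A score of 130 sits two standard deviations above the mean, which is where the "top 2.5 percent" cut in the slide comes from.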

Interpreting the IPR
Information from the Individual Pupil Record (IPR): overall the pupil is fairly average, but her mathematical ability is significantly lower than her Vocabulary (which is well above average). This might suggest some underachievement in mathematics and in parts of subjects where mathematical skills are required, notably science.
Her best score is in Non-verbal ability where, as in Vocabulary, she scores in Band A, the top 25% of all pupils nationally.
Her Skills score is roughly average and, although not significantly so, is lower than her Vocabulary score. This might affect her performance in examinations.

The second 'Value Added' feedback
Comprehensive online feedback is provided at the student level, the subject level and the school level.

How is the value added score calculated?
Year 11 NCEA results are combined into a single number (the NCEA discrimination score) using a simple algorithm and plotted against the MidYIS Year 9 results to establish a national regression line, which represents the predicted (expected) performance.
The individual student's NCEA discrimination score is plotted on the same graph. The vertical distance between this score and the regression line represents the value-added score.
Value-added scores are then standardised.
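The regression-and-residual steps above can be sketched numerically. This is a minimal illustration with made-up scores, not CEM's actual algorithm: the formula for the discrimination score and the standardisation CEM applies are not given here, so a plain least-squares line and z-scores stand in for them.

```python
import numpy as np

# Hypothetical cohort: baseline MidYIS (Year 9) scores and the
# NCEA discrimination scores (Year 11) derived from NCEA results.
midyis = np.array([85.0, 92.0, 100.0, 108.0, 115.0, 123.0])
ncea = np.array([48.0, 55.0, 60.0, 67.0, 71.0, 80.0])

# National regression line: expected NCEA score for a given MidYIS score.
slope, intercept = np.polyfit(midyis, ncea, 1)
expected = slope * midyis + intercept

# Raw value-added: vertical distance of each student from the line.
# Positive = above expectation, negative = below expectation.
raw_value_added = ncea - expected

# Standardise (z-scores here, as one simple choice of standardisation).
value_added = (raw_value_added - raw_value_added.mean()) / raw_value_added.std()
```

By construction the least-squares residuals sum to zero, so across the whole national cohort the value added averages out; individual students and schools then sit above or below that national expectation.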

The concept of Value Added
Beyond expectation: positive value-added.
In line with expectation: zero value-added.
Below expectation: negative value-added.

The Student Report

The Subject Report

The School Report

How can schools use the feedback?
The comprehensive online feedback can be used to:
Identify student strengths and weaknesses and set realistic and motivational targets;
Identify students requiring support at the individual subject level or across a range of subjects;
Identify and share best practice and measure teachers' influence on the academic growth rates of students;
Identify high achievers and under-aspirers;
Inform departmental planning and the School Improvement Plan;
Determine where curriculum and instruction are having the greatest impact on student learning;
Evaluate the effectiveness of programmes.

What about SES?
Each student serves as his or her own control, which eliminates factors such as socio-economic status that may affect students' learning outcomes.
"Given similar student abilities, the choice of school a student attends is not among the more powerful influences on later achievement." Hattie, J. (2002) Schools Like Mine: Cluster Analysis of New Zealand Schools, Technical Report 14, Project asTTle, University of Auckland.

Other services provided by CEM
Student attitude surveys: student welfare and safety, non-academic activities, support, social and personal development, extent of bullying, extent of racism, healthy lifestyles.
School entrance tests: mathematics, English and reasoning skills.
YeLIS and SeLIS.