
1 Using MidYIS to Inform Teaching & Learning. CEM (NZ) Centre for Evaluation & Monitoring, College of Education. Dr John Boereboom, Director, Centre for Evaluation & Monitoring (CEM), University of Canterbury, Christchurch. John.boereboom@canterbury.ac.nz

2 The Centre for Evaluation and Monitoring. Established in 1999 and part of the University of Canterbury. Each year about 300 NZ schools and 70,000 students use our services. Evaluates and monitors the progress of students, subjects and schools to enable evidence-based decision making. Provides entrance tests in mathematics, English and reasoning skills for class placement and for identifying strengths and areas for development. Administers student attitude and engagement surveys to investigate factors that can affect progress.

3 Why use MidYIS? League tables compare schools unfairly and do not recognise the impact a school has on student progress. MidYIS offers a unique value-added analysis. Comprehensive online feedback that is easy to understand and share, e.g. at subject, HOD and staff meetings and with the BOT. Sound data for evidence-based educational decision making at the student, subject and school levels. An independent measure of school effectiveness. Based on international research adapted for NZ.

4 MidYIS Yr 9 (Middle Years Information System). Provides value-added measurement from the beginning of Year 9 to the end of Year 11 in about 20 different subject areas. Students sit a baseline MidYIS test at the beginning of Year 9. The first diagnostic feedback is then provided to schools, with targets based on national data from the previous year (see the sketch below). At the end of Year 11, CEM receives NCEA results from NZQA; the predicted scores are compared with the actual NCEA scores and the value added is calculated. Comprehensive online feedback is provided to analyse the performance of the individual student, the subject cohort and the school.
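The target-setting step can be illustrated with a short sketch. This is not CEM's actual algorithm; it simply assumes a linear relationship fitted to the previous year's national data, and all scores below are invented for illustration.

```python
# Sketch: derive Year 9 targets from the previous year's national data,
# assuming a simple linear relationship between the MidYIS baseline score
# and the later NCEA outcome. Hypothetical data, not CEM's algorithm.
import numpy as np

# Previous year's national cohort (invented values)
prev_midyis = np.array([85.0, 92.0, 100.0, 108.0, 121.0])
prev_ncea   = np.array([38.0, 45.0, 52.0, 60.0, 74.0])   # NCEA outcome measure

# Fit the national trend: predicted NCEA = slope * MidYIS + intercept
slope, intercept = np.polyfit(prev_midyis, prev_ncea, 1)

# Targets for this year's Year 9 students, based on their baseline scores
this_year_midyis = np.array([95.0, 110.0, 130.0])
targets = slope * this_year_midyis + intercept
print(dict(zip(this_year_midyis, targets.round(1))))
```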

5 What does MidYIS measure? Vocabulary: important to the prediction of all subjects, especially English, foreign languages and language-related subjects such as history. Mathematics: important for predicting mathematics and mathematics-related subjects such as design, technology, science and economics. Non-Verbal (Visual): how well students undertake tasks involving 3-D visualisation, spatial aptitude and pattern recognition; non-verbal skills are particularly important for predicting a range of subjects that includes mathematics, visual arts, drama and design. Skills (Accuracy): proofreading and perceptual speed and accuracy.

6 The first feedback. Scores are reported on a standardised scale with an average of 100 and a standard deviation of 15. A student performing exactly in line with the average of the students who sat the MidYIS 9 baseline test will have a score of 100; a score greater than 100 indicates that the student performed better than average, and a score less than 100 means the student performed below average. A score of 130 is significantly high (in the top 2.5% of students in MidYIS 9) and a score of 70 is significantly low (in the bottom 2.5%). Targets are based on what is expected of students if they perform similarly to students of similar ability from the previous year. A small sketch of the scale follows below.
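As a rough illustration of the 100/15 scale, the sketch below maps a hypothetical raw baseline score onto that scale and shows why 130 and 70 (two standard deviations from the mean) correspond to roughly the top and bottom 2.5% under a normal distribution. The raw score and cohort statistics are invented for the example.

```python
# Sketch of the standardised score scale: mean 100, standard deviation 15.
from statistics import NormalDist

MEAN, SD = 100, 15

def standardise(raw, cohort_mean, cohort_sd):
    """Map a raw baseline score onto the MidYIS 100/15 scale."""
    return MEAN + SD * (raw - cohort_mean) / cohort_sd

score = standardise(raw=72, cohort_mean=60, cohort_sd=10)   # -> 118.0
percentile = NormalDist(MEAN, SD).cdf(score) * 100
print(f"standardised score {score:.0f}, approx. percentile {percentile:.0f}")

# Significance bands quoted on the slide (about 2.3% under a normal curve)
print(NormalDist(MEAN, SD).cdf(70) * 100)         # share of students below 70
print((1 - NormalDist(MEAN, SD).cdf(130)) * 100)  # share of students above 130
```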

7 The second 'Value Added' feedback. Comprehensive online feedback is provided at the student, subject and school levels. (Image: http://www.atlanta.k12.ga.us)

8 How is the value-added score calculated? Year 11 NCEA results are combined into a single number (the NCEA discrimination score) using a simple algorithm and plotted against the MidYIS Year 9 results to establish a national regression line, which represents the predicted (expected) performance. The individual student's NCEA discrimination score is plotted on the same graph. The vertical distance between this score and the regression line represents the student's value-added score. Value-added scores are then standardised, as sketched below.
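A minimal sketch of this calculation, assuming an ordinary least-squares fit for the national regression line and a simple z-score standardisation of the residuals. The data, and the way NCEA results are combined into a single discrimination score, are hypothetical; CEM's actual algorithm may differ.

```python
# Sketch: value added as the standardised residual from a national
# regression of the NCEA discrimination score on the MidYIS Year 9 score.
import numpy as np

midyis = np.array([82.0, 95.0, 100.0, 104.0, 118.0, 126.0])  # Year 9 baseline
ncea   = np.array([40.0, 47.0, 55.0, 50.0, 66.0, 78.0])      # NCEA discrimination score

# National regression line = expected (predicted) performance
slope, intercept = np.polyfit(midyis, ncea, 1)
predicted = slope * midyis + intercept

# Raw value added = actual minus expected (vertical distance from the line)
residuals = ncea - predicted

# Standardised value-added scores (mean 0, standard deviation 1)
value_added = (residuals - residuals.mean()) / residuals.std()
print(value_added.round(2))
```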

9 The concept of Value Added: performance beyond expectation gives a positive value-added score, performance in line with expectation gives a zero value-added score, and performance below expectation gives a negative value-added score. (Diagram: http://www.slideshare.net/RobertLeneway/acl-curinstassess)

10 The Student Report

11 The Subject Report

12 The School Report

13 How can schools use the feedback? The comprehensive online feedback can be used to: identify student strengths and weaknesses and set realistic and motivational targets; identify students requiring support at the individual subject level or across a range of subjects; identify and share best practice and measure teachers' influence on the academic growth rates of students; identify high achievers or under-aspirers; inform departmental planning and the School Improvement Plan; determine where curriculum and instruction are having the greatest impact on student learning; and evaluate the effectiveness of programmes.

14 What about SES? Each student serves as his or her own control, which eliminates factors such as socio-economic status that may affect student learning outcomes. "Given similar student abilities, the choice of school a student attends is not among the more powerful influences on later achievement." Hattie, J. (2002). Schools Like Mine: Cluster Analysis of New Zealand Schools. Technical Report 14, Project asTTle, University of Auckland.

15 Other services provided by CEM: student attitude surveys covering student welfare and safety, non-academic activities, support, social and personal development, extent of bullying, extent of racism, and healthy lifestyles; school entrance tests in mathematics, English and reasoning skills; and YeLIS and SeLIS.

