
1 Introduction to CEM Secondary Pre-16 Information Systems Nicola Forster & Neil Defty Secondary Systems Programme Managers London, June 2011

2 Ensuring Fairness

3 Principles of Fair Analysis: 1. Use an Appropriate Baseline; 2. Compare ‘Like’ with ‘Like’; 3. Reflect Statistical Uncertainty

4 CEM Systems - Principle of Fair Analysis No. 1: Use an Appropriate Baseline

5 The Projects
MidYIS (KS3) - Years 7, 8 (+ additional) and 9: Computer Adaptive Baseline Test or paper-based test
Yellis (KS4) - Years 10 and 11: Computer Adaptive Baseline Test or paper-based test
Alis - Years 12 and 13: GCSE or Computer Adaptive Baseline Test (paper test also available); outcomes A / AS / BTEC / IB etc.
INSIGHT - combines curriculum tests with developed ability

6 The Assessments
MidYIS - Year groups 7, 8, 9; when: Term 1 + catch-ups; delivery: paper or computer adaptive, 1 session; includes: developed ability (vocabulary, maths, non-verbal, skills)
Yellis - Year groups 10, 11; when: Term 1 + catch-ups; delivery: paper or computer adaptive, 1 session; includes: developed ability (vocabulary, maths, non-verbal)
INSIGHT - Year group 9; when: 4-week testing window, mid April to mid May, + catch-ups; delivery: computer adaptive, 3 or 4 sessions; includes: curriculum-based reading, maths and science, plus attitudes and developed ability

7 The Assessments - INSIGHT curriculum-based assessment
Maths: Number & Algebra; Handling Data; Space, Shapes & Measures
Science: Biology; Chemistry; Physics; plus Attitudes to Science
Reading: Speed Reading; Text Comprehension; Passage Comprehension
Additionally for INSIGHT: Developed Ability (vocabulary, non-verbal, skills) and attitudinal measures

8 The Assessments - What is Computer Adaptive Assessment?
Questions adapt to the pupil
Efficient - no time wasting
Wider ability range
More enjoyable
Green
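The slide lists the benefits rather than the mechanism. As a minimal sketch of the general idea only (this is not CEM's algorithm; the item bank, scoring rule and step sizes below are invented for illustration), an adaptive test keeps picking the next question near the pupil's current estimated ability:

```python
# Minimal illustration of a computer-adaptive test loop (hypothetical,
# NOT the CEM algorithm): after each answer the next question is chosen
# close to the pupil's current estimated ability, so able pupils are not
# held back by easy items and weaker pupils are not swamped by hard ones.

# Hypothetical item bank: (question_id, difficulty on an arbitrary 0-100 scale)
ITEM_BANK = [(i, d) for i, d in enumerate(range(10, 100, 5))]

def run_adaptive_test(answer_fn, n_items=10, start_ability=50.0, step=8.0):
    """answer_fn(question_id) -> True/False for a correct/incorrect response."""
    ability = start_ability
    asked = set()
    for _ in range(n_items):
        # Pick the unused item whose difficulty is closest to the current estimate.
        qid, _diff = min((item for item in ITEM_BANK if item[0] not in asked),
                         key=lambda item: abs(item[1] - ability))
        asked.add(qid)
        correct = answer_fn(qid)
        # Move the ability estimate up after a correct answer, down after a
        # wrong one, taking smaller steps as the test progresses.
        ability += step if correct else -step
        step *= 0.8
    return ability

# Example: simulate a pupil who answers items below difficulty 60 correctly.
estimate = run_adaptive_test(lambda qid: ITEM_BANK[qid][1] < 60)
print(f"Estimated ability: {estimate:.1f}")
```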

9 Computer-adaptive vs Paper-based testing
Number of students per test session: computer adaptive - limited by the number of computers available (multiple testing sessions); paper-based - can test all students in a single session (in a hall or in form groups) or across more than one session
Cost: computer adaptive - roughly 30% cheaper than the paper-based test; paper-based - standard cost
Processing of baseline feedback: computer adaptive - available within a couple of hours of testing; paper-based - takes around 2-4 weeks for papers to be marked
Preparation: computer adaptive - must be able to install the software or access the internet version of the test; paper-based - no pre-test set-up
Student experience: computer adaptive - “tailored” assessment; paper-based - all students see all questions, irrespective of suitability

10 The Analysis

11 Linear Least Squares Regression (Subject X)
Scatter plot of outcome against baseline score (roughly 50-150), with negative value added below the line and positive value added above it
Regression line (trend line, line of best fit): Outcome = gradient × baseline + intercept
Correlation coefficient (~0.7)
Residuals
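For concreteness, here is a minimal sketch of the least-squares fit and residuals described on this slide, using made-up baseline scores and outcomes (the numbers are illustrative, not CEM data):

```python
# Least-squares regression of outcome on baseline, plus residuals,
# using illustrative made-up data (not CEM data).

baseline = [85, 92, 100, 104, 110, 118, 125, 131]    # e.g. standardised baseline scores
outcome  = [3.1, 4.0, 4.6, 5.2, 5.1, 6.3, 6.8, 7.4]  # e.g. GCSE points in Subject X

mean_x = sum(baseline) / len(baseline)
mean_y = sum(outcome) / len(outcome)

# Gradient and intercept of the line of best fit: outcome = gradient * baseline + intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(baseline, outcome))
sxx = sum((x - mean_x) ** 2 for x in baseline)
gradient = sxy / sxx
intercept = mean_y - gradient * mean_x

# Correlation coefficient (the slide quotes roughly 0.7 for real data)
syy = sum((y - mean_y) ** 2 for y in outcome)
r = sxy / (sxx ** 0.5 * syy ** 0.5)

# Residual = achieved outcome minus the outcome predicted from the baseline
predicted = [gradient * x + intercept for x in baseline]
residuals = [y - p for y, p in zip(outcome, predicted)]

print(f"outcome = {gradient:.3f} * baseline + {intercept:.2f},  r = {r:.2f}")
print("residuals:", [round(res, 2) for res in residuals])
```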

12 Making Predictions (Subject X)
Baseline axis: e.g. MidYIS, INSIGHT or Yellis standardised scores (roughly 50-150); outcome axis: e.g. GCSE grades A* to U

13 Some Subjects are More Equal than Others… Principle of Fair Analysis No. 2: Compare ‘Like’ with ‘Like’. 1.5 grades’ difference!

14 The Assessments

15 Developed Ability - Maths

16 Developed Ability - Vocabulary

17 Developed Ability - Non-verbal

18 Developed Ability - Skills

19 INSIGHT - Maths

20 INSIGHT - Science

21 INSIGHT - Reading

22 Baseline Assessment and Predictive Feedback

23 Baseline Feedback
Nationally-standardised feedback: How did your pupils perform on the assessment? What strengths and weaknesses do they have? As a group, how able are they?
Predictions: Given their performances on the test, how well might they do at KS3 or GCSE?

24 Baseline Feedback
Feedback can be used at the pupil, class and cohort level:
- to guide individuals
- to monitor pupil progress
- to monitor subject-wide and department-level progress
It serves classroom teachers, head teachers and SMT as a quality-assurance tool. Data can be aggregated at other levels, and we support and provide software tools to help schools do this, e.g. the Paris software.

25 Baseline Feedback – Test Scores

26
· National mean = 100, standard deviation = 15
· 4 performance bands: A, B, C, D
· Averages and band profiles for the cohort
· 95% of scores lie between 70 and 130
· No ceiling at 130+ or floor at 70
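As a sketch of what these standardised scores mean in practice, the snippet below maps raw test scores onto the mean-100, SD-15 scale and assigns bands A to D. The raw scores, the national mean/SD and the treatment of the bands as quartiles of the national distribution are illustrative assumptions, not CEM specifications:

```python
# Convert raw scores to nationally standardised scores (mean 100, SD 15)
# and assign performance bands A-D. Illustrative sketch only; the quartile
# band boundaries and the national figures used here are assumptions.

def standardise(raw_scores, national_mean, national_sd):
    """Map raw scores onto the mean-100, SD-15 scale (no floor or ceiling applied)."""
    return [100 + 15 * (x - national_mean) / national_sd for x in raw_scores]

def band(score):
    # If bands A-D are quartiles of a normal distribution with mean 100 and SD 15,
    # the cut points fall at roughly 110, 100 and 90.
    if score >= 110: return "A"
    if score >= 100: return "B"
    if score >= 90:  return "C"
    return "D"

raw = [34, 41, 45, 50, 52, 55, 61, 68]                 # made-up raw test scores
scores = standardise(raw, national_mean=50, national_sd=10)

for r, s in zip(raw, scores):
    print(f"raw {r:>2} -> standardised {s:5.1f} -> band {band(s)}")

# About 95% of standardised scores fall between 70 and 130 (mean ± 2 SD).
```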

27 Baseline Feedback – Band Profile Graphs

28 Baseline Feedback - Gifted Pupils (standardised test score)

29 Baseline Feedback - Individual Pupil Recordsheets (IPRs)

30 Predictive Feedback
Predictions… the average performance by similar pupils in past examinations

31 Predictive Feedback

32

33 Predictive Feedback - Chances Graphs
English Language: percentage chance of each GCSE grade (U to A*) by baseline band
Band D: U 2, G 10, F 23, E 31, D 24, C 9, B 1, A 0, A* 0
Band C: U 1, G 2, F 6, E 20, D 35, C 29, B 7, A 1, A* 0
Band B: U 0, G 0, F 2, E 7, D 24, C 40, B 21, A 5, A* 0
Band A: U 0, G 0, F 0, E 1, D 8, C 26, B 35, A 23, A* 7

34 Predictive Feedback - Individual Chances Graphs
30% chance of a grade D - the most likely single grade; 70% chance of a different grade
Point prediction = 3.8
Chances graphs are based on the pupil’s actual test score, NOT the band
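A point prediction like the 3.8 quoted above is simply the probability-weighted average of the grade points. The sketch below shows the arithmetic; the chances percentages are made up (chosen so that grade D is the single most likely outcome, as on the slide), and the points mapping A* = 8 down to U = 0 is an assumption rather than something taken from CEM documentation:

```python
# Point prediction from an individual pupil's chances distribution (illustration).
# The percentages are made up; the A* = 8 ... U = 0 points mapping is an assumption.

chances = {"U": 1, "G": 3, "F": 11, "E": 25, "D": 30, "C": 20, "B": 8, "A": 2, "A*": 0}
points  = {"U": 0, "G": 1, "F": 2, "E": 3, "D": 4, "C": 5, "B": 6, "A": 7, "A*": 8}

# Point prediction = probability-weighted average of the grade points.
point_prediction = sum(points[g] * pct for g, pct in chances.items()) / 100

most_likely = max(chances, key=chances.get)
print(f"Most likely single grade: {most_likely} ({chances[most_likely]}% chance)")
print(f"Chance of a different grade: {100 - chances[most_likely]}%")
print(f"Point prediction: {point_prediction:.1f}")   # ~3.8 on this made-up distribution
```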

35 Chances Graphs
The chances graphs show that, from almost any baseline score, students can end up with almost any grade; there are just different probabilities for each grade depending on the baseline score. In working with students these graphs are more useful than a single predicted or target grade. Chances graphs serve as:
- a warning for top-scoring students and
- a motivator for low-scoring students

36 Value Added Feedback

37 For each subject, the feedback answers two questions: Given their abilities, have pupils done better or worse than expected? Can we draw any conclusions at the department level?

38 Value Added Feedback
For each pupil in each subject:
- Raw residual = achieved - predicted
- Standardised residual - allows fair comparison between different subjects and years
At subject level:
- Confidence bounds are narrower with more pupils
- If the average standardised residual lies within the bounds you cannot draw any conclusions
- If the average standardised residual lies outside the bounds you can be confident that something significant is happening in that subject
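As a sketch of the arithmetic behind the list above, the snippet below turns raw residuals into standardised residuals and compares the subject average against ±2 and ±3 standard-error bounds. The data and the exact formulas (using the sample spread of residuals and 1/√n standard errors) are a plausible illustration of the slide, not CEM's published method:

```python
# Standardised residuals for one subject and the 95% / 99.7% bounds on their
# average. Illustrative sketch only; CEM's exact method may differ.
import math
import statistics

achieved  = [5.2, 4.8, 6.1, 3.9, 5.5, 6.4, 4.2, 5.0]   # made-up GCSE points
predicted = [5.0, 5.1, 5.6, 4.4, 5.2, 6.0, 4.6, 4.9]   # made-up predictions from the baseline

# Raw residual = achieved - predicted, for each pupil.
raw_residuals = [a - p for a, p in zip(achieved, predicted)]

# Dividing by the spread of residuals lets subjects and years be compared fairly.
# Here the sample standard deviation stands in for the national figure.
resid_sd = statistics.stdev(raw_residuals)
std_residuals = [r / resid_sd for r in raw_residuals]

n = len(std_residuals)
avg = statistics.mean(std_residuals)
stderr = 1 / math.sqrt(n)          # bounds get narrower as the number of pupils grows

for k, label in ((2, "95%"), (3, "99.7%")):
    bound = k * stderr
    verdict = "outside" if abs(avg) > bound else "within"
    print(f"average standardised residual {avg:+.2f} is {verdict} the {label} bounds ±{bound:.2f}")
```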

39 Value Added Feedback - Principle of Fair Analysis No. 3: Reflect Statistical Uncertainty
Burning question: What is my value-added score? Better question: Is it important?

40 Value Added Feedback - Scatter Plot (GCSE English: GCSE points equivalent against baseline score)
Look for patterns… General under- or over-achievement? Do any groups of students stand out? High ability vs low ability? Male vs female?

41 Value Added Feedback - Year 7 Pupil-Level Residuals to GCSE

42 Value Added Feedback - Standardised Residuals Graph
Standardised residuals are shown with confidence limits at 2 (95%) and 3 (99.7%) standard deviations. Standardised residuals can be compared fairly between subjects and over years.

43 Value Added Feedback - Statistical Process Control (SPC) Chart Subject: X

44 Attitudinal Surveys

45 Attitudinal Feedback (key: your data is above the average; your data is below the average; your data is about the same as the average)

46 Attitudinal Feedback

47 Secondary Pre-16 Contact Details
Tel: 0191 334 4255
Email: secondary.support@cem.dur.ac.uk
Web: www.cemcentre.org

