
1 Advanced Users – MidYIS, Yellis & ALIS
Durham 2013

2 Understanding the Students Introduction to the Test Data

3 Underlying Principle
If we measure a student's ability we can determine 'typical progress' for that individual, use it to inform likely outcomes, and measure the performance of individuals and groups against it.
Q. How does this work?
Q. How do we measure and interpret 'ability'?
Q. How do we interpret the data fairly and reliably?

4 Measuring and Interpreting Ability

5 Options
Use pre-existing qualification data: Post-16 – Average GCSE
Use a baseline test: Post-16 and Pre-16 – Computer Adaptive Baseline Test (CABT)
Note: Issues regarding use of the CABT alongside Average GCSE at Post-16 will be examined later in the day with predictive information.

6 Adaptive approach
(Diagram: the test adapts item difficulty between Low, Average and High.)

7 Baseline Test Standardisation
Test scores are standardised: mean = 100, SD = 15.
Standardised Score | National Percentage | Comment
>130 | Top 2.5% | Traditional classification of 'mentally gifted'
>120 | Top 10% |
>100 | Top 50% |
<80 | Bottom 10% |
<70 | Bottom 2.5% | Potential special educational needs?
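Since the scores are standardised to a normal distribution (mean 100, SD 15), the table's national percentages can be recovered with a few lines of code. A minimal sketch in Python; `percentile` is an illustrative helper, not part of any CEM software:

```python
from math import erf, sqrt

def percentile(standardised_score, mean=100.0, sd=15.0):
    """Percentage of a normal cohort (mean 100, SD 15) scoring below this score."""
    z = (standardised_score - mean) / sd
    # Normal CDF via the error function, scaled to a percentage.
    return 100.0 * 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

A score of 130 sits at roughly the 98th percentile (the slide rounds the top tail to 2.5%), and 70 at roughly the 2nd.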

8 Stanines, Bands and Percentiles
(Diagram: normal curve of standardised test scores from 50 to 150, labelled Below Average / Average / Above Average, with percentile markers from 1 to 99.)
Stanine:  1   2   3   4   5   6   7   8   9
Percent: 4%  7% 12% 17% 20% 17% 12%  7%  4%
Bands (25% each): A (top), B, C, D (bottom). SEN?? at the low extreme; G & T?? at the high extreme.
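The stanine and quartile-band boundaries above can likewise be derived from the standardised scale. A hedged sketch, assuming the conventional half-SD stanine cut points and 25% quartile bands; the function names are invented for illustration:

```python
from math import erf, sqrt

def percentile(score, mean=100.0, sd=15.0):
    """Normal-CDF percentile for a standardised score (mean 100, SD 15)."""
    z = (score - mean) / sd
    return 100.0 * 0.5 * (1.0 + erf(z / sqrt(2.0)))

def stanine(score):
    # Conventional stanine cut points sit at half-SD steps from z = -1.75
    # to z = +1.75, i.e. standardised scores 73.75, 81.25, ..., 126.25.
    cuts = [73.75, 81.25, 88.75, 96.25, 103.75, 111.25, 118.75, 126.25]
    return 1 + sum(score > c for c in cuts)

def band(score):
    # Quartile bands, 25% each: A = top quarter ... D = bottom quarter.
    # The 50th percentile falls exactly on the B/C boundary and is put in B here.
    p = percentile(score)
    return "A" if p >= 75 else "B" if p >= 50 else "C" if p >= 25 else "D"
```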

9 Cohort Ability Intake Profiles

10 Intake Profiles

11 Intake Profiles (Historical)

12 Student Ability IPRs

13 Individual Pupil Record Sheets (IPRs)
Look for sections that are inconsistent

14 Two students Same Ability Different Profiles

15 General IPR Patterns www.cem.org/midyisiprbooklet
Pupils with high scores across all components
Pupils with low scores across all components
Pupils with significant differences between one or two components:
Vocab lower than others / Vocab higher than others
Maths higher than others / Maths lower than others
Non-Verbal higher than others / Non-Verbal lower than others
Low Skills / High Skills

16 Vocab significantly lower than other sections
English as a second language? Understanding the language used in learning and assessment? Language enrichment?

17 Vocab significantly higher than other sections
Good communicator? Work in class may not be to this standard.
Weak Non-Verbal, weak Maths, weak Skills (speed of working?)
Many benefit from verbal descriptors?

18 Maths significantly higher than other sections
Strong Maths ability
Not 100% curriculum-free: may depend on prior teaching effectiveness
Far East influence?

19 Maths significantly lower than other sections
Implications not just for Maths but for other numerate or data-based subjects
General poor numeracy? Remedial Maths?

20 Non-Verbal significantly higher than other sections
Good spatial and non-verbal ability; may have high specific skills
Low Vocab, Maths & Skills may indicate difficulty communicating
Frustration?

21 Non-Verbal significantly lower than other sections
Difficulty understanding diagrams or graphical instructions?
Verbal explanation? Physical demonstration? Physical models?

22 Low Skills Scores
Skills = Proof Reading and Perceptual Speed & Accuracy: speed of working.
Work well in class / homework but underachieve in exams?
Problems checking work or decoding questions?
Low Skills + Low Vocab: poor written work in class (unable to work quickly). Dyslexia? Further specialist assessment required.

23 High Skills Scores
Skills = Proof Reading and Perceptual Speed & Accuracy.
Can work quickly and accurately.
Difficulty communicating and expressing ideas?
May perform poorly in areas using numeracy skills and in subjects needing 3D visualisation and spatial concepts?
May struggle in most areas of the curriculum.

24 Working with Individual Pupil Records (IPRs)

25 Objectives
To gain understanding of interpreting IPRs
To share strategies for supporting individual pupils
Strategy
Look first at interpretation of MidYIS IPRs
Exercises with MidYIS IPRs
Use the generic patterns in exercises on ALIS, Yellis and INSIGHT IPRs, though there are slight differences

26 What does the MidYIS test measure?
Vocabulary: Most culturally linked. Affects all subjects but most important in English, History and some Foreign Languages. Measures fluency rather than knowledge.
Maths: The Maths score is well correlated with most subjects but is particularly important when predicting Maths, Statistics, ICT, Design & Technology and Economics.
Non-Verbal: Tests 3D visualisation, spatial aptitude, pattern recognition and logical thinking. Important when predicting Maths, Science, Design & Technology, Geography, Art and Drama.
Skills: Tests proof-reading skills (SPG) and perceptual speed and accuracy (e.g. matching symbols under time pressure). Measures the fluency and speed necessary in exams and in the workplace. Relies on a pupil's scanning and skimming skills.

27 Using MidYIS IPRs to Inform Teaching and Learning
The IPR on its own simply tells us about the relative performances of the pupil on the separate sections of the test: where the pupil is strong, where performance has been significantly above or below national averages, or where the pupil has significantly outperformed in one section or another. It is when the IPR is placed in the hands of a teacher who knows that pupil that it becomes a powerful tool. It is what teachers know about individual pupils (what has happened in the past, how they respond to given situations and how they work in the teacher's specific subject) that informs the interpretation of the IPR. If the IPR data from MidYIS and the teacher's personal and subject-specific knowledge of the pupil are brought together, they become a much more powerful instrument for supporting pupils' learning needs.

28 Examples of Individual Pupil Profiles
For each example look at:
the information contained in the graph
the issues that may arise for this pupil in your subject
strategies you could employ to support that pupil (either for the whole class or for that specific individual)

29 Student A: Low Vocab

30 Strategies for Student A
Word banks for each topic
Practise writing with words rather than symbols, e.g. 'To find the common denominator, first of all you ...'
Discussion groups (although ensure pupils with low vocabulary scores do not all congregate)
Wider reading
Visits/trips etc. to enrich language and cultural experience

31 Student B: Low Non-Verbal – NRIT – coached for the test?

32 Strategies for Student B
May struggle to understand diagrams: use spoken and written explanations, paired work or group work to interpret them
Physical/practical/kinaesthetic explanations may help (e.g. modelling the solar system with clay/string or demonstrating distances between planets on a football pitch)
Use drama/active methods to demonstrate difficult concepts

33 Student C: High Vocab

34 Strategies for Student C
Pupil may seem more able than is the case, e.g. 'talks a good talk'
Allow paired work or group discussion to communicate answers orally
Describe maths problems
Encourage leadership roles as well as debates/drama
Support with scaffolding/writing frames etc.

35 Student D: Low Skills
(Slide shows an example pupil data table.)

36 Strategies for Student D
Analysis – a pupil like this may:
struggle to proof-read his work, and therefore achieve a lower grade than he seems capable of
struggle to interpret or understand exam questions
either work slowly with more accuracy OR work quickly with less accuracy; the result is similar, i.e. a lower test score than expected
Strategies:
allow extra time
practise timing, e.g. clock on the IWB
use a range of question words to develop the ability to understand instructions
develop proof-reading technique, e.g. spotting common errors
consider further testing for dyslexia

37 Some Pupil Data – MidYIS 2009
(Table of standardised scores: Vocabulary, Maths, Non-Verbal, Skills, Overall Score and Band for Pupils 01–15.)

38 The class from Waterloo Road
A useful quick reference for staff

39 A Selection of MidYIS Scores for 'Waterloo Road'
(Table: Surname, Sex, Vocabulary, Maths, Non-Verbal, Skills, MidYIS Score, Standardised Score and Band for pupils A–L. These are real anonymised scores from a number of schools around the UK.)
Why would this be a very challenging class to teach? What do I need to know/do to teach this (difficult) class of twelve pupils?

40 IPR Patterns – A Summary
Vocabulary significantly lower than other components: Second language? Deprived area? Difficulty accessing the curriculum? Targeted help does work. Seen in nearly all schools. Worth further diagnosis. Could potentially affect performance in all subjects.
Vocabulary significantly higher than other components: Good communicators. Get on. Put maths problems in words?
Mathematics significantly higher than other scores: From the Far East? Done entrance tests? Primary experience?
Mathematics significantly lower than other scores: Primary experience. Use words and diagrams? Sometimes difficult to change attitude. Difficulties with logical thinking and skills such as sequencing.
Low Mathematics with high Non-Verbal scores: Use diagrams. Confidence building often needed.
Non-Verbal scores different from others (high): Frustration? Behaviour problems? Don't do as well as good communicators or numerate pupils? Good at 3D and 3D-to-2D visualisation and spatial awareness. Good at extracting information from visual images.
Non-Verbal scores different from others (low): Peak at GCSE? A level?
Low Skills scores: Exams a difficulty after good coursework? Suggests slow speed of processing.
High Skills scores: Do well in exams compared with classwork?
The average pupil: They do exist!
High scores throughout: Above a score of 130 puts the pupil in the top 2% nationally.
Low scores throughout: Below a score of 70 puts the pupil in the bottom 2% nationally.

41 Interpreting IPRs Exercises
Have a look at the IPRs on the following pages. These show examples for Yellis (Year 10) and ALIS (Year 12) as well as MidYIS. What do the scores suggest about the students and how would you use this information to aid the teaching and learning process for each of them?

42 1 2

43 3 4

44 5

45 6

46 7 Proof-Reading 88 PSA 108

47 8 Yellis

48 Case Study 1
You are given data relating to an institution where students completed the ALIS computer adaptive test. They are chosen because they show significant differences between the various parts of the test. Remember scores are standardised around 100.
(Table: Overall St. Score, Band, Vocab, Maths, Non-Verbal, Average GCSE and A Level subjects chosen, for students A–L.)
a) Are there any apparent mismatches between the subjects being followed and this data?
b) What support can be given to those students who have weaknesses in Vocabulary or Mathematics?
c) How might predictions made for these students be tempered in the light of the inconsistencies in the test components and missing average GCSE points scores?

49 What are the strengths and weaknesses of this A/AS level student?
Case Study 2
To use the IPR (Individual Pupil Record), familiarise yourself with the terms standard score, band, stanine, percentile and confidence band.
a) Which AS/A level subjects might be avoided?
b) This student chose English, Film Studies, Music Technology and Psychology. Is this a good choice? Do you foresee any problems?

50 INSIGHT Pupil IPR – Comments?

51 Maths

52 Looking Forwards – Introduction to 'Predictions'

53 Theory

54 How CEM ‘Predictions’ are made…
(Diagram: regression line for Subject X relating baseline score to grades A*–E; a pupil's baseline maps through the line to an expected grade.)

55 Some Subjects are More Equal than Others… A-Levels
(Chart: subject regression lines; for the same baseline score, expected grades in different subjects can differ by more than 1 grade, e.g. around D–E.)

56 Some Subjects are More Equal than Others …
Performance varies between subjects, so analysing and predicting each subject individually is essential.
e.g. a student with Average GCSE = 6.0:
Subject choices Maths, Physics, Chemistry, Biology: predicted grades C, C/D, C/D, C/D
Subject choices Sociology, RS, Drama, Media: predicted grades B, B/C, B/C, B/C
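The example above (Average GCSE = 6.0 giving roughly C grades in the sciences but B grades in Sociology-type subjects) comes from fitting a separate regression line per subject. A sketch with invented coefficients chosen only to reproduce that pattern; real CEM coefficients differ and are re-estimated each year:

```python
# Hypothetical per-subject regression lines (slope, intercept) mapping
# average GCSE points to predicted A-level points. The coefficients are
# invented for illustration, not real CEM values.
SUBJECT_LINES = {
    "Physics":   (1.10, -2.6),   # a "severe" subject: lower prediction
    "Sociology": (1.10, -1.6),   # a "lenient" subject: higher prediction
}

# Illustrative points-to-grade scale.
GRADE_POINTS = {7: "A*", 6: "A", 5: "B", 4: "C", 3: "D", 2: "E"}

def predict(subject, avg_gcse):
    """Predicted points and nearest grade for one subject, from its own line."""
    slope, intercept = SUBJECT_LINES[subject]
    points = slope * avg_gcse + intercept
    return points, GRADE_POINTS[round(points)]
```

The same student gets different predicted grades purely because each subject has its own line, which is the slide's point.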

57 Some Subjects are More Equal than Others… GCSE
(Chart: GCSE grade, F–A*, against test score for Art & Design, Biology, Chemistry, Economics, English, French, Geography, German, History, ICT, Mathematics, Media Studies, Music, Physical Education, Physics, Religious Studies, Science (Double) and Spanish; lines can differ by about 1 grade.)

58 Feedback

59 Predictions – MidYIS example
(Spreadsheet screenshot: point predictions such as 3.9, 4.4, 4.8, 5.0 and 6.0, with grade predictions such as B, C/D and D.) Similar spreadsheets are available for Yellis and INSIGHT.

60 Adjusting Predictions in MidYIS / Yellis / INSIGHT
(Spreadsheet screenshot: point predictions, e.g. 5.5–6.9, with an adjustment of 0.5 applied.)

61 Chances Graphs
(Bar chart: Individual Chances Graph for Student no. 5, GCSE English; percent chance of each grade U–A*, with the most likely grade marked. MidYIS Score 82, MidYIS Band D. Prediction/expected grade: 3.8, grade D.)
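A chances graph is just a probability distribution over grades, so the "most likely grade" and cumulative chances can be read off programmatically. A sketch with illustrative percentages (not this student's actual figures):

```python
# Hypothetical chances distribution for one pupil in one GCSE subject:
# percent chance of each grade. Figures are illustrative only.
chances = {"U": 2, "G": 3, "F": 5, "E": 10, "D": 33,
           "C": 23, "B": 15, "A": 7, "A*": 2}

def most_likely_grade(chances):
    """The grade with the highest single probability (the tallest bar)."""
    return max(chances, key=chances.get)

def chance_at_or_above(chances, grade,
                       order=("U", "G", "F", "E", "D", "C", "B", "A", "A*")):
    """Cumulative percent chance of achieving the given grade or better."""
    i = order.index(grade)
    return sum(chances[g] for g in order[i:])
```

Note that even the most likely grade here carries only a 33% chance, so there is a 67% chance of some other grade, which is why a single "predicted grade" understates the spread.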

62 Post-16 : CABT vs Average GCSE
Average GCSE correlates very well with A-level / IB etc., but by itself is not sufficient…
What is a GCSE?
Students without GCSE?
Years out between GCSE and A-level?
Reliability of GCSE?
Prior value-added?

63 The Effect of Prior Value-Added
Beyond expectation: +ve value-added
In line with expectation: 0 value-added
Below expectation: -ve value-added
Average GCSE = 6. Do these 3 students all have the same ability?

64 Rationale for CABT in addition to GCSE
Do students with the same GCSE score from feeder schools with differing value-added have the same ability?
How can you tell if a student has underachieved at GCSE, and thus how can you maximise their potential?
Has a student got very good GCSE scores through the school's effort rather than their ability alone? How will this affect expectation of attainment in the Sixth Form?
Can you add value at every Key Stage?
Baseline testing provides a measure of ability that (to a large extent) is independent of the effect of prior treatment.

65 'Predictions'
(Report extract: predictions based on GCSE alongside predictions based on the baseline test, each showing the probability of achieving each grade and the expected grade.)

66 Which predicted grades are the most appropriate for this student ?

67 Adjusting Predictions in ALIS (Paris Software)
(Screenshot: Step 1 – options to adjust predictions to the 75th percentile or for prior value-added.)

68 Working with ‘Predictions’
(Average performance by similar pupils in previous years)

69 Objectives
To gain understanding of the interpretation of 'predictions', remembering that they are not really PREDICTIONS but part of a 'chances scenario'
Using chances to explore the setting of targets
Discussion of monitoring performance against targets

70 Point and grade ‘predictions’ to GCSE
WHY ARE THE SUBJECT PREDICTIONS DIFFERENT? Concentrate on student 4

71 Prediction/expected grade: 5.1 grade C
(Chances graph; most likely grade marked.) What are the chances a) of getting a grade C or above? b) of not getting a C?

72 Yellis predictive data: baseline score 103 (55%)
Comment?

73 Chances graphs MidYIS and Yellis
Situation: you are a tutor to a Year 10 pupil and you wish to help him/her set target grades. Here is a chances graph based on the pupil's Year 7 MidYIS test (114) and one based on the Year 10 Yellis test (58%).
MidYIS Chances Graph: this graph is based on the pupil's exact MidYIS score, adjusted to include the school's previous value-added performance.
Yellis Chances Graph: this graph is based on one ability band and has no value-added adjustment.

74 a) What do the graphs tell you about this pupil’s GCSE chances in this subject (Maths)?
b) What could account for the differences between the two graphs and are these important?
IMPORTANT FOR STAFF AND STUDENTS TO UNDERSTAND THE DIFFERENCE
Fixed mindset [my intelligence is fixed and tests tell me how clever I am]: "This graph tells me I'm going to get a B, but I thought I was going to get an A. I'm obviously not as clever as I hoped I was, and so the A and A* grades I've got for my work so far can't really be true."
Growth mindset [my intelligence can develop and tests tell me how far I have got]: "This tells me that most people with the same MidYIS score as me achieved a B last year, but I think I have a good chance of an A, and I know that my work has been about that level so far, so I must be doing well. What do I need to do to be one of the 10% who gets an A*?"
How was this information produced? The MidYIS graphs are produced using the predictions spreadsheet. Select the pupil(s) and subject(s) to display or print using the GCSE Pupil Summary 1 tab. Adjustments for value-added can be made for individual subjects on the GCSE Preds tab. The Yellis graphs for all GCSE subjects (showing all four ability bands) can be downloaded from the Secondary+ website.

75 Commentary
From MidYIS: the most likely grade is a B (35%), but remember there is a 65% (100 - 35) chance of getting a different grade, and also a 75% chance of one of the top three grades.
From Yellis: the most likely grade appears to be a C, but remember that the band covers a range of scores rather than the individual student, and this pupil's score (58) is near the top of that range. It has also not been adjusted for this school's prior value-added.
In an interview with the student you have to use your professional judgement about that student, taking everything into account. Certainly the Yellis chart warns against complacency, but if the school has a strong value-added history it is better in this case to rely on the MidYIS chart for negotiating a target. Grade A is a fair aspirational target for the student, but a teacher cannot fairly be held accountable for not achieving this grade with this student; even a very good teacher may only achieve a B or C. Can the aspirational target set for the student be the same as that used for staff accountability purposes? There is a trap here.

76 ALIS
You are the subject teacher and are discussing possible A2 target grades with individual students. You are about to talk to Jonathan, who achieved an average GCSE score of … . This gives a statistical prediction (28.35x … = 77 UCAS points) using the regression formula at A2 for this subject: a grade C at A2. Assume that the computer adaptive baseline test confirms this prediction. Chances graphs for this subject are shown, giving the percentage of students with similar profiles achieving the various grades.
Individual chances graph for Jonathan

77 a) Why are these two chances graphs different?
(b) ‘Most candidates with Jonathan’s GCSE background score achieved a C in my subject last year so Jonathan’s target grade should be a C’. What are the weaknesses of this statement? (c) What other factors should be taken into consideration apart from chances graph data, when determining a target grade?

78 The difference between the chances graphs is that one of them covers a range of GCSE scores whilst the other is linked to Jonathan's individual average GCSE score of … . The strength of the chances graph is that it shows more than a bald prediction. True, most students starting from an average GCSE score like Jonathan's did achieve a C grade at A2 in examinations for this subject. However, the probability of a B grade is also high, since his score was not at the bottom of this range. This might be reinforced if the department also has a history of high prior value-added. The converse is also true, with the probability of a D grade warning against complacency.
Students are not robots who will always fit the statistics, so it is dangerous to make sweeping statements based on one set of results. As well as looking at the prediction you should use the chances graph as a starting point, applying your professional judgement and taking into account factors such as his and the department's previous performance in the subject, his attitude to work, and what he is likely to achieve based on your own experience. You might want to start with the most common outcome, grade C, and use your judgement to decide how far up (or down!) to go. He may be a very committed student, and if the department has achieved high value-added in the past an A/B grade may be more appropriate, though A* looks unlikely. If you are using aspirational targets for psychological reasons with students then A may be appropriate even though it is less probable than B/C.

79 Key Questions for Intelligent Target Setting
What type of valid and reliable predictive data should be used to set the targets? Should students be involved as part of the process (ownership, empowerment etc.)? Should parents be informed of the process and outcome? 79

80 Key points to consider might include:
Where has the data come from? What (reliable and relevant) data should we use?
Enabling colleagues to trust the data: training (staff); communication with parents and students
Challenging, NOT demoralising, students
Storage and retrieval of data
Consistency in understanding what the data means and does not mean
The process of setting targets is crucial…

81 There is wide-ranging practice using CEM data to set student, department and institution targets, and increasingly sophisticated methods are used by schools and colleges. The simplest model is to use the student grade predictions; these then become the targets against which student progress and achievement can be monitored. Theoretically, if these targets were exactly met, residuals would be zero and overall progress would be average: the school/college would be at the 50th percentile.

82 More challenging targets would be those based on history
For example: Where is the school/college now? Where is your subject now? If your subject's value-added history shows that performance is in the upper quartile, it may be sensible to adjust targets. This may have the effect of raising point predictions by between … of a grade. This would be a useful starting point, but it would not be advisable to use the predictions for below-average subjects, which might lead to continuing under-achievement.

83 Yellis Predictions For Modelling
FOUR approaches:
YELLIS GCSE predictions
YELLIS GCSE predictions + say 0.5 of a grade
Prior value-added analysis based on 3-year VA per department
75th percentile (upper quartile) analysis
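The four approaches above can be compared side by side as simple point adjustments to the same base prediction. A sketch with invented uplift values; a real prior-VA or upper-quartile adjustment would be computed from the department's own historical data:

```python
def targets(prediction, dept_3yr_va=0.3, upper_quartile_uplift=0.4):
    """Four candidate targets from one base point prediction.
    dept_3yr_va and upper_quartile_uplift are invented illustrative figures."""
    return {
        "prediction": prediction,                     # raw Yellis prediction
        "prediction_plus_half": prediction + 0.5,     # flat half-grade uplift
        "prior_va": prediction + dept_3yr_va,         # add 3-year department VA
        "75th_percentile": prediction + upper_quartile_uplift,  # upper quartile
    }
```

Laying the four figures out per subject makes it easy to see how much more demanding each modelling choice is.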

84

85

86

87

88 Case Study
Here is the Individual Pupil Record from the ALIS computer adaptive test taken in Year 12 for a current Year 13 student. This student had a high positive value-added in every GCSE subject as measured using MidYIS as a baseline (average GCSE score 7.44). On the next page are her A level predictions and chances graphs. Why are the predictions different? Are the chances graphs useful here?

89 Predictions and chances graphs
Using PARIS software and tweaking the predictions for prior value-added in these subjects, from a GCSE baseline A*s are predicted in three of the four. If we did the same from the adaptive test baseline, solid Bs might be predicted in all three. It is also worth looking at the value-added at GCSE. See commentary.

90 Commentary
The differences in prediction from the GCSE baseline and the computer adaptive test are interesting for some students, and can be in either direction. Here there has been very large value-added at GCSE, which may or may not be sustainable at A level. This student's history is shown below. The value-added at GCSE is between 1 and 2 grades (for all-institution data from Year 7) and significantly positive by subject (for the independent-school data from Year 9). If we measure this student's value-added from an average GCSE score of 7.44 next year, it does not tell the whole story; we need to look at the value-added from the computer adaptive test too. The chances graphs should be used with extreme caution here, and the growth mindset is vital if they are used with students.

91 Case study : setting departmental targets
Uses valid and reliable data, e.g. chances graphs
Involves sharing data with the students
Gives ownership of the learning to the student
Enables a shared responsibility between student, parent(s)/guardian and teacher
Encourages professional judgement
Leads to teachers working smarter, not harder
Leads to students being challenged and not 'over-supported', thus becoming independent learners…

92

93

94 Student no. 1, GCSE Geography. Prediction/expected grade: 5.4, grade B/C. (Chances graph with most likely grade marked.)

95 Student no. 1, GCSE Geography. Prediction/expected grade: 6.2, grade B. (Chances graph with most likely grade marked.)

96 Comments?

97 Monitoring Student Progress
Monitoring students' work against target grades is established practice in schools and colleges, and there are many diverse monitoring systems in place. Simple monitoring systems can be very effective:
Current student achievement compared to the target grade at predetermined regular intervals, to coincide with, for example, internal assessments/examinations
Designated staff having an overview of each student's achievements across subjects
All parents being informed of progress compared to targets
Review of progress between parents and staff
Subject progress monitored by a member of the management team in conjunction with the head of subject/department
A tracking system to show progress over time for subjects and students
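A simple tracking system of the kind described can be sketched in a few lines: at each review point, compare each subject's current working grade with its target. Names, grades and the points scale are invented for illustration:

```python
# Illustrative points scale for GCSE grades.
GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4,
                "E": 3, "F": 2, "G": 1, "U": 0}

def review(target_grades, current_grades):
    """Per-subject status of current working grades against target grades."""
    report = {}
    for subject, target in target_grades.items():
        gap = GRADE_POINTS[current_grades[subject]] - GRADE_POINTS[target]
        report[subject] = ("on/above target" if gap >= 0
                           else f"{-gap} grade(s) below")
    return report
```

Run termly, a table of these statuses gives tutors and heads of department the at-a-glance overview the slide describes.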

98 Monitoring Progress
Schools and departments use various monitoring systems for comparing present progress with either the target grade or, in some cases, the minimum acceptable grade or basic suggested grade. Six examples from schools are shown.
If you were Polly Bolton's Form Teacher, how would you approach a discussion with her parents at a Parents' Evening? Should parents be told the baseline scores?

99

100 Subjects

101 Tracking at departmental level for one student

102 Traditional mark book approach

103 Targets for learning…. reporting to pupils

104

105 Not a label for life ... just another piece of information
The chances graphs show that, from almost any baseline score, students can end up with almost any grade; there are just different probabilities for each grade depending on the baseline score.
In working with students these graphs are more useful than a single predicted or target grade.
Chances graphs show what can be achieved by students of similar ability and by students with lower baseline scores.

106 Student 1

107 Student 2

108 Student 3

109 Student 4 109

110 Student 4 - IPR

111 Performance Monitoring Introduction to Value-Added

112 Theory

113 How CEM ‘Value-Added’ is calculated…
(Diagram: Subject X regression line; residuals measure each point's distance from the line: above the line is +ve VA, below is -ve VA.)
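In outline, a raw residual is the achieved points minus the points predicted from the baseline, and a standardised residual rescales it by the spread of residuals. A sketch; CEM standardises against national spreads rather than a local one, so this is illustrative only:

```python
from statistics import pstdev

def residuals(achieved, predicted):
    """Raw residuals (achieved minus predicted points) and a simple
    standardised version, scaled by the spread of the raw residuals."""
    raw = [a - p for a, p in zip(achieved, predicted)]
    sd = pstdev(raw) or 1.0   # guard against a zero spread
    return raw, [r / sd for r in raw]
```

A positive residual is positive value-added (the point sits above the regression line); a negative one sits below it.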

114 Burning Question: What is my Value-Added Score? Better Question: Is it Important?

115 Key Value Added Charts

116 1) SPC (Statistical Process Control) chart
VA score plotted by year:
Performance above expectation: good practice to share?
Performance in line with expectation
Performance below expectation: problem with teaching and learning?
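The SPC chart's bands can be approximated with a Shewhart-style rule: flag a year whose value-added falls outside the historical mean by more than k standard deviations. A sketch with k = 2 (roughly a 95% band); CEM's exact control limits may be computed differently:

```python
from statistics import mean, pstdev

def spc_flag(va_history, new_va, k=2.0):
    """Classify a new year's VA against control limits built from history.
    va_history is a list of previous years' VA scores (illustrative data)."""
    centre = mean(va_history)
    spread = pstdev(va_history) or 1.0   # guard against a flat history
    if new_va > centre + k * spread:
        return "above expectation"
    if new_va < centre - k * spread:
        return "below expectation"
    return "in line with expectation"
```

Years flagged "above" invite sharing good practice; years flagged "below" invite questions about teaching and learning, exactly as the chart labels suggest.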

117 2) Subject Bar Chart

118 3) Scatter Plot Religious Studies

119 Scatter Plot Example 1: A2 – English Literature
General underachievement?

120 Scatter Plot Example 2 A2 – English Literature Too many U’s ?

121 Other things to look for…
Why did these students do so badly ? Why did this student do so well ? How did they do in their other subjects ?

122 Post-16 : Impact of Baseline Choice on Value-Added

123 Same School – Spot the Difference?
GCSE as baseline vs. test as baseline

124 Does the Type of School make a Difference ?

125 Comparison to all schools
Comparison to Independent Schools Only

126 Comparison to FE Colleges Only
Comparison to all schools

127 Questions
How does the unit of comparison affect the value-added data, and what implications does this have for your understanding of performance? Does this have implications for self-evaluation?

128 Using Value-Added Data

129 Necessary knowledge base to use CEM systems to their potential
1. The forms of value-added data: scatter graphs; raw and standardised residuals; SPC charts; tables of data; use of PARIS for further analyses (e.g. by gender, teaching group)
2. Predictive data: point and grade predictions; importance of chances graphs; availability of different predictive data
3. Baseline data: band profile graphs; IPRs; average GCSE score; computer adaptive tests
4. Attitudinal data

130 If you have the tools you can use them to do these:
Make curriculum changes
Adjust staffing structure and cater for student needs
Self-evaluation procedures, including the analysis of examination results using value-added data
The target-setting process
School and department development plans
Improve your monitoring and reporting procedures
Provide information to governors
Have conversations with feeder primary schools
Etc.

131 Below are the value-added charts from Yellis to GCSE for two contrasting institutions.
Which subjects are outside the confidence limits in a 'negative value-added' way? There must be questions to ask regarding teaching and learning. Which subjects are outside the confidence limits in a 'positive value-added' way?
Any result outside the shaded confidence band is unlikely to be down to chance: at that limit the probability is about 1 in 20, and outside the 99.7% confidence limit the chance is less than 3 in 1,000.
GCSE value-added: a challenging school. GCSE value-added: a successful school.
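The "1 in 20" and "3 in 1,000" figures correspond to results falling outside roughly 2 and 3 standard errors of zero. A normal-approximation sketch (illustrative, not CEM's actual significance calculation):

```python
from math import erf, sqrt

def p_by_chance(mean_residual, residual_sd, n):
    """Two-sided probability that a subject's average residual this far from
    zero arose by chance, via a normal approximation on n pupils."""
    z = abs(mean_residual) / (residual_sd / sqrt(n))
    # 2 * (1 - normal CDF of z): the two-sided tail probability.
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))
```

A z of about 2 gives p near 0.05 (1 in 20) and a z of 3 gives p near 0.003 (3 in 1,000), matching the slide's confidence limits.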

132 Here is a value-added subject report from a recent examination session at a school
(Scatter: GCSE score against MidYIS score.)
Write the equivalent GCSE grades next to the points scores. Compare the value-added performance of candidates scoring A*, B and D grades. Which result would cause you to ask questions?

133 Compare the data for Student A and Student B
Find students A and B on each of the scatter graphs (English and Maths).

134 Scatter graphs: English and Maths
Is there anything to learn from these scatter graphs?

135 Common Scenario…
Jane has completed her sixth-form studies and a review has been received by the college after A level results. Choose one subject at a time and look carefully at what happened in that subject, both from a baseline of average GCSE grades and from a baseline of the computer adaptive test.
This student has been placed in different bands: band B from average GCSE score and band C from the computer adaptive test. This sometimes happens. It may be that the student had an off day when she did the computer adaptive test, or that there was a lot of 'spoon feeding' at GCSE. Jane may do better at coursework! Even though we may not know the cause, it can act as a warning when analysing results, though the predictions are not wildly out.

136 Profile Sheet: Jane (from Average GCSE)
Year: DOB: 01/06/89 (Average GCSE = 6.00, Band B)
Subject | Final Grade | Predicted Grade
(A1) Health & Social Care | D | C
(A2) Religious Studies | C | B/C
(A2) English Literature | C | B/C
(A2) Drama & Theatre Studies | B | B/C

137 Chances Graphs - Band B from average GCSE

138 Individual Chances Graphs for Jane from average GCSE score

139 Profile Sheet: Jane (from Computer adaptive test)
Year: DOB: 01/06/89 (Online adaptive test = 0.11, Band C)
Subject | Final Grade | Predicted Grade
(A1) Health & Social Care | D | D/E
(A2) Religious Studies | C | C
(A2) English Literature | C | C/D
(A2) Drama & Theatre Studies | B | B/C

140 a) Did Jane reach her potential in all subjects?
b) Jane had been set aspirational targets prior to AS and A level by her teachers, as below:
Health and Social Care: C
Drama: B
Religious Studies: B
English: B
Were these reasonable target grades?
c) Why should these grades not be used for the accountability of her teachers?

141 Commentary
a) Jane certainly reached her potential in Drama and Theatre Studies, with positive standardised residuals by both methods. On the basis of the computer adaptive test she broadly reached her potential in all subjects. On the basis of average GCSE, two A-level subjects were broadly down about half a grade, and she dropped a grade in the AS.
b) Hopefully you agree these were reasonable target grades. Remember, we don't know the student, but use the chances graphs.
c) If these were aspirational grades for the student, then holding a department's staff accountable on that basis is not appropriate; it certainly is appropriate, however, on the basis of a whole class's average standardised residuals, particularly over a number of years.

142 YELLIS ATTITUDINAL
You are looking at the attitudinal feedback from Year 10 over time:

143 Do you notice a pattern between the four charts? There was an initiative in the school that contributed, but was it sustainable?

144 Yellis Further Comparison charts for English and Maths

145 What concerns you about these charts? Can you suggest which is the stronger department?

146 Departmental analysis (based on average GCSE score baseline)
SUBJECT A (A LEVEL). There may be lots of reasons for changes in performance, but here one factor is known by the school: the subject teacher for Subject A went on long-term sick leave for one of the autumn terms. When do you think this happened? To help you, the AS chart for this same subject is shown below. It may or may not be relevant.
[Charts: Final_Result Average Standardised Residual (-2.0 to 2.0) against Exam Year (2003 to 2011), for A level and AS level.]

147 This school has a 5-year development plan which includes as one of its goals:
"To help students prepare for university and the world of work by developing independent learning skills, the ability to reflect and to learn from others, and to maximise the benefits to learning offered by emerging technologies."
The graphs on the next page reflect students' perceptions of the style of learning adopted in their A-level classes in two broadly similar subjects.
a) If the students' perceptions are an accurate reflection of what takes place in the classroom, which subject appears more aligned with the school's development plan?
b) How would these perceptions inform the Senior Management Team's evaluation of progress with its 5-year plan if the subject achieving significantly better value-added results was i) Subject 1? ii) Subject 2?

148 Subject 2 Subject 1

149 Case study A: ALIS value-added data
Many sets of VAD are available! From the average GCSE baseline:
all ALIS cohort
type of institution
syllabus
Also the same sets from the baseline test.

150 SPC Chart with confidence limits: WHOLE SCHOOL
Institution / All ALIS Cohort / Syllabus

151 Using PARIS software: Baseline Test
Whole School. From your perspective, which set of VAD would you use for the different user groups? (Governors, HoDs, Media, Parents, SMT/SLT...)

152 USE ONE YEAR’S DATA WITH CAUTION!
Better to use three years' data, as patterns over time are more significant.
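The point about three years' data can be made concrete. In this sketch (hypothetical per-year means and cohort sizes, not CEM output), pooling three cohorts both smooths year-to-year noise and narrows the chance band, since the band scales with 1/sqrt(n):

```python
import math

# Hypothetical mean standardised residuals and cohort sizes for one
# subject over three exam years: year -> (mean residual, cohort size).
years = {2019: (0.30, 18), 2020: (0.45, 22), 2021: (0.25, 20)}

# Pool the three cohorts, weighting each year's mean by its size.
total_n = sum(n for _, n in years.values())
pooled_mean = sum(mean * n for mean, n in years.values()) / total_n

# The 95% chance band around zero narrows as the pooled cohort grows.
single_year_band = 1.96 / math.sqrt(20)
pooled_band = 1.96 / math.sqrt(total_n)
```

A pooled mean that sits outside the (narrower) pooled band is far stronger evidence of a real departmental effect than one year's result alone.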

153 Using data to inform leadership decisions
Some key questions:
Which data do I need, and which data do I not need? (e.g. MidYIS cohort or Independent Sector)
What does the data mean, and what does it not mean? (e.g. staff INSET and support)
Who is the data for?
Storage, retrieval and use of data (e.g. self-evaluation and preparing for inspection)

154 The use of this data should help us do our best to ensure every student at least achieves, if not exceeds, their potential. It may challenge:
The culture of 'my' school/college
Accountability policy
Expectations
Staff training in the use of data, and the ability to cope with data (data overload)
Integrating the data into school procedures: storage, retrieval, distribution and access
Roles and responsibilities

155 Who should data be shared with?

Colleagues:
Subject Teachers
Heads of Department
Pastoral Staff
Managers

157 Subject Teachers/HODs
Typical objections you may hear:
"This will be interpreted as a personalised prediction."
"The data doesn't work for this particular student."
"You're raising false expectations; he'll never get that result."
"You're making us accountable for guaranteeing particular grades; when the pupils don't get them, we'll get sacked and the school will get sued."

158 Subject Teachers/HODs
Remind them that:
Baseline data can give useful information about a pupil's strengths and weaknesses, which can assist teaching and learning
"Predictions" are not a substitute for their professional judgement
Reassure them that:
It is not a "witch hunt"
Value-added data is used to assess pupil performance, not teacher performance!

159 Pupils
Make sure they know why they are taking the test
Make sure they take it seriously
Make sure they don't deliberately mess it up in order to lower their BSGs!
Be prepared to look for clear anomalies and re-test if necessary
Explain the chances graphs to them clearly

160 Parents
Make sure they know why the pupils are taking the test
Explain the results to them
Explain, as often as needed, that the chances graphs and BSGs do NOT give personalised predictions
Ensure that they receive good-quality feedback from staff when ambers or reds are awarded
Encourage them to ask lots of questions

161 YOUR QUESTIONS

Thank You
Robert Clark (robert.clark@cem.dur.ac.uk)
Neil Defty (neil.defty@cem.dur.ac.uk)

