1
Improving Assessment, Improving Learning
Ken Greer, Executive Director (Education), Fife Council
CEM Conference, Glasgow, 24th March 2011
Implementing Building the Curriculum 5
2
How the world's best-performing school systems come out on top
"All of the top-performing and rapidly improving systems have curriculum standards which set clear and high expectations for what students should achieve.
"High performance requires every child to succeed.
"The only way to improve outcomes is to improve instruction.
"All of the top-performing systems also recognise that they cannot improve what they do not measure."
– The McKinsey Report, September 2007
3
Assessment: It's about asking questions
1. What's assessment for?
2. What system are we working with?
3. How are we doing it?
4. How do we make sure we are all talking about the same standard?
5. Which unintended consequences do we want to avoid?
6. How do we put all this together and make it work in Fife? (the unashamedly chauvinistic one)
4
1. What’s assessment for? – to support learning; – to give assurance to parents and others about learners’ progress; and – to provide a summary of what learners have achieved, including through qualifications and awards, and to inform future improvements. - Building the Curriculum 5
5
1. What’s assessment for? (2) The Assessment Reform Group (ARG) The use of assessment : to help build pupils’ understanding, within day-to-day lessons to provide information on pupils’ achievements to those on the outside of the pupil-teacher relationship: to parents (on the basis of in-class judgements by teachers and test and examination results) and to further and higher education institutions and employers (through test and examination results) data to hold individuals and institutions to account, including through the publication of results which encourage outsiders to make judgments on the quality of those being held to account.
6
2. What system are we working with?
– Building the Curriculum 3 (June 2008): Curriculum for Excellence Es and Os
– CfE BtC 5, A framework for assessment: recognising achievement, profiling and reporting (December 2010)
– CfE BtC 5, A framework for assessment: understanding, applying and sharing standards in assessment for CfE: quality assurance and moderation (October 2010)
– The NAR: http://www.ltscotland.org.uk/nationalassessmentresource/
– 51,000+ teachers (4,000 in Fife)
7
Arrangements for assessment
Qualifications
Self-evaluation and accountability
Professional development
… to support the purposes of learning
8
3. How are we doing it?
– CfE aspirations delivered (SLCIRCEC)
– CPD; support; challenge; reporting
– Knowing the limitations of various approaches to assessment
– Measuring what we value
– Working together to moderate/define standards, led by expert practitioners
– Monitoring progress, monitoring value-added
– Motivating: defining the bar
– Analysing; benchmarking; supporting
– Giving account and holding to account
– Finding a manageable way: economy, efficiency, effectiveness
– Milestones, not millstones
9
4. How do we make sure we are all talking about the same standard?
– Trust/professionalism
– The primacy of individual teachers' judgements is at the heart of the assessment system in Scotland, supported by moderation at local authority level and across authorities
– A National Assessment Resource (NAR) to support teachers as they come to judgements about learners' progress
– Outcomes and experiences, but not performance criteria
– Prior performance/other information?
– The car with no speedometer, no odometer and no petrol gauge
10
A view from another country
"Progress also relies on the need to retain clear accountability through testing. This means at the end of primary school just as much as at the end of secondary."
– Gordon Brown, quoted in TES, 30/10/2009
"In the less successful secondary schools, the limited use of assessment data on pupils on transfer to Year 7 led to insufficiently challenging targets for some pupils.
"In raising the attainment of learners in literacy who are most at risk of not gaining the skills they need for successful lives, the factors identified from visits on this survey included sharp assessment of progress in order to determine the most appropriate programme or support."
– Removing Barriers to Literacy, Ofsted 2011
11
5. Which unintended consequences do we want to avoid?
What we want to avoid:
– assessment which does not support learning, directly or indirectly
– de-motivation of any learner
– self-fulfilling prophecies
– 'high stakes' testing
– league tables
– false comparisons
What we want to promote:
– improvements in learning, teaching and performance (SLCIRCEC)
12
6.What’s are we doing in Fife? Fife’s performance culture Appropriate information Strategy for improvement Collegiate approach Better understanding
13
The Fife way
Importance of the …
– right strategy
– right culture
– right information
– right interpretation
– right results, i.e. positive impact on performance
14
Themes
– Concentration on impact: relentless focus on outcomes
– Culture: need to develop a strong performance culture
– Context: need to understand underlying performance issues in an appropriate context
– Clarity: need to use appropriate information to identify performance issues
15
Context: What does national data tell us?
[Chart: national attainment data by social context, from most deprived to most affluent]
16
Challenging a deterministic view
"…the PISA scores of the top-performing countries show a low correlation between outcomes and the home background of the individual student."
– The McKinsey Report, September 2007
However, in Fife (Scotland?), social context has a strong relationship with attainment and other educational outcomes, including destinations.
Educational outcomes vary with social context across the social spectrum.
We need to understand this in order to address it.
17
Raising attainment … for all
18
Benchmarking
Example: comparator authorities
– Authority 1 is a comparator authority for Authority 2 – rated as "very close" by HMIE
– In 2010, 9.4% of Authority 1 secondary pupils were FME, compared to 9.7% of Authority 2's secondary pupils
– 2.2% of children in Authority 1 live in the SIMD 15% most deprived areas in Scotland, compared with 3.3% in Authority 2
19
Example: comparator authorities
In 2010, 38% of Authority 1 pupils achieved 5+ at level 5 by the end of S4, compared with 65% in Authority 2. What accounts for this difference?
20
Raising attainment … for all
– Performance management needs to account for the impact of social context and other relevant factors
– But … current school performance measures focus attention on the most deprived (e.g. FME, SIMD 15%)
– Need the best data to fit the issue, not the best fit to the available data
21
Some of the best data …
– CEM assessments from the University of Durham (PIPS, INCAS, MidYIS, SOSCA)
– Standardised to a national level of performance and comparable across stages (a sketch of the idea follows below)
– Provides a coherent view of performance at local authority, establishment, curriculum area and class levels
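"Standardised and comparable across stages" means, in practice, that raw scores from different tests are mapped onto a common scale against a national norm group. The sketch below illustrates that general idea only, using made-up norm-group figures and a conventional mean-100, SD-15 scale; it is not CEM's actual standardisation procedure.

```python
def standardise(raw_scores, norm_mean, norm_sd, scale_mean=100, scale_sd=15):
    """Map raw test scores onto a common scale (mean 100, SD 15) using the
    mean and standard deviation of a national norm group.
    Illustrative only: not the actual CEM standardisation method."""
    return [scale_mean + scale_sd * (x - norm_mean) / norm_sd for x in raw_scores]

# Hypothetical norm-group statistics for two different stages/tests.
p7_norms = {"norm_mean": 42.0, "norm_sd": 8.5}    # e.g. a P7 assessment
s2_norms = {"norm_mean": 55.0, "norm_sd": 11.0}   # e.g. an S2 assessment

# Raw scores of 50 (P7) and 60 (S2) are not directly comparable,
# but the standardised scores sit on the same national scale.
print(standardise([50], **p7_norms))  # ~[114.1]: about 1 SD above the national mean
print(standardise([60], **s2_norms))  # ~[106.8]: about 0.5 SD above the national mean
```

On a scale like this, a pupil's position relative to the national cohort can be compared from stage to stage, which is what makes the tracking on the next slide possible.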
22
CEM assessment data
PIPS P1 → PIPS P3 → PIPS P5 → PIPS P7 → SOSCA S2 → SQA S4, S5, S6
– Tracking at individual level (see the sketch below)
– Performance information at school, curriculum area and class level
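In data terms, this tracking amounts to joining successive assessment records on a pupil identifier and then aggregating upwards. A minimal sketch, assuming invented column names (pupil_id, school, and one score column per assessment) and using pandas purely for illustration; it is not the actual Fife or CEM data model.

```python
import pandas as pd

# Hypothetical extracts from successive assessments, keyed by a pupil identifier.
pips_p7 = pd.DataFrame({"pupil_id": [1, 2, 3, 4],
                        "school": ["A", "A", "B", "B"],
                        "pips_p7": [98, 110, 104, 91]})
sosca_s2 = pd.DataFrame({"pupil_id": [1, 2, 3, 4], "sosca_s2": [101, 115, 99, 95]})
sqa_s4 = pd.DataFrame({"pupil_id": [1, 2, 3, 4], "sqa_points": [160, 240, 150, 130]})

# Individual-level tracking: one row per pupil across stages.
tracking = pips_p7.merge(sosca_s2, on="pupil_id").merge(sqa_s4, on="pupil_id")

# School-level performance information: the same data aggregated upwards.
by_school = tracking.groupby("school")[["pips_p7", "sosca_s2", "sqa_points"]].mean()

print(tracking)
print(by_school)
```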
23
Looking across a cohort: Fife
[Chart: performance across the cohort by SIMD decile, from most deprived to least deprived]
24
The same cohort at P7: Fife
[Chart: the same cohort's P7 performance by SIMD decile, from most deprived to least deprived]
25
Continuous improvement … "Value added" measures
At Fife level there is a strong correlation between performance in PIPS (at stage P7) and SQA (by S4) when viewed by social context. This is related to outcomes …
[Chart: PIPS P7 performance against SQA attainment by S4, plotted from more deprived to more affluent]
26
Continuous improvement … "Value added" measures
The red lines separate levels of attainment most common amongst those who enter:
– HE
– FE, employment
– unemployment
[Chart: as the previous slide, with attainment thresholds marked, from more deprived to more affluent]
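A common way to turn this kind of paired baseline/outcome data into a "value added" measure is to regress the later outcome (e.g. SQA attainment by S4) on the earlier baseline (e.g. the P7 score) and treat each pupil's residual as value added. The sketch below assumes that standard approach with made-up numbers; it is not the specific model used by Fife or CEM.

```python
import numpy as np

def value_added(baseline, outcome):
    """Fit a simple linear regression of outcome on baseline and return each
    pupil's residual (actual minus predicted) as a value-added score.
    Positive values: did better than predicted from the baseline alone."""
    baseline = np.asarray(baseline, dtype=float)
    outcome = np.asarray(outcome, dtype=float)
    slope, intercept = np.polyfit(baseline, outcome, 1)
    return outcome - (intercept + slope * baseline)

# Hypothetical data: P7 standardised scores and SQA tariff points by S4.
p7_scores = [85, 92, 100, 108, 115, 123]
sqa_points = [120, 150, 170, 210, 230, 280]

for p7, va in zip(p7_scores, value_added(p7_scores, sqa_points)):
    print(f"P7 score {p7}: value added {va:+.1f}")
```

Residuals of this kind can then be averaged by school, by SIMD decile or by subject, which is what the concluding slides mean by "added value" measures across the social spectrum, within a given cohort and by subject area.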
27
Conclusion: outcomes and social context
– There is substantial evidence that educational outcomes vary across the social spectrum
– Current approaches to measuring school performance do not adequately reflect this relationship
28
Conclusion: local data sources
– Local sources of information (e.g. CEM data) can give valuable additional insight
– This can help to understand year-on-year variations in performance
– This can provide "added value" measures:
  – across the social spectrum
  – within a given cohort
  – by subject area
29
Conclusion: national data sources
– There is a lack of relevant data across the social spectrum; e.g. there is no national data available on the SIMD profile of each education authority (EA) or school
– Relevant national data are vital for real understanding and improvement of:
  – EA and school performance
  – outcomes for young people
30
Conclusions
– Strong performance management can make a difference
– It requires the development of a performance culture
– It requires engagement by managers and leaders at all levels
– It needs to be based on the intelligent use of appropriate evidence