NAPLAN Workshop Assessment for Better Learning using NAPLAN Data Presented by Peter Congdon, Principal Consultant – Kmetrics On behalf of the VCAA.



Workshop structure – Main themes
– How curriculum leaders and classroom teachers can use their school-level data to analyse the impact of their school's learning programs.
– How classroom teachers can use the responses to questions on the NAPLAN 2014 tests as a diagnostic tool to inform future teaching.

Workshop Content
– NAPLAN Data Service reports and functions
– Methods of using the data and results for monitoring and improvement purposes
– Working with data: interpreting data, describing data and developing an informed response to the data.

Context – Professional practice
Using assessment data effectively has become embedded in teaching expectations and school improvement processes.
Use of data: National Professional Standards for Teachers – Australian Institute for Teaching and School Leadership.
Standard 5 – Assess, provide feedback and report on student learning.

Context – To use student data to improve teaching practice, teachers need to be able to do the following:
– Find the relevant pieces of data in the data system or display available to them (data location)
– Understand what the data signify (data comprehension)
– Figure out what the data mean (data interpretation) – substantive and contextual
– Select an instructional approach that addresses the situation identified through the data (instructional decision making)
– Frame instructionally relevant questions that can be addressed by the data in the system (question posing)
Source: Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports. U.S. Department of Education, Office of Planning, Evaluation and Policy Development.

Helpdesk phone
AIM results (2007 and earlier) are no longer available.

Reference documents – all available within the NAPLAN Data Service to support use of the results:
– Assessment materials
– Test performance & content summary guides
– Reporting guides
– Descriptive exemplars of marking guides
– Analysis strategies
– Online tutorial assistance for reports can be accessed at

Box and Whisker Charts
Reference groups: National, State. Focus group: School.

Normal Distribution (chart)
Percentile markers shown: 10th, 25th, 50th, 75th, 90th; the vertical axis shows the number of students.

Skewed Distribution (chart)
Percentile markers shown: 10th, 25th, 50th, 75th, 90th; the vertical axis shows the number of students.
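The five points on each box-and-whisker can be computed directly from a cohort's scaled scores. A minimal sketch in Python, using invented scores (the Data Service computes these points for you; the percentile method is an illustrative choice):

```python
import statistics

def box_whisker_points(scores):
    """Return the 10th, 25th, 50th, 75th and 90th percentile points
    used on a NAPLAN-style box-and-whisker chart."""
    # n=20 gives cut points at every 5%; pick out the five we need.
    q = statistics.quantiles(scores, n=20, method="inclusive")
    return {"p10": q[1], "p25": q[4], "p50": q[9], "p75": q[14], "p90": q[17]}

# Illustrative only: 20 made-up scaled scores for one school cohort.
cohort = [412, 437, 455, 468, 480, 490, 495, 500, 507, 515,
          520, 528, 533, 544, 555, 570, 585, 605, 621, 648]
points = box_whisker_points(cohort)
```

For a roughly normal cohort the box (25th–75th) sits symmetrically around the median; a skewed cohort shows up as unequal whisker lengths.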

NAPLAN Reporting Bands

Example NAPLAN Summary – Year 7
What are the main features of these results?
– Strongest in Writing and Spelling
– Lower students not as low as the State's low students
– Higher students not as high as the State's higher students in Reading, Grammar & Punctuation, and Numeracy
Is this a reflection of the school's teaching program, and/or a feature of this cohort? How much of these differences are due to imprecision?

NAPLAN Year 7 Summary

Year 7 results
Usually, Year 7 students have only been at your school for a few months prior to testing, so results can reflect feeder school programs. Consider grouping students by main feeder schools and sharing results across your network.

School Summary Exercise – 5 mins
Review your School summary report(s) and address the following:
– Strongest in;
– Lower students compared to the State's low students;
– Higher students compared to the State's higher students;
– Major influences on results;
– Strategies to consider.

Trend Data
Find evidence of the impact of change over five years:
– Shows the range of student achievement levels (Box and Whiskers)
– Plots the mean student achievement level

Main sources of variability:
– Different students: work ethic, behaviour, home support
– Measurement imprecision: test properties, equating, group size
– School: leadership, resources, teacher effectiveness, program content alignment

Five Year Trend Exercise – 5 mins
Review your Five Year Trend report(s). Choose one domain (Reading, Writing, Spelling, Grammar & Punctuation or Numeracy) and address the following:
– Compare performance relative to the State group:
  High = Top 25% v State Top 25%
  Medium = Middle 50% v State Middle 50%
  Low = Bottom 25% v State Bottom 25%
– Identify influences on results: cohort, school, teacher, programs.

Group Summary Report
How do our groups stack up against the State groups? What does this tell us about the cohort, school, teachers and programs?

Group Summary Exercise – 5 minutes
Review your Group summary report(s) and address the following:
– Compare the performance of each group relative to the matching State group in one domain:
  High = Top 25% v State Top 25%
  Medium = Middle 50% v State Middle 50%
  Low = Bottom 25% v State Bottom 25%
  Girls v State Girls; Boys v State Boys; LBOTE v State LBOTE; ATSI v State ATSI
– Identify influences on results: cohort, school, teacher, programs.

Assessment Area Report
Shows the percentage of items answered correctly in short answer questions, and the number of items.
Raw score average, State: 36 items at 60% correct = 36 × 0.60 = 21.6
Raw score average, School: 36 items at 52% correct = 36 × 0.52 ≈ 18.7
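The raw-score arithmetic above generalises to any assessment area. A quick sketch using the figures from the slide (the one-decimal rounding is the only assumption):

```python
NUM_ITEMS = 36  # short-answer items in this assessment area (from the slide)

def raw_score_average(num_items, pct_correct):
    """Convert an average percent-correct into an average raw score."""
    return round(num_items * pct_correct, 1)

state_avg = raw_score_average(NUM_ITEMS, 0.60)    # State: 60% correct
school_avg = raw_score_average(NUM_ITEMS, 0.52)   # School: 52% correct
gap_in_items = round(state_avg - school_avg, 1)   # difference, in items
```

Expressing the gap in items (here about three questions) is often easier to discuss with staff than a percentage difference.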

Assessment Area Exercise – 2 minutes
– Identify whether any dimensions have been flagged as significantly different from the State.
– Calculate or estimate the raw score difference between your students and the State on one or more dimensions.

Writing Criteria Report

Writing Criteria Exercise – 2 mins
Compare modal scores (modal score = most common score) for School and State. Which criterion are you relatively strongest on?
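A modal score is simply the most common value in a list of criterion scores. A tiny sketch with invented scores for one criterion (the Writing Criteria Report shows the School and State modal scores for you):

```python
from statistics import mode

# Invented scores on one Writing criterion (scored 0-6) for a class.
school_scores = [3, 4, 4, 4, 5, 3, 4, 2, 4, 3]
school_modal = mode(school_scores)   # most common score in the class

state_modal = 3  # illustrative State modal score for the same criterion
relative_strength = school_modal > state_modal
```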

Item Analysis Report
– Finding skills of relative strength or weakness – graphical format
– Understanding student weaknesses – table format
How classroom teachers can use the responses to questions on the NAPLAN 2014 tests as a diagnostic tool to inform future teaching.

Item Analysis Report
Test and group details, with links to test details and to the test items.

Item Analysis Report – Graph
Finding skills of relative strength or weakness: the chart legend marks items that were harder than the State, and items that were easier than the State.
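The harder/easier comparison in the graph can be reproduced from per-item percent-correct figures. A sketch with made-up numbers; the 10-point flagging threshold is an illustrative choice, not the Data Service's own rule:

```python
def flag_items(school_pct, state_pct, threshold=10.0):
    """Flag items markedly harder or easier for the school than the State.
    Both arguments are lists of percent-correct values, one per item."""
    flags = []
    for item, (ours, theirs) in enumerate(zip(school_pct, state_pct), start=1):
        diff = ours - theirs   # positive = easier for our students
        if diff <= -threshold:
            flags.append((item, "harder than State"))
        elif diff >= threshold:
            flags.append((item, "easier than State"))
    return flags

# Made-up percent-correct figures for five items.
school = [72, 58, 35, 81, 60]
state = [70, 71, 52, 66, 59]
flags = flag_items(school, state)
```

Flagged items are the starting point for the follow-up questions on the next slides: was the difficulty about format, language, concept, or opportunity to learn?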

Item Analysis – Understanding student weaknesses

Item Analysis – Understanding student weaknesses
Example item: "50 is half of 100; 22 is half of ___?"
Half of the group could do it (option B); the remainder had problems. Were they related to format, language, concept, process, knowledge, skill, ...?

One quarter of the group could do it (option A); the remainder had problems. Were they related to format, language, concept, process, knowledge, skill, opportunity to learn, ...?

Student Responses - Individuals

Item level Diagnostics
By comparing your students' success at the item level to that of all other students in the state, you can:
– Look for relative differences as the test progresses. Did our students answer all the items? Were they consistently above, below or similar to the rest of the state?
– Ask what any differences represent, and put them into context. Is language or vocabulary an issue? Is test taking an issue – format, motivation, terminology?
– Check whether there are items that represent areas of the curriculum not yet introduced. How does this compare to the rest of the state?
– Check whether there are areas taught, but not as well as expected. Consider what is happening at other Year levels – curriculum mapping.
– Find the relative strengths – how can you learn from them?

Item Analysis Exercise – 2 minutes
Identify items indicating relative strengths and weaknesses.
Follow up (for classroom teachers):
– Understand and capitalise on strengths
– Investigate relative weaknesses and develop a plan in response
– Discuss and share with colleagues

Zone of Proximal Development
Student response table (√ = correct response; 30 items, all in the same dimension; students ordered by raw score):
ANDERSEN, HANAPHI    ######√#####################√√   3
BIDDELL, RILEY       ######√###########√√###√##√###   5
BRENTON, HAMISH      -#--########-#-√###-##--#√√√√#   5
BEATTY, BENJAMIN     #-##√#####√#-#--#######-#-√√√√   6
BLAZEY, MAXWELL      ##########√√#############√√#√√   6
BELL, MICHAEL        -###########√######√√√√##√-√√√   9
BRADSHAW, REBECCA    ###################√√√√√#√√√#√   9
BUCKLER, ROCKY       #######√#########√√#√###√√√√√√   10
ANDERSON, EMILY      ###########√√√#√#####√#√√√√√#√   11
ANGUERRE, CHARLES    ########√##√#######√##√√√√√√√√   11
AL MALIKI, MITCHELL  ####√√####√#√###√#√#√##√#√√√√√   13
CAITHNESS, JOSHUA    -#####-√√√-√#---√√##√√-√√√-√#√   13
BENATSKY, GEORGIA    ########√√√√√#√##√#√##√√√√√√√√   16
BOTHAM, NATHAN       #####√√#√√#√√#√#√√√√√√√√√√√√√√   21
BALLA, ZANE          #####√√√√√√#√√√√#√√√√√√√√√√#√√   22
BEAUPEURT, MARK      #√√√√√√√√√#√√√√√√√√√√√√√√√√√#√   27
BEAVIS, JORDAN       √√√√√√√√√√√√√√√√√√√√√√√√√√√√√√   30
Zone of Proximal Development: what students are capable of learning with guidance and support from teachers and peers – between what students can already do independently and their level of potential after further steps have been made. Vygotsky and other educational professionals believed education's role was to give children experiences that were within their zones of proximal development, thereby encouraging and advancing their individual learning.
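A chart like the one above can be built by ordering students by raw score and items by how many students answered them correctly (a Guttman-style arrangement); a student's zone of proximal development then sits around the items just beyond their run of correct answers. A minimal sketch with an invented three-student matrix:

```python
def guttman_order(responses):
    """Order students (lowest raw score first) and items (easiest first)
    for a ZPD-style response chart. responses maps name -> list of 1/0."""
    n_items = len(next(iter(responses.values())))
    n_students = len(responses)
    # Facility = proportion of students answering each item correctly.
    facility = [sum(r[i] for r in responses.values()) / n_students
                for i in range(n_items)]
    item_order = sorted(range(n_items), key=lambda i: -facility[i])
    student_order = sorted(responses, key=lambda name: sum(responses[name]))
    return student_order, item_order

# Invented 5-item response matrix (1 = correct).
data = {
    "RILEY":  [1, 1, 0, 0, 0],
    "ZANE":   [1, 1, 1, 1, 0],
    "HAMISH": [1, 0, 1, 0, 0],
}
students, items = guttman_order(data)
```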

Substantive descriptions of achievement levels
Using the NAPLAN items to identify skills, knowledge, procedures...
National bands – parent report skill descriptors.
(Same student response table as on the previous slide.)

Describing the Zone of Proximal Development – for most raw scores of 5-6
Items arranged easier to harder; example spelling items and student attempts: drafting, blizzard, tertels, Seriusly, orkwardly.

Relative Growth
How is relative growth defined? Each student's level of relative growth is determined by comparing their current year NAPLAN result to the results of the group of all 'similar' Victorian students. 'Similar' students are defined as those that had the same NAPLAN score two years ago.
Compared to these similar students, if a student's current NAPLAN score is in the:
– highest 25%, their growth level is categorised as 'High' (Green);
– middle 50%, their growth level is categorised as 'Medium' (Yellow);
– lowest 25%, their growth level is categorised as 'Low' (Red).
Note that the percentages within each category will vary from school to school.
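The definition above amounts to a percentile comparison against the 'similar students' group. A minimal sketch, with an invented set of current-year scores for 20 similar students (the real comparison group is all similar Victorian students, and the Data Service's tie-handling may differ):

```python
def relative_growth(student_score, similar_scores):
    """Classify growth as High / Medium / Low by where the student's
    current score sits among all 'similar' students (those with the same
    score two years ago), following the 25/50/25 split on the slide."""
    below = sum(1 for s in similar_scores if s < student_score)
    percentile = 100.0 * below / len(similar_scores)
    if percentile >= 75:
        return "High"    # highest 25% (Green)
    if percentile >= 25:
        return "Medium"  # middle 50% (Yellow)
    return "Low"         # lowest 25% (Red)

# Invented current-year scores of 20 'similar' students.
similar = [488, 492, 495, 498, 500, 502, 505, 507, 510, 512,
           514, 516, 519, 522, 525, 528, 532, 537, 544, 556]
```

Note that a 'Low' growth student may still have a high absolute score; the categories describe progress relative to like-for-like peers, not achievement.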

Relative Growth

Relative Growth Exercise – 2 minutes
– Identify domains with low relative growth greater than 25%, and/or domains with high relative growth greater than 25%.
– Was the relative growth even across the starting Bands?
Follow up (co-ordinators and classroom teachers): review the implementation and effectiveness of differentiation.

Working with NAPLAN Data
Principal: analyse Summary and Trend results; conduct program evaluations; facilitate staff access to work with results.
Co-ordinator: map the relatively high and low performances against the delivery of the curriculum; look for class/group differences; work with colleagues.
Classroom teacher: diagnose misconceptions, weaknesses and strengths; develop teaching plans in response.

Reporting Back
Summative position – overall location of the students and subgroups, and the shape of the distributions:
– Are students being left behind? How spread out are they? Is there too much focus on the low achieving students – top students held back?
– How would you describe the location of your students against the state?
– How spread out are your students in the different dimensions?
Trends:
– How have your results changed over time? Consider both the location and the shape of the distribution.
– Are there identifiable factors that may be contributing? Student level effects: motivation, engagement, home. Teacher level effects: method, style, experience, support, workload. School level effects: leadership, resources, programs.
Growth:
– Are students maintaining their position relative to the state? Is the top growing as fast as the bottom relative to the state? Is growth even across the dimensions?
Curriculum Mapping – item level diagnostics:
– Can you find relative strengths and weaknesses?
– What can you and the school adjust based on these findings? Link to curriculum scope and sequence; link to programs and pedagogy.

Thank you for attending
Please use the rest of this time to go over your results, and clarify your interpretations.
Peter Congdon, Principal Consultant
Mobile:
web:
PowerPoint presentation available at
Further help is available by contacting me directly.