Wynne Harlen, Susana Borda Carulla
Fibonacci European Training Session 5, March 21st to 23rd, 2012

Agenda
- What is the IBSE Diagnostic Tool for CPD providers?
- Why an IBSE Diagnostic Tool?
- The content of the tool
- How and when to use it
- Taking action: how to help?

What is the IBSE Diagnostic Tool for CPD Providers?
- A list of criteria for judging the implementation of IBSE
- Applied through observation and analysis of classroom practices
- Covers specific interactions that indicate IBSE
- Does NOT cover all aspects of good practice, only those SPECIFIC to IBSE
- Intended for K-9 (children ages 5-13)

Why an IBSE Diagnostic Tool?
- To DIAGNOSE CPD requirements, NOT to score teachers
- To identify strengths and weaknesses in IBSE implementation
- To provide feedback to CPD providers concerning teachers’ training requirements:
  - when designing a CPD programme
  - during or after the implementation of a CPD programme (pre-test / post-test)
- To help DEFINE what we mean by IBSE in terms of teaching and learning practices

The content of the tool: instructions on planning and coordinating an evaluation
- Deciding on the objective of the evaluation
- Selecting and training the observers
- Planning the class visits
- Gathering the data
- Analyzing the data and taking action

The content of the tool: forms for data collection
LINES:
- Interview with the teacher: the observer, the session, the class, the teacher, the topic and objective of the session
- Section A, Teacher-Pupil Interactions: building on pupils’ ideas; supporting pupils’ own investigations; guiding analysis and conclusions
- Section B, Pupil Activities: carrying out investigations; working with others
- Section C, Pupils’ Records: any records pupils make of their work; written record
COLUMNS (sections A, B, C):
- Explanations and examples for each item
- Making a judgement (Yes, No, NA)
- Complementary information
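To make the layout concrete, here is a minimal sketch in Python of how the form's lines and columns could be represented. The class names, item code and item wording below are invented for illustration only; the real tool defines the actual items.

```python
# Minimal sketch of the observation form's structure (lines and columns).
# Item codes and descriptions are invented; the real tool defines the actual items.

from dataclasses import dataclass, field

@dataclass
class Item:
    code: str              # e.g. "2e" (illustrative)
    description: str       # column: explanation and examples for the item
    judgement: str = ""    # column: "YES", "NO" or "NA"
    evidence: str = ""     # column: complementary information

@dataclass
class ObservationForm:
    observer: str
    teacher: str
    session: str
    topic_and_objective: str
    sections: dict = field(default_factory=lambda: {
        "A: Teacher-Pupil Interactions": [],
        "B: Pupil Activities": [],
        "C: Pupils' Records": [],
    })

form = ObservationForm(observer="(name)", teacher="(name)",
                       session="Session 1", topic_and_objective="(topic)")
form.sections["A: Teacher-Pupil Interactions"].append(
    Item(code="2e", description="Teacher encourages fair testing"))  # hypothetical wording
```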

Using the tool: planning an evaluation
Read the instructions carefully!
- Decide on the objectives of your evaluation:
  - Diagnosis or assessment of impact?
  - Are the prerequisites for inquiry in place?
  - Are basic material needs covered (materials, books, classroom space)?
- Select and train the observers:
  - Your observer’s profile
  - How will you interpret each item within your IBSE programme?
  - Training observers as a formative process
- Plan the class visits:
  - Decide on an appropriate time of the school year for the observations
  - Decide on the number of consecutive sessions to observe: at least 2 sessions, ideally a full sequence

Using the tool: gathering the data
- Explain the purpose of your visit to the teacher: NOT to score them but to identify training needs
- Before or after the session: interview the teacher and record data directly on the form
- During the session: take notes on a separate sheet of paper; record data on the form only AFTER the session
- Record data on the form soon after your observation

Using the tool: recording the data on the form (Sections A, B, C)
1. Make your judgement for each item:
   - YES means the practice occurred AND is relevant in the context of the observation
   - NO means the practice did not occur at all, or only rarely, AND is relevant in the context of the observation
   - NA means the practice did not occur AND is not relevant in the context of the observation (content of the session? age of pupils?)
2. Provide qualitative evidence to support all NO and NA judgements
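The judgement rule above can be read as a small decision procedure. The sketch below only illustrates that logic in Python; it is not part of the tool, and the slide does not define the case where a practice occurred but was not relevant, so treating that case as NA is an assumption of the sketch.

```python
# Illustration of the YES / NO / NA judgement rule described above.

def judge(occurred: bool, relevant: bool) -> str:
    """Judgement for one item: 'occurred' means the practice was seen more than
    rarely; 'relevant' means it made sense in the context of this session."""
    if occurred and relevant:
        return "YES"   # the practice occurred and is relevant
    if not occurred and relevant:
        return "NO"    # the practice did not occur (or only rarely) although relevant
    if not occurred and not relevant:
        return "NA"    # the practice did not occur and is not relevant here
    # The slide leaves "occurred but not relevant" undefined; treating it as NA
    # is purely an assumption of this sketch.
    return "NA"

assert judge(occurred=True, relevant=True) == "YES"
assert judge(occurred=False, relevant=True) == "NO"
assert judge(occurred=False, relevant=False) == "NA"
```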

Using the tool: examples of NA judgements
- Items 4e-4i, on the execution of an experimental design: NA for a session where pupils did not actually carry out the experiment because it will happen in the next session
- Items 7a-7e, on pupils’ written records: may be NA for kindergartners

Analyzing the data: identifying high “No” and “NA” frequencies for each item
- A high “No” frequency indicates a need for attention within the CPD programme
- A high “NA” frequency raises questions: are teachers not creating the necessary opportunities for this aspect of IBSE to occur? Do they need support in building teaching plans?
- Qualitative data is the key to interpreting quantitative data
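As a rough sketch of the quantitative side of this step, the “No” and “NA” frequencies per item could be tallied as below. The item codes and judgements are invented example data, not real observations.

```python
# Sketch: tallying "No" and "NA" frequencies per item across completed forms.
# The judgements below are invented example data.

from collections import Counter, defaultdict

completed_forms = [                        # one dict per observed session
    {"1a": "YES", "2a": "NO", "2e": "NA", "4d": "NO"},
    {"1a": "NO",  "2a": "NO", "2e": "NA", "4d": "YES"},
    {"1a": "YES", "2a": "NO", "2e": "YES", "4d": "NO"},
]

counts = defaultdict(Counter)
for form in completed_forms:
    for item, judgement in form.items():
        counts[item][judgement] += 1

for item, tally in sorted(counts.items()):
    total = sum(tally.values())
    print(f"Item {item}: NO {tally['NO']}/{total}, NA {tally['NA']}/{total}")

# A high NO rate flags the item for attention in the CPD programme; a high NA rate
# prompts the qualitative question of why the opportunity never arose in class.
```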

Analyzing the data: grouping items for analysis
- Group items according to:
  - the priorities of your CPD programme
  - actions already undertaken in your CPD programme
  For example, group items 1a, 2a and 2b to focus on the appropriate use of questions by teacher and pupils.
- Comparing teaching and learning practices:
  - Identify corresponding items in sections A and B (e.g. items 2e and 4d on “fair testing”)
  - Do the “No” judgements correspond? If not, what does this mean?
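Continuing the sketch, grouping items and comparing corresponding section A and section B items could look as follows. The item codes 1a, 2a, 2b, 2e and 4d come from the examples on this slide, while the NO-frequencies and the divergence threshold are invented.

```python
# Sketch: grouping items and comparing corresponding section A / section B items.
# NO-frequencies are invented; item codes follow the examples on the slide.

no_frequency = {
    "1a": 0.6, "2a": 0.7, "2b": 0.5,   # use of questions by teacher and pupils
    "2e": 0.2,                          # section A item on "fair testing" (teacher)
    "4d": 0.8,                          # section B item on "fair testing" (pupils)
}

# Group items according to a CPD priority, e.g. appropriate use of questions.
question_group = ["1a", "2a", "2b"]
group_mean = sum(no_frequency[i] for i in question_group) / len(question_group)
print(f"Use-of-questions group, mean NO frequency: {group_mean:.2f}")

# Compare a teaching-practice item with its corresponding learning-practice item.
teacher_item, pupil_item = "2e", "4d"
gap = no_frequency[pupil_item] - no_frequency[teacher_item]
if abs(gap) > 0.25:                     # arbitrary, purely illustrative threshold
    print(f"Items {teacher_item} and {pupil_item} diverge (gap {gap:+.2f}): "
          "check the qualitative evidence to interpret the mismatch")
```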

What’s next?
- Becoming familiar with the items: “Preparing to observe IBSE”
- Observing a class session to collect “real” data
- Analyzing the class visits: what can we do with this data?
- What are the specificities of IBSE at different levels of schooling? Should the tool be used in the same manner at all levels?
- This is still a work in progress!

Any questions?
Please focus on questions concerning the use or the general structure of the tool. Questions on specific items and examples will be dealt with in the following workshop.
THANK YOU!