Dr Ramesh Mehay TPD Bradford GP Training Scheme

Presentation transcript:

Assessment – and its alignment
Dr Ramesh Mehay, TPD, Bradford GP Training Scheme

Exercise
Useful questions to think about: What is assessment? What do we want to achieve? How do we measure success? How do we know that the measure of success is the right one?
In groups (5 minutes): define assessment and think about the different types of assessment YOU have undergone – write them on a flipchart.
What we do know is that assessment leads to anxiety…

Definition: the processes and instruments applied to measure learning achievements.

Matching the method of assessment to the ILO – constructive alignment
Why does the driving test have both a theory and a practical component? Isn't the practical good enough? Can you link this with Bloom's taxonomy?
Constructive alignment: the selected method is clearly linked with the objective to be tested.
Example: the driving test has two components, theory and practical. Content and method are matched to the desired outcomes – that an individual understands the Highway Code, can recognise and respond to hazards, and demonstrates practical driving skills at a predetermined level of competence.

Miller’s Pyramid

CONSTRUCTIVELY ALIGNING A NEW TEACHING SESSION
Designing a teaching session – the ACME way:
A is for setting Aims & Intended Learning Outcomes (ILOs)
C is for defining Content
M is for exploring Methodologies
E is for the Ending (= summarising, evaluation, assessment & future learning)
And it works like this. Define the Aims and ILOs (A) – this is the most crucial step. Then:
Use the Aims & ILOs (A) to define the Content (C)
Use the Aims & ILOs (A) to select the Methodology (M)
Use the Aims & ILOs (A) to construct the evaluative Assessment (E)

Assessments need to be reliable and valid. Reliable? (competence level) Valid? (Bloom, Miller) There is an inverse relationship between the two.
Reliability – the consistency with which a test will produce similar results. How do we know a test score is reliable?
Test–retest
Split-half reliability (the correlation between scores on different components of the test)
Measures of inter-rater reliability
Measures of intra-rater reliability
Standard setting (Ebel and Angoff)
Validity – does the test measure what we want it to measure? For example, written tests assess knowledge and reasoning but not skills.
Content validity – e.g. integrated summative exams in the MBChB
Construct validity – measures things that are a proxy for a construct that cannot be directly observed, e.g. attitudes
Face validity – whether, on the face of it, the test appears to measure what it is supposed to measure
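To make two of the items above concrete – split-half reliability (with the Spearman-Brown correction) and Angoff standard setting – here is a short, illustrative Python sketch. It is not part of the original slides, and the candidate scores and judge estimates are invented purely for illustration.

from statistics import correlation, mean  # correlation() needs Python 3.10+

# --- Split-half reliability with the Spearman-Brown correction ---
# Correlate candidates' scores on the odd- and even-numbered items,
# then step the half-test correlation up to the full test length.
odd_item_scores = [12, 15, 9, 14, 11, 16, 10]    # hypothetical candidates
even_item_scores = [13, 14, 10, 15, 10, 17, 9]

r_half = correlation(odd_item_scores, even_item_scores)
split_half = (2 * r_half) / (1 + r_half)          # Spearman-Brown formula
print(f"Split-half reliability: {split_half:.2f}")

# --- Angoff standard setting ---
# Each judge estimates, item by item, the probability that a minimally
# competent candidate would answer correctly; the cut score is the mean
# estimate per item, summed across items.
judge_estimates = [
    [0.7, 0.5, 0.9, 0.6],   # judge 1, items 1-4
    [0.6, 0.6, 0.8, 0.5],   # judge 2
    [0.8, 0.4, 0.9, 0.7],   # judge 3
]
item_means = [mean(item) for item in zip(*judge_estimates)]
cut_score = sum(item_means)
print(f"Angoff cut score: {cut_score:.2f} out of {len(item_means)}")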

Competency level

Exercise – MRCGP components: reliability vs validity

The Assessment Drives the Learning
For many students, assessment is not an educational experience in itself, but a process of 'guessing what the teacher wants' (McLaughlin & Simpson, 2004).
An assessment can be a powerful way of making a trainee learn something – but do they engage in it with their hearts? And doesn't it just cause unnecessary anxiety, which can mask true performance? The BIG question is: how do we drive deep learning rather than just teach to the test? Are CPD, performance monitoring and feedback the way to go?

FORMATIVE & SUMMATIVE ASSESSMENT

Summative Assessment | Formative Assessment
A single process, like an exam. | A continuous process that occurs throughout the training period.
Gives the learner a 'ticket' or certificate to proceed. | Is about helping the learner develop.
Results in a grade of competence. | Identifies strengths and weaknesses.
Results in a certificate or grade of competence. | Results in developmental feedback.
Often quite cold, remote and detached from teaching. | Results in a Personal Development Plan.
Is often seen as a negative process, e.g. 'Thank goodness the AKT is over.' | Is seen as a positive process because it is all about helping the trainee, not telling them off.
Examples in GP training: AKT, CSA, WPBA (COTs, mini-CEXs, CBDs etc.) | Examples in GP training: CS meetings, ES meetings, PCA, RCA, Appraisal
Summative assessment is an assessment OF learning; formative assessment is an assessment FOR learning. In summative assessment, the trainee tries to hide what they are bad at; in formative assessment, they should be comfortable about displaying it.