Where are we with assessment and where are we going?

Where are we with assessment and where are we going? Cees van der Vleuten University of Maastricht This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee

Overview of presentation Where is education going? Where are we with assessment? Where are we going with assessment? Conclusions

Where is education going? School-based learning Discipline-based curricula (Systems) integrated curricula Problem-based curricula Outcome/competency-based curricula

Where is education going? Underlying educational principles: Continuous learning of, or practising with, authentic tasks (in steps of increasing complexity, with constant attention to transfer) Integration of cognitive, behavioural and affective skills Active, self-directed learning, in collaboration with others Fostering of domain-independent skills and competencies (e.g. team work, communication, presentation, science orientation, leadership, professional behaviour…).

Where is education going? These underlying principles are grounded in constructivism, cognitive psychology, collaborative learning theory, cognitive load theory and empirical evidence.

Where is education going? Work-based learning Practice, practice, practice… Optimising learning by: More reflective practice More structure in the haphazard learning process More feedback, monitoring, guiding, reflection and role modelling Fostering of a learning culture or climate Fostering of domain-independent skills (professional behaviour, team skills, etc.).

Where is education going? These approaches to optimising work-based learning are grounded in deliberate practice, emerging work-based learning theories and empirical evidence.

Where is education going? Educational reform is on the agenda everywhere Education is professionalizing rapidly A lot of ‘educational technology’ is available How about assessment?

Overview of presentation Where is education going? Where are we with assessment? Where are we going with assessment? Conclusions

Expanding our toolbox… Established technology of efficient written or computer-based high-fidelity simulations (MCQ, key feature, script concordance test, MEQs…). [Miller's pyramid: the 'Knows' and 'Knows how' levels.]

Expanding our toolbox… Established technology of structured high-fidelity in vitro simulations requiring behavioural performance (OSCE, SP-based testing, OSPE…). [Miller's pyramid: the 'Shows how' level.]

Expanding our toolbox… Emerging technology of appraising in vivo performance (work-based assessment: clinical work sampling, mini-CEX, portfolio, practice visits, case orals…). [Miller's pyramid: the 'Does' level.]

Expanding our toolbox… Emerging technology of appraising in vivo performance (self-, peer and co-assessment, portfolio, multisource feedback, learning process evaluations…). [Miller's pyramid applied to 'domain-independent' as well as 'domain-specific' skills.]

What have we learned? Competence is specific, not generic

Reliability as a function of testing time (testing time in hours)

Method                                                      1 h     2 h     4 h     8 h
MCQ (Norcini et al., 1985)                                  0.62    0.76    0.93    -
PMP (Norcini et al., 1985)                                  0.36    0.53    0.69    0.82
Case-based short essay (Stalenhoef-Halling et al., 1990)    0.68    0.73    0.84    0.82
Oral exam (Swanson, 1987)                                   0.50    0.69    0.82    0.90
Long case (Wass et al., 2001)                               0.60    0.75    0.86    0.90
OSCE (Petrusa, 2002)                                        0.47    0.64    0.78    0.88
Mini-CEX (Norcini et al., 1999)                             0.73    0.84    0.92    0.96
Practice video assessment (Ram et al., 1999)                0.62    0.76    0.93    -
Incognito SPs (Gorter, 2002)                                0.61    0.76    0.92    0.93
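Reliability projections of this kind are often obtained by extrapolating a one-hour estimate with the Spearman-Brown prophecy formula; that is an assumption here, since the slide does not say how each figure was derived and several rows come from study-specific analyses. A minimal Python sketch, purely for illustration:

```python
def spearman_brown(r1: float, k: float) -> float:
    """Projected reliability when the amount of testing is multiplied by k,
    given a reliability of r1 for the one-hour baseline."""
    return k * r1 / (1 + (k - 1) * r1)

# One-hour reliabilities taken from the table above.
one_hour = {"Mini-CEX": 0.73, "OSCE": 0.47, "Oral exam": 0.50}

for method, r1 in one_hour.items():
    projections = [round(spearman_brown(r1, k), 2) for k in (1, 2, 4, 8)]
    print(f"{method:10s} {projections}")

# Mini-CEX   [0.73, 0.84, 0.92, 0.96]  -> matches the table row
# OSCE       [0.47, 0.64, 0.78, 0.88]  -> matches the table row
# Oral exam  [0.5, 0.67, 0.8, 0.89]    -> close to the reported 0.69 / 0.82 / 0.90
```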

What have we learned? Competence is specific, not generic Any single point measure is flawed One measure is no measure No method is inherently superior Subjectivity and unstandardised conditions are not something to be afraid of.

What have we learned? Competence is specific, not generic One method can’t do it all

Magic expectations… [Miller's pyramid mapping on the slide: direct observation methods and portfolios at the 'Does' level; OSCEs at 'Shows how'; key features and short cases at 'Knows how' and 'Knows'.]

What have we learned? Competence is specific, not generic One method can’t do it all One measure is no measure We need a mixture of methods to cover the entire pyramid We can choose from a rich toolbox!

What have we learned? Competence is specific, not generic One method can’t do it all Assessment drives learning

Assessment and learning “The in-training assessment programme was perceived to be of benefit in making goals and objectives clear and in structuring training and learning. In addition, and not surprisingly, this study demonstrated that assessment fosters teaching and learning…” (Govaerts et al., 2004, p. 774)

Assessment and learning “Feedback generally inconsistent with and lower than self-perceptions elicited negative emotions. They were often strong, pervasive and long-lasting….” (Sargeant et al., under editorial review)

Assessment and learning “You just try and cram - try and get as many of those facts into your head just that you can pass the exam and it involves… sadly it involves very little understanding because when they come to the test, when they come to the exam, they’re not testing your understanding of the concept. They test whether you can recall ten facts in this way?” (Student quote from Cilliers et al., in preparation)

The continuous struggle [diagram: the learner caught between the curriculum and the assessment programme, with its content, format, programming/scheduling, regulations, standards, examiners…]

What do we know? Competence is specific, not generic One method can’t do it all Assessment drives learning Verify the consequences Use the effect strategically Educational reforms are only as good as the assessment allows them to be.

Overview of presentation Where is education going? Where are we with assessment? Where are we going with assessment? Conclusions

My assumptions Innovation in education programmes can only be as successful as the assessment programme is Assessment should reinforce the direction in which education is going Future directions should build on our existing evidence about what matters in assessment.

The Big Challenge Established assessment technologies have been developed in the conventional psychometric tradition of standardisation, objectification and structuring Emerging technologies are in vivo and by nature less standardised: unstructured, noisy, heterogeneous and subjective Finding an assessment answer beyond the classic psychometric solutions is The Big Challenge for the future.

Design requirements for future assessment Dealing with real life: In vivo assessment cannot and should not be (fully) standardised, structured and objectified It includes quantitative AND qualitative information Professional and expert judgement play a central role.

Design requirements for future assessment Dealing with learning: All assessment should be meaningful to learning, and thus information rich Assessment should be connected to learning (the framework of the curriculum and that of the assessment are identical) Assessment is ‘embedded’ in learning (this is the ‘in vivo’ of educational practice and adds significantly to the complexity).

Design requirements for future assessment Dealing with sampling: Assessment is programmatic It is comprehensive, including domain-specific and domain-independent skills It combines sampling across many information sources, methods, examiners/judges and occasions… It is planned, coordinated, implemented, evaluated and revised (just like a curriculum design).
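As a purely hypothetical illustration of this sampling idea (none of the class or field names below come from the presentation), a programmatic assessment plan can be thought of as a growing collection of individually low-stakes data points that is aggregated per competency across methods, assessors and occasions:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DataPoint:
    """One low-stakes assessment observation in the programme."""
    learner: str
    method: str                    # e.g. "MCQ", "mini-CEX", "multisource feedback"
    competency: str                # domain-specific or domain-independent skill
    assessor: str
    occasion: str                  # e.g. "clerkship week 12"
    score: Optional[float] = None  # quantitative information, if any
    narrative: str = ""            # qualitative, information-rich feedback

@dataclass
class AssessmentProgramme:
    data_points: list = field(default_factory=list)

    def add(self, dp: DataPoint) -> None:
        self.data_points.append(dp)

    def evidence_for(self, learner: str, competency: str) -> list:
        """All evidence on one competency, across methods, assessors and
        occasions, to be weighed later by professional judgement."""
        return [dp for dp in self.data_points
                if dp.learner == learner and dp.competency == competency]

    def coverage(self, learner: str) -> dict:
        """Number of data points per competency: a crude check on whether
        the sampling plan is broad enough."""
        counts: dict = {}
        for dp in self.data_points:
            if dp.learner == learner:
                counts[dp.competency] = counts.get(dp.competency, 0) + 1
        return counts
```

In such a sketch the high-stakes decision still rests on holistic professional judgement over the aggregated evidence; the structure only makes the sampling plan explicit, which is the design requirement the slides place on the programme rather than on any single instrument.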

Challenges we face Dealing with real life: How to use professional judgement? Do we understand judgement? How to elicit, structure and record qualitative information? How to use (flexible) standards? What strategies for sampling should we use? When is enough enough? How to demonstrate rigour? What (psychometric, statistical, qualitative) models are appropriate?

Challenges we face Dealing with learning: What are methodologies for embedding assessment (e.g. Wilson & Sloane, 2000)? How to deal with the confounding of the teaching and assessor role? How to combine formative and summative assessment? How to involve stakeholders? How to educate stakeholders?

Challenges we face Dealing with sampling at the programme level: What strategies are useful in designing a sampling plan or structure of an assessment programme? How to combine qualitative and quantitative information? How to use professional judgement in decision making on aggregated information? How to longitudinally monitor competence development? What are (new) strategies for demonstrating rigour in decision making? What formal models are helpful?

Contrasting views in approach (conventional assessment vs. programmatic embedded assessment): Assessment separate from learning vs. assessment as part of learning; method-centred vs. programme-centred (based on an overarching cohesive structure); context free vs. context matters (the dynamic relation between an ability, a task and the context in which the task occurs; Epstein & Hundert, 2002).

Contrasting views in approach (conventional vs. programmatic embedded assessment): Separation of formative and summative assessment vs. combined formative and summative assessment; traits (inferred dispositions) vs. states (directly meaningful, situational entities); ‘hard’ competencies vs. ‘hard’ and ‘soft’ competencies.

Contrasting views in approach (conventional vs. programmatic embedded assessment): Standardised and structured vs. real-life circumstances; decision driven (pass/fail) vs. feedback driven (what needs improvement); reductionistic (ticking boxes, scoring, grading, qualifying) vs. information rich (including narrative, descriptive, qualitative information); fixed standards vs. flexible standards.

Contrasting views in approach (conventional vs. programmatic embedded assessment): Ownership lies with external administrative bodies vs. ownership lies with teachers and learners (within a master plan); analytical scoring with restricted human judgement vs. holistic appraisal relying on professional judgement (at the level of the individual situation as well as the programme).

Contrasting views in approach (conventional vs. programmatic embedded assessment): Point assessment vs. longitudinal, developmental, continuous assessment; credit points in a database vs. thorough documentation of progress; one method per skill vs. multimodal assessment.

Contrasting approaches in research (conventional vs. programmatic embedded assessment): Rigour defined in direct (statistical) outcome measures vs. rigour defined by evidence of the trustworthiness or credibility of the assessment process; reliability/validity vs. saturation of information and triangulation; benchmarking vs. accounting.

Contrasting approaches in research (conventional vs. programmatic embedded assessment): Psychometric vs. edumetric/educational; evidence to predict future performance vs. evidence of being exposed to the right training; controlled experimentation vs. naturalistic experimentation.

Contrasting approaches in research (conventional vs. programmatic embedded assessment): Instrument improvement vs. system or programme improvement; instrument utility defined as reliability + validity vs. instrument utility depending on its place and function in the assessment programme.

Contrasting views in approach (conventional vs. programmatic embedded assessment): Confused?

Overview of presentation Where is education going? Where are we with assessment? Where are we going with assessment? Conclusions

Conclusions Assessment has made tremendous progress Good assessment practices based on established technology are implemented widely Sharing of high quality assessment material has begun (IDEAL, UMAP, Dutch consortium)

Conclusions We are facing a major next step in assessment We have to deal with the real world The real world is not only the work-based setting but also the educational training setting

Conclusions To make that step: We need to think out of the box New methodologies to support assessment strategies New methodologies to validate the assessment

Conclusions There is a lot at stake: Educational reform depends on it I’m here because I couldn’t change the assessment

Conclusions Let’s join forces to make that next step!

This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee