An overview of Assessment
Aims of the presentation
– Define and conceptualise assessment
– Consider the purposes of assessment
– Describe the key elements of a good assessment
– Describe the two main types of assessment
Definitions of assessment
– ‘… the process of documenting, oftentimes in measurable terms, knowledge, skills, attitudes and beliefs.’
– ‘The classification of someone or something with respect to its worth.’
Other words that can mean “assessment”
– Appraisal
– Categorisation
– Evaluation
– Judgement and value judgement
– Adjudication
– Estimation
In medical education, we can think of assessment as the process by which knowledge, skills and behaviours are tested and judgements made about competence or performance.
Purposes of Assessment
– Measuring competence
– Diagnosing a student's/trainee's problems
– Measuring improvement
– Showing the effectiveness of the curriculum/quality assurance
– Introducing curriculum change
– Identifying effective teaching
– Self-evaluation
– Ranking
– Motivation for teachers and learners
– Progress testing (developmental measures of improvement)
– Deriving income
The principles of assessment
What should be assessed?
The curriculum and assessment
– The curriculum and its outcomes should define the assessments
– The ideal assessment fits the curriculum
In real life there is often a less than perfect match
– The objectives are not fully and transparently defined
– Students will define their own “hidden curriculum” and may miss elements that are tested
Assessing all elements of the curriculum
A simple model: Miller's pyramid
– Knows (cognition)
– Knows how (cognition)
– Shows how (behaviour)
– Does (behaviour)
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
The assessment pyramid
– Knows / Knows how → Knowledge: written tests
– Shows how → Competence: OSCEs
– Does → Performance: workplace
The emerging assessment pyramid
– Adds meta-cognition to the Knows / Knows how / Shows how / Does levels
Constructing a “good” test
U = R x V x E x C x A
where U = utility, R = reliability, V = validity, E = educational impact, C = costs, A = acceptability
van der Vleuten CPM (1996). The assessment of professional competence: developments, research and practical implications. Adv Health Sciences Education 1(1): 41-67.
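The multiplicative form of the utility model can be made concrete with a small numeric sketch. All factor values below are invented for illustration (cost is expressed as a 0–1 cost-efficiency score, so an expensive test scores low); the point the model makes is that a near-zero score on any one factor drags the whole utility down.

```python
# A minimal sketch of van der Vleuten's utility index, U = R x V x E x C x A.
# All scores are hypothetical; each factor is scaled 0..1.

def utility(reliability, validity, educational_impact, cost_efficiency, acceptability):
    """Multiply the five factors together to give an overall utility in 0..1."""
    return reliability * validity * educational_impact * cost_efficiency * acceptability

# Hypothetical comparison of two test formats:
osce = utility(0.8, 0.9, 0.8, 0.5, 0.9)   # strong test, but expensive to run
mcq  = utility(0.9, 0.6, 0.5, 0.9, 0.8)   # cheap and reliable, weaker validity

print(f"OSCE utility: {osce:.3f}")
print(f"MCQ utility:  {mcq:.3f}")
```

Because the factors multiply rather than add, a test that is excellent on four dimensions but very poor on one still ends up with low overall utility.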
Reliability: if this test were administered again, would the results be the same? (repeatability/reproducibility)
– Would the assessors make the same judgements?
– Would a different set of questions result in a significantly different score?
– Were there other things that may have influenced the result?
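One common way to quantify the "different set of questions" concern is an internal-consistency coefficient such as Cronbach's alpha. The sketch below computes it from first principles; the candidate marks are invented for illustration.

```python
# A sketch of Cronbach's alpha (internal consistency): high alpha suggests
# the items are measuring the same underlying ability, so a different set
# of similar questions would likely give a similar score.

def cronbach_alpha(scores):
    """scores: one row per candidate, one column per item (marks)."""
    k = len(scores[0])                      # number of items
    n = len(scores)                         # number of candidates

    def var(xs):                            # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([cand[i] for cand in scores]) for i in range(k)]
    total_var = var([sum(cand) for cand in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Four hypothetical candidates, three items (marks out of 5):
marks = [
    [4, 5, 4],
    [3, 3, 2],
    [5, 4, 5],
    [2, 2, 1],
]
print(f"alpha = {cronbach_alpha(marks):.2f}")
```

Alpha alone does not address rater agreement (the first bullet above), which needs an inter-rater statistic such as kappa instead.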
Validity: does the test measure what it is intended to measure?
– Face validity: does it look appropriate?
– Content validity: does it assess only what you want to assess?
– Criterion validity: is it predictive of future performance?
Educational impact
[Diagram: the interplay between curriculum, assessment, student and teacher. From: Lambert Schuwirth]
Issues that affect educational impact
– The content: what is taught and learned
– The format: written, clinical
– Scheduling: when things happen
– Regulatory structure: assessment rules & information
Cost
– Balancing the cost against the benefit
Acceptability
– Political issues and support within the faculty
– Acceptance by the students
– Perceptions of fairness
Models of assessment
Formative assessment Provides information to the candidates about their strengths and weaknesses
Summative assessment
– A measure of end-point achievement
– Even summative tests can (and should) be formative
– Must be robust and defensible
Assessment guidelines to achieve a high utility index
Assessment guidelines
– Establish the purpose
– Define what is to be tested
– Blueprint to guide selection of items
– Select the most appropriate test method/format
– Administration and scoring
– Standard setting
Newble DI, Jolly B, Wakeford R, eds. The certification and recertification of doctors: issues in the assessment of clinical competence. Cambridge: Cambridge University Press; 1994.
Using the guidelines at Sheffield
– Establish the purpose → to graduate safe/competent junior doctors
– Define what is to be tested → outcome objectives for core clinical problems
– Blueprint to guide selection of items → assessments of knowledge & skills are blueprinted
– Select the most appropriate test method/format → range of tests used
– Administration and scoring → centralised assessment management
– Standard setting → Hofstee/borderline methods
Strategic management of assessment
[Diagram: the Assessment Committee and Curriculum Committee, supported by administrators and academics, oversee assessment across Phases 1-4]
Assessments used at Sheffield
Formative:
– On-line weekly tests
– Mini-CEX
Summative:
– Extended Matching Questions
– Modified Essay Questions
– Observed long cases
– Objective Structured Clinical Examination (OSCE)
– Professional behaviour
– Assessments of Student Selected Components
Formative assessments: weekly on-line tests (EMQs)
– Face-to-face feedback
– On-line personal results and peer performance
Formative assessments: Mini-CEX (Mini Clinical Evaluation Exercise)
– USA: National Board of Medical Examiners
– Performance testing in real practice
– Multiple observations/multiple observers
– Longitudinal assessment to assess professional growth
– Feedback is inherent in the assessment method
Mini-CEX rating form
Each domain is rated Excellent / Satisfactory / Borderline / Unsatisfactory:
– History taking
– Physical examination
– Communication skills
– Professionalism
– Overall clinical competence
Summative assessments
End of year tests of knowledge:
– Extended Matching Questions
– Modified Essay Questions
Tests of clinical competence:
– Observed long cases
– OSCE
Assessment of professional behaviour
SSCs – reports, presentations, essays, posters, leaflets
Observed long case
– Two observed long cases during final clinical attachments
– Successful completion is an entry requirement for the OSCE
– A “mastery” test – may be repeated
OSCE
– Separated from the knowledge test by 4 months
– 12 stations: history (3), physical examination (6), communication (2), X-ray interpretation (1)
– Was: 15 stations of 5 minutes each; now: 12 stations of 10 minutes each
– Checklist and global rating scoring
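The combination of checklist and global rating scoring mentioned in the last bullet can be sketched as a weighted blend per station. The 70/30 weighting and the rating-to-score mapping below are assumptions chosen for illustration, not Sheffield's actual scheme.

```python
# A sketch of combined checklist + global-rating scoring for one OSCE
# station. Weighting and rating scale are hypothetical.

GLOBAL_SCALE = {"unsatisfactory": 0.0, "borderline": 0.5,
                "satisfactory": 0.75, "excellent": 1.0}

def station_score(checklist, global_rating, checklist_weight=0.7):
    """checklist: list of 0/1 item marks; returns a score in 0..100."""
    checklist_pct = sum(checklist) / len(checklist)
    global_pct = GLOBAL_SCALE[global_rating]
    return 100 * (checklist_weight * checklist_pct
                  + (1 - checklist_weight) * global_pct)

# A candidate completes 8 of 10 checklist items and is rated "satisfactory":
print(f"Station score: {station_score([1] * 8 + [0] * 2, 'satisfactory'):.1f}")
```

Keeping a global rating alongside the checklist also supplies the examiner judgements needed for borderline-based standard setting.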
Professional behaviours
– Longitudinal assessment of professional behaviours to give reliability
– Assessed against outcome objectives and Good Medical Practice (GMC 2001)
– Borderline/unsatisfactory performance triggers an interview
– Consistent poor performance may lead to review by the Fitness to Practise committee and possible exclusion
Student Selected Components (SSCs)
– 25% of the overall course
– Beyond the core curriculum (depth and/or breadth)
– Various modes of assessment
– Assess generic skills, critical analysis, ethics, research skills, clinical understanding