Assessment 101
Zubair Amin, MD, MHPE
“Assessment drives student learning.”
George E. Miller
“Assessment drives learning in at least four ways: its content, its format, its timing and any subsequent feedback given to the examinee.”
van der Vleuten, C. (1996). The Assessment of Professional Competence: Developments, Research and Practical Implications. Advances in Health Sciences Education, 1, 41–67.
“The ‘law’ of educational cause and effect states that: for every evaluative action, there is an equal (or greater) (and sometimes opposite) educational reaction.”
Schuwirth, L.W.T. (2001). General Concerns About Assessment.
“Assessment drives learning in the direction you wish.”
Evolution of Medical Students, a website by NUS students
Linking Learning, Assessment, and Feedback
Assessment
Learning, teaching, and assessment must not be viewed as isolated concepts. Ideally, effective teaching, effective learning, and effective assessment are all parts of the same educational process. The value of changing assessment to reflect what needs to be learned is evident, since students learn what they know they will be tested on. (Doyle, W. (1983). Academic Work. Review of Educational Research, 52.) Over the past two to three decades there has been a great deal of discussion about the nature of assessment, and we now have much broader notions of what assessment should be doing.
We should assess what we teach and teach what we assess.
Assessment Fundamentals
Why do we assess?
What should we assess?
When should we assess?
How should we assess?
Why Do We Assess?
To determine whether learning outcomes are met
To support students’ learning
For certification and competency judgment
For teaching program development and implementation
For accountability
To understand the learning process
Assessment Serves Multiple Stakeholders
Students
Teachers
Departments, faculties, universities, and administrators
The public and governmental agencies
Stakeholders’ interests in assessment are not necessarily aligned.
Stakeholders’ Priorities
Students
Teachers
Faculty, university
Public, government
What Should We Assess?
Knowledge and Performance
Miller’s pyramid, in order of increasing professional authenticity:
Knows (cognition)
Knows how (cognition)
Shows how (performance)
Does (performance)
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65(Suppl):S63–S67.
What Should We Assess?
Each level of Miller’s pyramid maps to a type of assessment:
Does: performance assessment in vivo
Shows how: performance assessment in vitro
Knows how: context-based tests
Knows: factual tests
When Should We Assess?
Date of examination
The concept of mastery: an ‘all or none’ state? Not really.
Continuum of Performance
‘Learning Curve’
An examination that attempts to test students’ mastery at a single point in time is less preferable than one that tests mastery over a span of time.
How Should We Assess?
The utility of an assessment instrument depends on:
Validity
Reliability
Educational impact
Cost
Acceptability
Utility = validity × reliability × educational impact × cost-effectiveness × acceptability
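Because van der Vleuten's utility formula is multiplicative, a very low score on any one criterion drags overall utility toward zero, no matter how strong the others are. A minimal sketch of that behaviour (the 0-to-1 rating scale and the function name are illustrative assumptions, not part of the original model):

```python
def utility(validity, reliability, educational_impact,
            cost_effectiveness, acceptability):
    """Van der Vleuten's multiplicative utility model.

    Each criterion is rated on an illustrative 0-1 scale.
    Because the criteria multiply, a score of zero on any
    one of them drives overall utility to zero.
    """
    return (validity * reliability * educational_impact
            * cost_effectiveness * acceptability)

# A test that is valid and reliable but wholly unacceptable
# to stakeholders has zero overall utility:
print(utility(0.9, 0.9, 0.8, 0.7, 0.0))  # 0.0
```

The formula is conceptual rather than computational: in practice the criteria are weighed against each other as trade-offs, but the multiplication captures the key point that no single strength compensates for a fatal weakness.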
Validity
Validity is the ability of an assessment instrument to test what it is supposed to test.
Example: a course aims to determine whether students are able to communicate effectively. What assessment instrument would you choose for that purpose? An essay?
Content validity: the ability of an assessment instrument to sample representative content of the course.
(Diagram: overlap between course content and assessment)
Reliability
Reliability refers to the consistency of test scores; the concept is linked to specific types of consistency:
Over time
Between different examiners
Across different testing conditions
Instruments for student assessment need high reliability to ensure transparency and fairness.
Question   Ex 1   Ex 2   Ex 3   Ex 4   Ex 5
Q 1         X
Q 2
Q 3
Q 4
Q 5
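One common way to put a number on the examiner-to-examiner consistency the grid above illustrates is an internal-consistency index such as Cronbach's alpha, treating examiners as parallel raters of the same questions. A minimal sketch; the choice of index and the example score matrix are illustrative assumptions, not something specified in the slides:

```python
import statistics

def cronbach_alpha(scores):
    """Cronbach's alpha for a questions-by-examiners score matrix.

    scores[i][j] is the mark examiner j awarded on question i.
    Alpha near 1 means examiners rate consistently; a low alpha
    signals the examiner-to-examiner variability that undermines
    transparency and fairness.
    """
    k = len(scores[0])                      # number of examiners
    examiner_vars = [
        statistics.pvariance([row[j] for row in scores])
        for j in range(k)
    ]
    totals = [sum(row) for row in scores]   # total score per question
    return k / (k - 1) * (1 - sum(examiner_vars)
                          / statistics.pvariance(totals))

# Five examiners score five questions in perfect agreement:
uniform = [[q, q, q, q, q] for q in (1, 2, 3, 4, 5)]
print(cronbach_alpha(uniform))  # 1.0
```

With real scores the examiners disagree somewhat and alpha falls below 1; broad sampling across both questions and examiners is what pushes it back up.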
Educational impact
“There is probably more bad practice and ignorance of significant issues in the area of assessment than in any other aspect of higher education.”
Boud, 1995
“… this would not be so bad if it were not for the fact that the effects of bad practice are far more potent than they are for any aspect of teaching. Students can, with difficulty, escape from the effects of poor teaching; they cannot (by definition, if they want to graduate) escape from the effects of poor assessment.”
Boud, 1995
Assessment Is a Moral Activity
“Assessment is a moral activity. What we choose to assess and how shows quite starkly what we value.”
Knight, 1995
Criteria for Good Assessment
Assessments need to be reproducible (reliable), valid, feasible, fair, and beneficial to learning.
The content and form of assessments need to be aligned with their purpose and desired outcomes.
Student performance is case- or content-specific, and broad sampling is needed to achieve an accurate representation of ability (multiple biopsies).
Systematically derived pass–fail scores and the overall reliability of an assessment are important.
Assessments need to be constructed according to clearly defined standards and derived using systematic and credible methods.
Norcini et al. (2011). Criteria for Good Assessment. Medical Teacher.
Backbone of Assessment
Select a few assessment tools for most assessment purposes: high quality, high psychometric value, and relatively easy to administer.
Look at clinical competency as a whole (i.e., at the programmatic level).
Use assessment to create and support learning.