
1 Student’s Assessment Dr. A. M. Kadri, Associate Professor, PSM Dept., PDU Govt. Medical College, Rajkot

2

3 “Students read, not to learn but to pass the examination. They pass the examination but they do not learn.” – Huxley

4 What is an assessment? Any systematic method of obtaining evidence (from tests, examinations, questionnaires, surveys and collateral sources) used to draw inferences about the competence of students for a specific purpose.

5 Evaluation Evaluation is a judgment regarding the quality or worth of the assessment results. This judgment is based upon multiple sources of assessment information.

6 Qualitative and quantitative measurement of students’ behaviour + value judgment = evaluation

7 “For no matter how appealing the statement of goals, how logical the programme organization, how dazzling the teaching method, it was the examination that communicated most vividly to students what was expected of them.” – George E. Miller

8 Your experiences with Assessment

9 Critical questions in assessment 1. WHY are we doing the assessment? 2. WHAT are we assessing? 3. HOW are we assessing it? 4. HOW WELL is the assessment working?

10 Why are we doing the assessment?

11 Purpose of assessment • To determine whether previously set learning objectives are met. • To support student learning. • Certification and judgment of competency. • Development and assessment of the teaching programme. • Understanding the learning process. • Predicting future performance.

12 Purpose of assessment DEFINE THE MINIMUM ACCEPTABLE LEVEL OF COMPETENCE • Prove (that he/she is a competent doctor) – SUMMATIVE • Improve (provide feedback regarding shortcomings) – FORMATIVE

13 Purpose of assessment • For selecting a few students from a large number. • Pre-assessment of a learner’s needs. • For continuous monitoring of learning activities to provide feedback. • For certifying competence on completion of a course.

14 2. WHAT are we testing? Elements of competence: Knowledge (factual; applied: clinical reasoning), Skills (communication; clinical), Attitudes (professional behaviour)

15 3. How are we doing the assessment? • Essays • Short answer questions • Objective items (supply and selection type) • Simulated patient management problems • Assignments • Practical • Clinical • OSPE • OSCE • Rating scales • Checklists • Questionnaires • Diary and logbook

16 What can we assess? Take a colonoscopy to diagnose a broken leg? Totally stupid? The PURPOSE determines the method chosen! Without clarifying the purpose, the procedure is ineffective.

17 What to Assess?
Domain: Cognitive (knowledge). Method: written tests, oral. Instruments: open-ended or essay questions, structured essay or MEQ, short-answer questions, objective (MCQ), simulated patient management problems, assignment questions.
Domain: Psychomotor (skills). Method: observation. Instruments: practical (actual and model), clinical cases, objective structured clinical/practical examination.

18 What to Assess?
Domain: Affective (attitude). Method: observation. Instruments: rating scales, checklists, questionnaires, logbook, daily evaluation sheets.

19 What can we assess? Test formats, by level of Miller’s pyramid:
Knows – factual tests: SBAs, essays, SAQs
Knows how – (clinical) context-based tests: SBAs, EMQs, SAQs
Shows how – performance assessment in vitro: OSCEs, OSPEs
Does – performance assessment in vivo: mini-CEX, DOPS

20 4. HOW WELL is the assessment working? Evaluation of assessment systems Is it valid? Is it reliable? Is it doing what it is supposed to be doing? To answer these questions, we have to consider the characteristics of assessment instruments

21 Characteristics of Assessment: • Relevance: Is it appropriate to the needs of the society or system? • Validity: Does the assessment tool really test what it intends to test? • Reliability: accuracy and consistency. • Objectivity: Will the scores obtained by the candidate be the same if evaluated by two or more independent experts? • Feasibility: Can the process be implemented in practice?

22 RELEVANCE • Relevance refers to the appropriateness of the process of evaluation with reference to the jobs to be performed by the student after qualification; it should therefore reflect the health needs of the society. • The relevance of the process should be obvious to both teachers and students for the test to be taken seriously and for the results to reflect levels of achievement.

23 VALIDITY Refers to the degree to which a test measures what it intends to measure. • In choosing an instrument, the first question the teacher should consider is the learning outcome sought to be measured. • Validity refers both to the results of the test and to the instrument itself.

24 Factors Influencing Validity of Evaluation Tool: Test factors • Unclear directions. • Difficult and ambiguous wording of questions. • Poorly constructed items. • Inappropriate level of difficulty. • Inappropriate questions for the outcome being measured. • Inappropriate arrangement of items. • Identifiable patterns of answers and clues. • Too short or too long a test. • Errors in scoring. • Adverse classroom and environmental factors.

25 Factors Influencing Validity of Evaluation Tool: Student factors • Adoption of unfair means. • Emotional disturbance. • Lack of motivation.

26 RELIABILITY Consistency with which an instrument measures the variable. • Reliability is a measure of the reproducibility of the test. • Reliability is a mathematical concept: a measure of the correlation between two sets of scores. To obtain two sets of scores, one of three alternatives is available: a. Test-retest: the same test is administered to the same students on two occasions and the two sets of scores compared. b. Equivalent tests: two tests of equivalent form are administered to the students to obtain two sets of scores. c. Split-half method: a single test is split into two halves (for example, odd- and even-numbered MCQs) and the two sets of scores for each student compared.
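The split-half method can be made concrete with a short calculation. Below is a minimal Python sketch, assuming each student's items are scored 1 (correct) or 0 (wrong); the data and function name are illustrative, and the final step is the standard Spearman-Brown correction for halving a test.

from statistics import correlation  # Python 3.10+

def split_half_reliability(item_scores):
    # Split each student's items into odd- and even-numbered halves
    odd_totals = [sum(student[0::2]) for student in item_scores]
    even_totals = [sum(student[1::2]) for student in item_scores]
    # Pearson correlation between the two half-test scores
    r_half = correlation(odd_totals, even_totals)
    # Spearman-Brown correction estimates full-length reliability
    return (2 * r_half) / (1 + r_half)

# Illustrative data: five students, eight MCQ items each
scores = [
    [1, 1, 0, 1, 1, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 0, 1],
    [0, 0, 0, 1, 0, 0, 1, 0],
    [1, 0, 1, 1, 1, 1, 1, 1],
]
print(round(split_half_reliability(scores), 2))

A value close to 1 means the two halves rank students consistently; a value near 0 means the test is behaving erratically.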

27 MEASURES TO IMPROVE RELIABILITY • Increase the length of the test to the optimum level (the sketch below shows how the expected gain can be predicted). • Use appropriate levels of difficulty and appropriate levels of discrimination to ensure a wide spread of scores. • Keep test conditions constant. • Ensure objectivity of scoring. • Ensure validity of the instrument used.
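The first measure can be quantified: the Spearman-Brown prophecy formula, r_k = k * r / (1 + (k - 1) * r), predicts the reliability of a test lengthened k-fold with comparable items. A minimal sketch with illustrative numbers:

def spearman_brown(r_current, k):
    # Predicted reliability when a test is lengthened k-fold
    # with items comparable to the existing ones
    return (k * r_current) / (1 + (k - 1) * r_current)

# Doubling a test whose current reliability is 0.70
print(round(spearman_brown(0.70, 2), 2))  # 0.82

The gain flattens as k grows, which is why the slide says "optimum level" rather than "as long as possible".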

28 OBJECTIVITY • Degree of agreement between the judgments of independent and competent examiners (this agreement can be quantified, as in the sketch below). • Objectivity of the evaluation process should be maintained. Steps to increase objectivity of scoring in conventional examinations: structuring of questions; preparation of model answers; agreeing on the marking scheme; having papers independently evaluated by two or more examiners.
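One common statistic for inter-examiner agreement is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch using hypothetical pass/fail judgments from two examiners:

def cohens_kappa(marks_a, marks_b):
    # Proportion of candidates on whom the two examiners agree
    n = len(marks_a)
    p_observed = sum(a == b for a, b in zip(marks_a, marks_b)) / n
    # Agreement expected by chance, from each examiner's marginal rates
    labels = set(marks_a) | set(marks_b)
    p_expected = sum(
        (marks_a.count(label) / n) * (marks_b.count(label) / n)
        for label in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

examiner_1 = ["pass", "pass", "fail", "pass", "fail", "pass"]
examiner_2 = ["pass", "fail", "fail", "pass", "fail", "pass"]
print(round(cohens_kappa(examiner_1, examiner_2), 2))  # 0.67

A kappa of 1 indicates perfect agreement; 0 indicates agreement no better than chance.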

29 FEASIBILITY • Considering the ground realities, an evaluation process should be feasible. • Factors to be considered in deciding feasibility: • Time and resources required • Availability of an equivalent form of the test for measuring reliability • Ease of administration, scoring and interpretation

30 Aligning Assessment with Objectives: There are two major reasons for aligning assessments with learning objectives. First, alignment increases the probability that we will provide students with opportunities to learn and practise the knowledge and skills that will be required on the various assessments we design. Second, when assessments and objectives are aligned, “good grades” are more likely to translate into “good learning”. When objectives and assessments are misaligned, many students will focus their efforts on activities that lead to good grades on assessments rather than on learning what we believe is important.

31 Keep in mind the following questions: What will students’ work on the activity (multiple-choice answers, essays, projects, presentations, etc.) tell me about their level of competence on the targeted learning objectives? How will my assessment of their work help guide students’ practice and improve the quality of their work? How will the assessment outcomes for the class guide my teaching practice?

32 Systematically designed assessment • Is relevant to the curriculum. • Focuses on important skills. • Promotes learning of skills. • Spells out the level of attainment. • Discriminates between good and poor students. • Provides feedback.

33 Tips for Framing Questions • Clearly define the learning outcome to be assessed. • Be precise and clear. • Use explicit terms: identify, compare and contrast, define, give reasons, etc. • Define the correct answer and scoring scheme in advance. • Ensure all the topics are covered. • Avoid repetition. • Have a colleague critically review the questions.

34 To sum up: know the principles of assessment, know the purpose of assessment, know the characteristics of assessment tools, use appropriate types of assessment, and align assessment with the objectives.

35 “Changing the examination system without changing the curriculum has a much more profound impact upon the nature of learning than changing the curriculum without attending to the examination.” – G. E. Miller “Whoever controls the examination controls the curriculum.”

36

37 Thank you.

