
1 Extended Matching Sets Questions for Numeracy Assessments: A Case Study
Alan J. Cann, Department of Microbiology & Immunology, University of Leicester

2 Extended Matching Sets Questions
Extended matching sets questions (EMSQs) are a form of multiple choice question (MCQ) consisting of a stem (question or scenario) plus an extended number of possible answers. For this case study, a multiple choice question with ten or more alternative answers is considered to be an EMSQ. Use of the EMSQ format in online assessment of numeracy shows that properly constructed questions of this type can also play a valuable role in assessment of numerical ability.
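As a minimal illustration (not taken from the slides), an EMSQ item of this kind can be thought of as a stem plus ten or more numerical options with a single keyed answer. The field names and the example stem below are invented for this sketch only.

```python
# Minimal sketch of an EMSQ item as data: a stem, ten or more candidate
# answers, and one keyed answer. Field names and the example question are
# invented for illustration, not taken from Blackboard or the slides.
emsq_item = {
    "stem": "What is the molarity of 5.85 g of NaCl dissolved in 1.0 L of water?",
    "options": [0.001, 0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 58.5, 100.0],
    "correct": 0.1,
}

assert len(emsq_item["options"]) >= 10, "an EMSQ needs ten or more options"
assert emsq_item["correct"] in emsq_item["options"]
```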

3 Previous Practice
Biological Sciences students at the University of Leicester are required to take several study skills modules. With ~200 students, modules designed to test and improve numeracy skills were delivered over the World Wide Web. Assessment involved data capture via textbox entry on web forms, and the resulting text file was marked and annotated using Microsoft Excel. Marks and automatically generated personalized comments were returned to students by email merge.

4 Blackboard VLE
Since 2003 the Blackboard virtual learning environment (VLE) has been used (www.blackboard.com). Previous web content was transferred directly to Blackboard. Delivery consists of a weekly lecture accompanied by online lecture notes. Weekly assessments were delivered and assessed by the Blackboard Assessment Manager and Gradebook tools.

5 Student Feedback
"Using Blackboard for assessments is quick and easy, lectures were interactive and stimulating."
"Blackboard is a great way of assessing progress every week, more modules should use this method."
"I like the fast feedback and the fact I could email any problems I had, and get feedback on that too."
"Blackboard is an excellent system which makes life easier!"

6 Results: MCQ (vs. MCQ)
Although the same question bank was used and the cohort was similar in terms of academic ability, the change to the VLE had no significant impact on the marks:

Topic                  Mean Mark % (WWW)   St Dev
Topic 1 (Algebra)      88 (87)             12
Topic 2 (Units)        63 (63)             18
Topic 3 (Molarities)   60 (60)             24
Topic 4 (Geometry)     53 (53)             24
Topic 5 (Logs)         77 (80)             22

7 Problems
Numerical answers entered via textboxes were assessed by the VLE using pattern matching. Previous experience indicated that students would enter a variety of answers due to rounding, the number of decimal places used, or formatting. Blackboard has no provision for accepting a range of numbers, e.g. for an answer of 2.51, accepting anything from 2.49 to 2.6.
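Blackboard's actual matching code is not shown in the slides; purely as a hypothetical illustration, the sketch below contrasts exact-string pattern matching with the kind of numeric tolerance check the slide asks for, using the 2.51 answer and the 2.49-2.6 acceptance range given above. The function names are assumptions.

```python
# Hypothetical illustration (not Blackboard's code): exact pattern matching
# rejects equivalent numeric answers, while a tolerance check accepts
# anything inside an allowed range (here 2.49-2.6, as on the slide).

def pattern_match(submitted: str, accepted_variants: list[str]) -> bool:
    """Correct only if the submitted string exactly matches a listed variant."""
    return submitted.strip() in accepted_variants

def tolerance_match(submitted: str, low: float, high: float) -> bool:
    """Correct if the submission parses as a number inside [low, high]."""
    try:
        value = float(submitted)
    except ValueError:
        return False
    return low <= value <= high

variants = ["2.51"]  # the only variant anticipated in advance
for answer in ["2.51", "2.5", "2.510", "2.6"]:
    print(answer,
          "pattern:", pattern_match(answer, variants),
          "tolerance:", tolerance_match(answer, 2.49, 2.6))
```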

8 Problems
All answer variants previously seen for the question bank and deemed acceptable were entered into Blackboard. Students were also given detailed instructions:
–Do not type anything except letters/numbers in the boxes, and a decimal point if necessary (NO SPACES).
–Use the same number of decimal places in your answer as are used in the question.
–Do not round your answers.
–Check that you have used the correct units (as indicated in the question).
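The slides do not show how the variant list was compiled; as a rough sketch, one could enumerate plausible formats of a single target value like this (the helper name and the chosen formats are assumptions):

```python
# Hypothetical sketch: enumerate plausible formats of one target value, to
# pre-load a pattern-matching answer key with acceptable variants.

def answer_variants(value: float, max_decimals: int = 4) -> set[str]:
    variants = {str(value)}                   # Python's default formatting
    for places in range(1, max_decimals + 1):
        variants.add(f"{value:.{places}f}")   # "2.5", "2.51", "2.510", "2.5100"
    variants.add(f"{value:.2e}")              # "2.51e+00" (scientific notation)
    return variants

print(sorted(answer_variants(2.51)))
```

As the next slide notes, a list built this way can never anticipate every technically correct form a student might type.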

9 Problems
In spite of the detailed instructions and the screening of previous answers, new answer variants arose frequently, reflecting the large number of technically correct ways an answer can be written. Technically acceptable answers that did not match any of the anticipated variants were therefore marked incorrect. This resulted in a loss of confidence in the software and formal complaints from students that the assessments were unfair.

10 Student Comments
"I didn't like Blackboard as it is not clear enough how to express an answer, e.g. should we express 100 as 100, or 1.00e2 etc."
"Very unsatisfied with the Blackboard marking system."
"Make clear the number of decimal places needed."
"When writing in numbers for exercises allow rounding up."
"Clearer instructions at the beginning of assessments would be helpful."
"Marking scheme not so restricted re. decimal places and rounding up."

11 Why not use MCQs?
The MCQ format is unsatisfactory for assessment of numeracy. Answering MCQs involves a fundamentally different thought process from entering a calculated number into a textbox: many students avoid the calculation and simply guess the answer by eliminating obviously wrong distractors, which removes the educational benefit of repeated practice calculations.

12 Methodology
In a subsequent module, calculated answers were assessed using an EMSQ format. This gives no clues to the correct answer and forces students to perform a calculation to at least estimate the correct solution. The same question bank was used as in the previous web-based format. The 2003 student cohort was similar to previous years in terms of academic ability (A level entry grades).

13 Example
[Screenshot of an example EMSQ; the percentages of students selecting the displayed options were 5%, 10%, 1% and 84%.]

14 Results: EMSQ (vs. MCQ)
The difference is not statistically significant due to the relatively small number of assessments. Student feedback was far more positive. Students expressed satisfaction with the Blackboard VLE and were confident of the validity of the assessment marking system.

Topic                     Mean Mark % (WWW)   St Dev
Topic 1 (Data Analysis)   85 (77)             14
Topic 2 (EDA)             93 (87)             14
Topic 3 (Correlation)     88 (75)             13
Topic 4 (Regression)      93 (89)             9
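The slides report only the topic means and standard deviations, not the underlying marks. Purely as an illustration (not the authors' analysis), a crude comparison of the four topic means against the earlier web-delivered marks could be run with SciPy, treating the two sets of means as independent samples; with only four values per group such a test has very little power, in line with the slide's comment.

```python
# Illustrative only: compare the four EMSQ topic means with the earlier
# web-delivered marks shown in parentheses on the slide. With n = 4 per
# group the test has very little power, consistent with the slide's comment.
from scipy import stats

emsq_means = [85, 93, 88, 93]  # Blackboard EMSQ mean marks (%)
www_means = [77, 87, 75, 89]   # previous web-based delivery (from the slide)

t_stat, p_value = stats.ttest_ind(emsq_means, www_means)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```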

15 Conclusions
EMSQs are an efficient method of assessing numerical ability in large groups of students. The suitability of EMSQs for on-screen reading has not presented any problems. It is much easier to construct a set of multiple distractors for numerical questions than for word-answer MCQs.

16 Conclusions
Distractors consist of a related numerical series spanning a wide range of answers, e.g.:
0.918, 0.818, …, –0.818, –0.918
1.023×10^9, 1.023×10^8, …, 1.023×10^-8, 1.023×10^-9
This effectively eliminates blind guessing, although it is sometimes possible for students to select the correct answer by estimation, a practice that we wish to encourage.
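The slides do not show how these series were generated; as a hypothetical sketch (the function name and exponent range are assumptions), an order-of-magnitude series around a correct mantissa could be built like this:

```python
# Hypothetical sketch: build a distractor series spanning many orders of
# magnitude around one mantissa, in the spirit of the 1.023x10^9 ...
# 1.023x10^-9 example on the slide.

def distractor_series(mantissa: float, min_exp: int = -9, max_exp: int = 9) -> list[float]:
    """Return the same mantissa at every power of ten from max_exp down to min_exp."""
    return [mantissa * 10 ** exp for exp in range(max_exp, min_exp - 1, -1)]

options = distractor_series(1.023)
print(len(options), "options:", options[:2], "...", options[-2:])
```

Because adjacent options differ by a factor of ten, only an estimate of the right order of magnitude, rather than blind guessing, can identify the correct answer.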

17 Summary

Assessment of Numeracy
–Textbox: Excellent. No cues, calculation is required.
–MCQ: Poor. Cues from distractor range, encourages guessing.
–EMSQ: Good. Few cues, discourages guessing, encourages calculation.

Development of Knowledge
–Textbox: Excellent. Reinforcement of numerical ability through repeated calculations.
–MCQ: Poor. Encourages completion of assessment with minimal calculation.
–EMSQ: Good. Encourages calculation, reinforcing numerical ability.

Reliability of Assessment
–Textbox: Poor. Difficult or impossible to ensure students format answers "correctly" for automated marking. Marking failures cause loss of confidence.
–MCQ: Good. Possibility of wrongly formatted answers eliminated, but can give a misleading interpretation of numeracy skills.
–EMSQ: Excellent. Possibility of wrongly formatted answers eliminated.

18 alan.cann@leicester.ac.uk
Education costs money - ignorance costs more.

