© AJC 2004. 1/18 Extended Matching Sets Questions for Numeracy Assessments: A Case Study. Alan J. Cann, Department of Microbiology & Immunology, University of Leicester.



Extended Matching Sets Questions
Extended matching sets questions (EMSQs) are a form of multiple choice question (MCQ) consisting of a stem (a question or scenario) plus an extended list of possible answers. For this case study, a multiple choice question with ten or more alternative answers is considered to be an EMSQ. Use of the EMSQ format in online assessment of numeracy shows that properly constructed questions of this type can play a valuable role in the assessment of numerical ability.

Previous Practice
Biological Sciences students at the University of Leicester are required to take several study skills modules. With ~200 students per cohort, the modules designed to test and improve numeracy skills were delivered over the world wide web. Assessment involved data capture via textbox entry on web forms; the resulting text file was marked and annotated using Microsoft Excel, and marks plus automatically generated personalized comments were returned to students by mail merge.
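The marking-plus-personalized-comments step described above can be sketched as follows. This is an illustrative assumption, not the original Excel/mail-merge macros: the question names, answer key and comment texts are invented, and the exact-string comparison mirrors the limitation discussed later in this case study.

```python
# Hypothetical sketch of the earlier workflow: textbox answers captured
# from a web form are marked against a key, and a personalized comment
# is assembled for each student. All names and texts are invented.
KEY = {"q1": "2.51", "q2": "63"}
COMMENTS = {"q1": "Check your rearrangement of the equation.",
            "q2": "Remember to convert to SI units first."}

def mark(student: dict) -> tuple[int, str]:
    """Return (score, personalized comment) for one student's answers."""
    score, notes = 0, []
    for q, correct in KEY.items():
        # Note the exact string comparison - the source of the later problems.
        if student.get(q, "").strip() == correct:
            score += 1
        else:
            notes.append(COMMENTS[q])
    return score, " ".join(notes) or "All correct - well done."

print(mark({"q1": "2.51", "q2": "68"}))
```

A real mail merge would then substitute the score and comment into a per-student message.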

Blackboard VLE
Since 2003 the Blackboard virtual learning environment (VLE) has been used. Previous web content was transferred directly to Blackboard. Delivery consists of a weekly lecture accompanied by online lecture notes. Weekly assessments were delivered and marked using the Blackboard Assessment Manager and Gradebook tools.

Student Feedback
–"Using Blackboard for assessments is quick and easy; lectures were interactive and stimulating."
–"Blackboard is a great way of assessing progress every week; more modules should use this method."
–"I like the fast feedback and the fact I could [raise] any problems I had, and get feedback on that too."
–"Blackboard is an excellent system which makes life easier!"

Results: MCQ (vs. MCQ)
Although the same question bank was used and the cohort was similar in terms of academic ability, the change to the VLE had no significant impact on the marks:

Topic                 Mean Mark % (WWW)   St Dev
Topic 1 (Algebra)     88 (87)             12
Topic 2 (Units)       63 (63)             18
Topic 3 (Molarities)  60 (60)             24
Topic 4 (Geometry)    53 (53)             24
Topic 5 (Logs)        77 (80)             22

Problems
Numerical answers entered via textboxes were assessed by the VLE using pattern matching, and Blackboard made no provision for accepting a range of numbers (e.g. for an answer of 2.51, accepting values close to it). Previous experience indicated that students would enter a variety of answers due to rounding, the number of decimal places used, or formatting.

Problems
All previously seen answer variants deemed acceptable were entered into the question bank, and students were given detailed instructions:
–Do not type anything except letters/numbers in the boxes, and a decimal point if necessary (NO SPACES).
–Use the same number of decimal places in your answer as are used in the question.
–Do not round your answers.
–Check that you have used the correct units (as indicated in the question).

Problems
In spite of the detailed instructions and the screening of previous answers, new answer variants arose frequently, reflecting the many technically correct ways a number can be written. As a result, technically acceptable answers which did not match any of the anticipated variants were marked incorrect. This led to a loss of confidence in the software, and to formal complaints from students that the assessments were unfair.
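The difficulty described above is easy to demonstrate. The sketch below (an illustration, not Blackboard's actual marking code) contrasts string pattern matching against a list of anticipated variants with a numeric comparison using a tolerance; the tolerance value and function names are assumptions.

```python
# String matching accepts only the exact variants anticipated in advance;
# a numeric comparison accepts any technically correct way of writing
# the same value.

def pattern_match(submitted: str, accepted: list[str]) -> bool:
    """Blackboard-style matching: the text must equal an anticipated variant."""
    return submitted.strip() in accepted

def numeric_match(submitted: str, correct: float, tol: float = 0.01) -> bool:
    """Parse the answer as a number and accept anything within a tolerance."""
    try:
        value = float(submitted)
    except ValueError:
        return False  # not a number at all
    return abs(value - correct) <= tol

variants = ["2.51", "2.510", "0.251e1"]
print([pattern_match(v, ["2.51"]) for v in variants])  # [True, False, False]
print([numeric_match(v, 2.51) for v in variants])      # [True, True, True]
```

All three variants are the same number, but only the first survives pattern matching, which is exactly why new variants kept being marked wrong.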

Student Comments
–"I didn't like Blackboard as it is not clear enough how to express an answer, e.g. should we express 100 as 100, or 1.00e2, etc."
–"Very unsatisfied with the Blackboard marking system. Make clear the number of decimal places needed."
–"When writing in numbers for exercises, allow rounding up."
–"Clearer instructions at the beginning of assessments would be helpful."
–"Marking scheme not so restricted re. decimal places and rounding up."

Why not use MCQs?
The MCQ format is unsatisfactory for the assessment of numeracy. Answering MCQs involves a fundamentally different thought process from entering a calculated number into a textbox. Many students avoid the calculations and simply guess the answers by eliminating obviously wrong distractors, removing the educational benefit of repeated practice calculations.

Methodology
In a subsequent module, calculated answers were assessed using an EMSQ format. This gives no clues to the correct answer and forces students to perform a calculation, or at least to estimate the correct solution. The same question bank was used as in the previous web-based format, and the 2003 student cohort was similar to previous years in terms of academic ability (A-level entry grades).

Example
[Screenshot of an example EMSQ; the answer options are annotated with the percentage of students choosing each: 5%, 10%, 1% and 84%.]

Results: EMSQ (vs. MCQ)

Topic                    Mean Mark % (WWW)   St Dev
Topic 1 (Data Analysis)  85 (77)             14
Topic 2 (EDA)            93 (87)             14
Topic 3 (Correlation)    88 (75)             13
Topic 4 (Regression)     93 (89)             9

The difference is not statistically significant due to the relatively small number of assessments. Student feedback was far more positive: students expressed satisfaction with the Blackboard VLE and were confident of the validity of the assessment marking system.
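The "not statistically significant" claim can be checked from the four per-topic means reported on this slide. The sketch below runs an unpaired Welch t-test on those means using only the standard library; treating topic means as the unit of comparison is an assumption on my part, since the slides do not state how significance was assessed.

```python
# Welch's t statistic for the four EMSQ topic means vs the four earlier
# web-delivery means reported above. With only n = 4 per group, the
# statistic stays below the critical value, consistent with the slide.
from statistics import mean, stdev
from math import sqrt

emsq = [85, 93, 88, 93]  # Topics 1-4, EMSQ delivery
web = [77, 87, 75, 89]   # Same topics, earlier web delivery

n = len(emsq)
se = sqrt(stdev(emsq) ** 2 / n + stdev(web) ** 2 / n)
t = (mean(emsq) - mean(web)) / se
print(round(t, 2))  # 1.92, below the ~2.78 critical value (df = 4, two-tailed p = 0.05)
```

So although every topic mean rose, four data points are too few to call the improvement significant, which is what the slide reports.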

Conclusions
EMSQs are an efficient method of assessing numerical ability in large groups of students. The on-screen readability of EMSQs has not presented any problems. It is much easier to construct a large set of distractors for numerical questions than for word-answer MCQs.

Conclusions
Distractors consist of a related numerical series spanning a wide range of answers:
–0.918, 0.818, … −0.818, −0.918
–1.023×10^9, 1.023×10^8, … 1.023×10^-8, 1.023×10^-9
This effectively eliminates guessing. Sometimes it is possible for students to select the correct answer by estimation, a practice that we wish to encourage.
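A distractor series of the second kind above (the correct value surrounded by neighbouring powers of ten) is simple to generate automatically. The helper below is a sketch of that idea, assuming a hypothetical function name; it is not taken from the original materials.

```python
# Build an EMSQ option list: the correct answer plus distractors formed
# by scaling it up and down by successive powers of ten, so the options
# span a wide range and guessing conveys no advantage.
def numeric_distractors(correct: float, n: int = 10) -> list[float]:
    """Return n sorted options including `correct` itself."""
    options = {correct}
    k = 1
    while len(options) < n:
        options.add(correct * 10 ** k)
        options.add(correct / 10 ** k)
        k += 1
    # Trim any overshoot from the top; `correct` sits in the middle
    # of the sorted series, so it is always retained.
    return sorted(options)[:n]

print(numeric_distractors(1.023e-3, n=7))
```

A student who can estimate the order of magnitude of the answer can still pick the right option, which is the estimation behaviour the slide says is worth encouraging.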

Summary

Assessment of Numeracy:
–Textbox: Excellent. No cues; calculation is required.
–MCQ: Poor. Cues from the distractor range; encourages guessing.
–EMSQ: Good. Few cues; discourages guessing, encourages calculation.

Development of Knowledge:
–Textbox: Excellent. Reinforcement of numerical ability through repeated calculations.
–MCQ: Poor. Encourages completion of the assessment with minimal calculation.
–EMSQ: Good. Encourages calculation, reinforcing numerical ability.

Reliability of Assessment:
–Textbox: Poor. Difficult or impossible to ensure students format answers "correctly" for automated marking; marking failures cause loss of confidence.
–MCQ: Good. Possibility of wrongly formatted answers eliminated, but can give a misleading interpretation of numeracy skills.
–EMSQ: Excellent. Possibility of wrongly formatted answers eliminated.

Education costs money - ignorance costs more.