
Electronic Marking of Examinations Pete Thomas Open University


1 Electronic Marking of Examinations Pete Thomas Open University

2 The Context
1. Distance education – Open University, UK
2. On-line study
3. Improving assessment
4. Enhancing the student experience
5. Computing

3 Distance Education: OU Style
[Diagram: the Open University (HQ) linked to markers and multiple exam centres]
A typical Computing course has 2–4 thousand students.

4 Motivation The gap between the environment in which students learn and how they are assessed is widening … there is a radical discontinuity in their educational experience. Race et al. (1999)

5 Electronic Assessment Project Aims for Examinations
– To put the whole examination process on-line.
– To design an appropriate exam paper.
– To define the client and server functions.
– To investigate invigilation issues.
– To provide automatic marking of free text.
– To provide useful feedback on answers.
– To provide induction to the process.
– To provide support for examiners.

6 Marking Exam Scripts
The objects in the examination system:
– Questions and sub-questions
– Specimen (sample) solutions
– Mark schemes
– Rubrics (rules)
– Student answers
Marking consists of:
– Comparing an answer with a specimen solution and assigning a mark
– Applying the rubric
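
The two marking steps on this slide could be sketched as follows. This is an illustrative simplification, not the project's actual code: the class and function names are assumptions, and keyword spotting stands in for the real answer/solution comparison.

```python
# Illustrative sketch of the marking objects above; names and the
# keyword-matching comparison are assumptions, not the project's code.
from dataclasses import dataclass


@dataclass
class SpecimenSolution:
    keywords: set       # phrases an answer is expected to contain
    max_mark: float     # mark awarded for a complete match


def mark_answer(answer: str, specimen: SpecimenSolution) -> float:
    """Compare an answer with a specimen solution and assign a mark
    proportional to the expected phrases found (a crude stand-in for
    the real comparison)."""
    text = answer.lower()
    hits = sum(1 for kw in specimen.keywords if kw in text)
    return specimen.max_mark * hits / len(specimen.keywords)


def apply_rubric(sub_marks: list, cap: float) -> float:
    """Apply a simple rubric rule: total the sub-question marks,
    capped at the question's maximum."""
    return min(sum(sub_marks), cap)


specimen = SpecimenSolution({"bounds registers", "exception"}, 6.0)
m = mark_answer("An address outside the bounds registers raises an exception.",
                specimen)
print(m)                                 # 6.0 – both phrases found
print(apply_rubric([m, 4.5], cap=10.0))  # 10.0 – total capped at the maximum
```

In the real system the comparison is against a structured specimen solution rather than a flat keyword set, as the later slides on solution trees show.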

7 Issues
Problems with questions:
– Incomplete, erroneous or inconsistent data
– Asking for an inappropriate conclusion
– Ambiguity
Checking is used to remove defects. Errors come to light:
– When candidates read the question
– When examiners attempt to grade answers

8 Issues
Problems with answers:
– Understanding the question
– Poorly expressed ideas
– Lack of knowledge
– Poor language skills (spelling)
– Use of abbreviations of the student's own devising
– Lack of typing skills
Actions to address the problems:
– Standardization between markers

9 Issues
Problems with specimen solutions:
– Do not match the question (as perceived by candidates)
– Numerous acceptable alternatives
– Incomplete
– Incorrect
Actions to address the problems:
– Expect markers to use professional judgement
– Standardization between markers

10 Issues
Problems with mark schemes:
– Incorrect
– Inappropriate for the question as perceived by candidates
– Inappropriate for the difficulty of the question
Actions to address the problems:
– Revise in the light of experience
– Standardization between markers

11 Electronic Marking A specimen solution can consist of more than one solution. Each solution can have a different mark. A solution can be split into parts with each part having its own marks. In general, an electronic solution can be composed of a set of alternatives and each alternative is composed of parts.

12 Solution Representation
Suppose that the solution to a (trivial) question consists of the following three phrases:
– range memory locations
– bounds registers
– address outside range exception
[Solution tree: an AND node (3, 6.0) combines the three phrases; an OR node (2, 1, 3.0) offers "storage protection keys" as an alternative to "bounds registers"]
Synonyms? Thesaurus
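
A minimal executable reading of such a solution tree is sketched below. The slide does not spell out the semantics of the node parameters, so this sketch assumes an AND node awards its mark only when every child phrase (or a thesaurus synonym of it) appears in the answer, and an OR node awards the mark of its best-scoring alternative; the node format and the thesaurus entry are illustrative assumptions.

```python
# Sketch of an AND/OR solution tree with thesaurus lookup; the node
# encoding and the synonym entry are assumptions for illustration.
THESAURUS = {
    "bounds registers": {"storage protection keys"},  # assumed synonym pair
}


def phrase_matches(phrase, answer):
    """A phrase matches if it, or a thesaurus synonym, occurs in the answer."""
    candidates = {phrase} | THESAURUS.get(phrase, set())
    return any(c in answer for c in candidates)


def score(node, answer):
    """Recursively score an answer against a solution tree.
    ("AND", mark, [phrase, ...]) -> mark only if every phrase matches.
    ("OR", [subtree, ...])       -> mark of the best alternative."""
    if node[0] == "AND":
        _, mark, phrases = node
        return mark if all(phrase_matches(p, answer) for p in phrases) else 0.0
    _, subtrees = node  # "OR" node
    return max(score(t, answer) for t in subtrees)


solution = ("OR", [
    ("AND", 6.0, ["range memory locations", "bounds registers",
                  "address outside range exception"]),
    ("AND", 3.0, ["storage protection keys"]),
])

answer = ("the bounds registers give a range memory locations; "
          "an address outside range exception is raised")
print(score(solution, answer))  # 6.0 – the full three-phrase solution matched
```

The thesaurus lookup is what lets "storage protection keys" in an answer satisfy the "bounds registers" phrase, which is the role the slide's "Synonyms? Thesaurus" note points at.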

13 First Experiment
– 20 student scripts from a conventional written exam
– 3 independent human markers
[Table: marks allocated, marker averages and tool averages for questions 7 and 12(a)–(d)]

14 Second Experiment
– Electronic mock examination (3 hours)
– Full exam scripts (10 Part 1 and 3 Part 2 questions)
– 11 students
– 3 independent human markers
[Table: mean and standard deviation of the automatic score and the markers' mean score]
The automatic score is lower and has a smaller standard deviation.

15 Deficiencies
– Significant spelling errors in answers: 8
– Deficient solution trees (out of 39): 15
– Thesaurus deficiencies (synonyms): 12
– Deficient specimen solutions: 2
– Lexical deficiencies (abbreviations): 1
– Language parsing errors (software errors): 2
– Deficient questions: 2

16 Deficiency Reduction
– Examined one script taken at random and corrected/amended the solution trees and the thesaurus.
– Repeated the process with 3 more scripts.
[Table: mean marks for the automatic marker and the human marker]

17 Comparison

18 Correlations
[Table: Pearson correlations for Part 1, Part 2 and Total, between Tutor 1 & Tutor 2, Tutor 1 & Tutor 3, Tutor 2 & Tutor 3, and all tutors & the electronic marker]

19 Examiner Support

20 Thesaurus maintenance

21 Exam Building System

22 Further Experiments

23 Revised algorithm

24 Marking Diagrams (April 03)
– What features should a drawing tool provide?
– How familiar should students be with the tool prior to the examination?
– How should the tool be provided to students in order to be used under examination conditions?
– How should a diagram be represented for transmission to the server?
– How should a diagram be represented for grading purposes?
– How should a diagram be graded?

25 Exam Question
Use the drawing tool to draw a diagram that illustrates how the data hazard inherent in the execution of the pair of instructions
  ADD R2, R3, R1
  SUB R1, R5, R4
by a 4-stage pipeline can be overcome.

26 Specimen Solution & Drawing Tool

27 Associations & Constraint Multiset Grammars
Association  Link, Weight
where ( exists Box1, Box2:
        attached(Link.start, Box1.area) &
        attached(Link.end, Box2.area) )
{ Association.from = Box1 &
  Association.to = Box2 &
  Association.text = Link.text &
  Association.weight = Weight }
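
The association rule can be given an executable reading as follows. The class names mirror the grammar, but the geometry is an assumed simplification: a box's "area" is taken to be an axis-aligned rectangle, and "attached" to mean that a link endpoint lies inside it.

```python
# Executable reading of the CMG association rule: a Link whose endpoints
# lie inside two Boxes' areas yields an Association between those boxes.
# The rectangle geometry and containment test are assumptions.
from dataclasses import dataclass


@dataclass
class Box:
    name: str
    x1: float
    y1: float
    x2: float
    y2: float  # the box's area is the rectangle (x1, y1)–(x2, y2)

    def attached(self, point):
        """True if the point lies inside this box's area."""
        px, py = point
        return self.x1 <= px <= self.x2 and self.y1 <= py <= self.y2


@dataclass
class Link:
    start: tuple
    end: tuple
    text: str


def associations(links, boxes, weight=1.0):
    """For each link, find boxes attached to its start and end points and
    emit (from_box, to_box, text, weight) tuples, as the rule specifies."""
    out = []
    for link in links:
        for b1 in boxes:
            if not b1.attached(link.start):
                continue
            for b2 in boxes:
                if b2 is not b1 and b2.attached(link.end):
                    out.append((b1.name, b2.name, link.text, weight))
    return out


boxes = [Box("fetch", 0, 0, 10, 10), Box("decode", 20, 0, 30, 10)]
links = [Link((5, 5), (25, 5), "next stage")]
print(associations(links, boxes))  # [('fetch', 'decode', 'next stage', 1.0)]
```

Chaining such associations (each one's `to` box matching the next one's `from` box) is exactly what the pipeline rule on the following slide checks for.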

28 CMG for a pipeline
Pipeline  Association1, Association2, Association3
where ( Association1.to = Association2.from &
        Association2.to = Association3.from &
        Association1.from.text.string = fetch ADD &
        Association2.from.text.string = decode &
        Association3.from.text.string = execute &
        Association3.to.text.string = write R1 )
{ Pipeline.assoc1 = Association1 &
  Pipeline.assoc2 = Association2 &
  Pipeline.assoc3 = Association3 }

29 Initial Experiment
[Table: per-student scores from the human markers (average) and the diagram marking tool, with mean and standard deviation]

30 Thank You
