
Kadupitiya JCS, Dr. Surangika Ranathunga, Prof. Gihan Dias. Department of Computer Science and Engineering, University of Moratuwa, Sri Lanka.


1 Kadupitiya JCS, Dr. Surangika Ranathunga, Prof. Gihan Dias. Department of Computer Science and Engineering, University of Moratuwa, Sri Lanka

2 Introduction This work is part of an initiative to implement automatic grading of secondary-school-level mathematics answers. It focuses specifically on the automatic grading of multi-step answers to word-type problems written in Sinhala.

3 Motivation The O/L examination is a major turning point for a student; 40%-50% of students fail Mathematics. O/L Mathematics has been identified by the Ministry of Education as a subject needing "special attention", and "making Mathematics a favourite subject" was an educational goal for 2013.

4 Motivation Contd.

5 The most common way of preparing for an exam is working through questions. Students lack feedback because of large classroom sizes and the inability to afford individual tuition classes. Solution: a platform where students can have their solutions automatically assessed.

6 Focus of the research: 1. Assess answers (written in Sinhala) to word-type problems using a marking rubric. 2. Award full or partial credit to students. 3. Give feedback to students.

7 Word-type problems Answers to this type of question include short sentences and numerical expressions. Examples of such questions: percentage calculations, interest calculations, and measurement calculations (perimeter, area, volume).

8 Example question (values: Rs. 420 000 at 8%): = 420000*8/100 = 33600; = (420000 + 33600)*8/100 = 36288; = 420000 + 33600 + 36288 = 489888
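The arithmetic on this slide can be re-checked in a few lines. A minimal sketch, assuming the example is two years of compound interest on Rs. 420 000 at 8% per annum (the original question text did not survive the transcript); integer arithmetic keeps every step exact:

```python
# Re-checking the worked example's arithmetic. The compound-interest
# reading of the question is an assumption; only the numbers 420 000
# and 8% appear in the transcript.
principal = 420_000

year1_interest = principal * 8 // 100                     # 33 600
year2_interest = (principal + year1_interest) * 8 // 100  # 36 288
total = principal + year1_interest + year2_interest       # 489 888

print(year1_interest, year2_interest, total)
```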

9 Existing Work Overview of automatic grading. A. Bennett's categorization of automatic grading in mathematics: –Questions calling for equations or expressions, where the problem has a single mathematically correct answer that can take many different surface forms –Questions calling for one or more instances from a potentially open-ended set of numeric, symbolic, or graphical responses that meet a given set of conditions –Questions requiring the student to show the symbolic work leading to the final answer –Questions asking for a short text explanation

10 Existing Work Contd. Mathematical expression evaluation. A. Lan's Mathematical Language Processing (MLP): –Supports partial automatic grading of mathematical expressions –Uses a clustering-based approach built on the affinity propagation algorithm –Does not consider questions whose answers contain a natural-language phrase and an expression separated by an equals sign. M. Badger: a system implemented to support automatic grading of mathematics questions in the UK GCSE examination: –Uses a restricted interface to obtain the solution from the student in a step-wise manner –Focuses only on mathematical expressions and numerical calculations

11 Existing Work Contd. Existing auto-grading systems. M-rater: –The M-rater scoring engine is used for scoring open-ended mathematical responses such as mathematical expressions, equations and graphs –The system accepts only string or numeric responses via single or multiple text boxes. OpenMark: –Evaluates multi-step answers –Uses a CAS to evaluate each step against the correct answer. IMathAS: –Uses a computer algebra system –Evaluates the steps of an answer. None of these supports mathematical word-type problems.

12 Existing Work Contd. Computer representation of the marking rubric. A. Li's Mathematics Assessment Grid aims to integrate the mathematical question resources on the Internet into a very large, open, virtual question library: XML + MathML + SVG = Mathematics Assessment Markup Language (MAML). We have adopted their approach with the necessary changes.

13 Existing Work Contd. Short-sentence similarity, needed to validate the units of the student's final answer. A. Wael: string-based and corpus-based similarity –Natural-language preprocessing methods such as raw, stop, stem and stopstem –String similarity (Levenshtein, Jaro, Jaro-Winkler, N-gram, etc.) –Corpus-based similarity: Latent Semantic Analysis (LSA) –We have adapted these techniques for the Sinhala language. M. Mohler: semantic similarity –Text-to-text semantic similarity for automatic short-answer grading –Used knowledge-based and corpus-based measures with WordNet –A supervised approach improved their earlier WordNet-based results –We cannot use these approaches, as a Sinhala WordNet is still under development.

14 Methodology Expression evaluation. Data preparation. Computer representation of the marking rubric, question, unit categorization and the answer. Implementation: –Expression validation module –Units validation module –Automatic grading module

15 Data Preparation Interest-calculation question: 5 sub-questions. Percentage-calculation question: 5 sub-questions. 60 students (secondary school level), 300 answers per question, 600 answers in total. Tuning data set = 100 answers; testing data set = 500 answers.

16 Computer Representation of the Marking Rubric, Question, Unit Categorization and the Answer The documents used for manual grading are paper-based and thus cannot be used directly for automatic grading. We use an XML- and MathML-based representation of these documents.
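A minimal sketch of what such a machine-readable rubric might look like, parsed with Python's standard `xml.etree.ElementTree`. The element and attribute names (`rubric`, `step`, `marks`, and so on) are illustrative assumptions, not the actual MAML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML marking rubric for the interest example. The schema
# below is an illustration in the spirit of the MAML-style encoding the
# slide describes, not the real format used by the system.
rubric_xml = """
<rubric question="interest-1">
  <step id="1" marks="2">
    <expression>420000*8/100</expression>
    <value>33600</value>
  </step>
  <step id="2" marks="2">
    <expression>(420000+33600)*8/100</expression>
    <value>36288</value>
  </step>
  <final marks="1" unit="Rs.">
    <value>489888</value>
  </final>
</rubric>
"""

root = ET.fromstring(rubric_xml)
# Collect (id, marks, expected value) for each intermediate step.
steps = [(s.get("id"), int(s.get("marks")), int(s.find("value").text))
         for s in root.findall("step")]
total_marks = sum(m for _, m, _ in steps) + int(root.find("final").get("marks"))
print(steps, total_marks)
```

Once the rubric is in this form, each graded answer step can be matched against the corresponding `<step>` element rather than against a paper document.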

17 Implementation Overview of the system: expression validation module, units validation module, automatic grading module.

18 Expression Validation Module Students may write arbitrary expressions that nevertheless evaluate to the final answer. Example question: a person allocates 30% of Rs. 600000 for a stock purchase; prove that the amount allocated was Rs. 180000. Student answer: Allocation = 90000 + 90000 + 0*600000*0.3 = Rs. 180000
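The validation idea can be sketched as follows: rather than matching the expression's surface form, evaluate it and compare against the expected value. This sketch uses a small AST walker instead of `eval()` for safety; it illustrates the idea and is not the system's actual implementation:

```python
import ast
import operator

# Map AST operator nodes to the corresponding arithmetic functions.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def evaluate(expr: str) -> float:
    """Safely evaluate a plain arithmetic expression (+, -, *, /)."""
    def walk(node):
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported syntax")
    return walk(ast.parse(expr, mode="eval").body)

# The "random" student expression from the slide still validates,
# because only its value is compared with the expected Rs. 180000.
result = evaluate("90000 + 90000 + 0*600000*0.3")
print(result)  # 180000.0
```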

19 Expression Validation Module contd.

20 Units Validation Module To receive full marks for a question, the answer must be correct both numerically and unit-wise. Numerical grading is done by the automatic grading module, while unit checking is done by the units validation module. Partial marks are awarded for the final answer if both the numerical grading and unit validation modules return true. A Levenshtein (edit) distance based approach is used.
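A sketch of the edit-distance check: accept a unit string when its Levenshtein distance from the expected unit is within a threshold. The threshold value is an illustrative assumption (the real system operates on Sinhala unit strings, which Python's Unicode strings handle the same way):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def units_match(student_unit: str, expected_unit: str, max_dist: int = 1) -> bool:
    # max_dist=1 tolerates a single-character slip such as "Rs" vs "Rs.";
    # the actual tolerance used by the system is not stated on the slide.
    return levenshtein(student_unit.strip(), expected_unit.strip()) <= max_dist

print(units_match("Rs.", "Rs"), units_match("km", "Rs."))
```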

21 Units Validation Module contd.

22 Automatic Grading Module This is the main module of the implemented system. It incorporates the expression validation module to grade the expressions in the student's answer. According to the marking rubric, the system decides whether to award marks to an expression or a numeric answer. For final-answer grading, it checks only numerical-value equality and unit similarity.
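A hedged sketch of the grading decision described above: marks for a step are awarded only when the numeric value matches the rubric and, for the final answer, the unit check also passes. Function and parameter names are illustrative, not the system's actual API:

```python
def grade_step(student_value: float, expected_value: float,
               step_marks: int, unit_ok: bool = True,
               tol: float = 1e-9) -> int:
    """Award the step's marks only if the value matches the rubric
    (within a small tolerance) and any required unit check passed."""
    numeric_ok = abs(student_value - expected_value) <= tol
    return step_marks if (numeric_ok and unit_ok) else 0

# Grading the compound-interest example against the rubric values:
# two 2-mark intermediate steps plus a 1-mark final answer whose unit
# must also validate.
total = (grade_step(33600, 33600, 2)
         + grade_step(36288, 36288, 2)
         + grade_step(489888, 489888, 1, unit_ok=True))
print(total)
```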

23 Automatic Grading Module contd.

24 Results and discussion The grading module was tested with 500 answers to two word-type questions, each having five sub-questions. At the question level, the accuracy was 99%; at the sub-question level, it was 99.8%. (Figures: manual grading vs. automatic grading for questions 01 and 02.)

25 Results and discussion contd. Reason for the accuracy drop: a person spent 30% of Rs. 600000 to buy shares of an organization at Rs. 24 per share; the nominal value of a share is Rs. 25; show that the nominal value of the bought shares is Rs. 187500. Correct answer: number of shares = 180000/24 = 7500 shares; nominal value of the bought shares = 7500*25 = Rs. 187500. Ambiguous answer: value of a bought share = 180000/24 = Rs. 7500; nominal value of the bought shares = 7500 + 180000 = Rs. 187500. The ambiguous answer reaches the correct number through an incorrect step, which the grader cannot distinguish.

26 Future Work Complete the Sinhala phrase validation module. Identify the types of errors that students make. Give specific feedback on each error.

27 Thank You

28 Questions?

