WP5: Validation of developed functionality Report on the final Experiments Anne de Roeck Diane Evans James Gray The Open University - UK.



Outline Introduction Business hypotheses Student Scenario (QUIZ) Results Desktop Activity Results Tutor Scenario Results Final Conclusions

Overall Hypothesis The augmentation of e-learning systems (in this case ILIAS) with NLP and semantic web technologies increases the effectiveness of learning and teaching and, in particular, increases the effectiveness of teachers and learners in locating relevant learning objects in the context of learning-related tasks.

Introduction Changes were made to the methodology after the previous tests and review: measuring learning; performing quantitative tests across the scenario activities; defined hypotheses and formulated tests to measure them; introduced mitigation controls; also included value judgements as supporting analysis; restriction to 2 scenarios (1 student and 1 tutor); scenarios built on the experiences of previous trials; data captured electronically.

Student Hypotheses

Retrieval of Content (Student) 1.1 The addition of functionalities based on NLP increases the retrievability of learning objects in terms of the relevance of the content. [Quant test] 1.2 The addition of functionalities based on NLP increases the effectiveness of learners in locating relevant learning objects in the context of answering Quiz questions. [Quant test]

Support for Learning 2.1 The developed tools and ontology help learners to grasp more effectively the terminological and conceptual space which defines a certain domain of knowledge. [Quant test]

Multi-lingual study 3.1 The ability to easily retrieve content in more than one language supports the learning activities of students who are studying in a multi-lingual situation, e.g. where the language of study is not their native language. [Quant test]

Support for Learning Paths and Personalised Learning 4.1 Learners are able to build individual learning paths by entering key terms of concepts they need to learn. [Qual. test] 4.2 Learners are able to classify and order learning material (in the form of documents) they have placed within their personal desktop by using concepts and keywords linked to each document. [Qual. test]

Cont. 4.3 The facility for a learner to classify and order learning material supports self-guided learning. [Qual. test] 4.4 The facility for a learner to classify and order learning material supports self-guided learning by enabling the user to create a meaningful / linked path through the content. [not tested]

Basis of Student scenario The student scenario focussed on: demonstrating improvement in learning (as requested by the Reviewers); the LT4eL search functions, i.e. Semantic, Definition, Keyword and Ontology Browsing; multi-lingual search and retrieval associated with these search functions; the use of the Personal Desktop to support individual learning. The scenario centred on a multiple-choice QUIZ.

Structure and rationale of the student scenario

Target - 1st year undergraduate humanities students. Topics - Information exchange: protocols and markup languages. You will shortly be studying a short course covering 'Information Exchange on the Internet'. This is a subject on which you have little or no previous knowledge. Your tutor has prepared some research questions for you in the form of a quiz to enable you to gain a basic grounding in key aspects. Depending on your group you will either use internet facilities such as Google and Wikipedia, or you will have access to different ILIAS search facilities to assist your search, including the LT4eL searches and the structured concept browser.

Experiment design - students To assess whether technologies "increase the effectiveness of learning", students needed to do some learning: Learning experience - Quiz to encourage students to engage with learning objects.

Assessing the learning Assessment Activity Pre-Test, then Learning experience (quiz), then Assessment Activity Post-Test. The answers provided by the student at each of the 3 stages were marked.

Measuring the effect of our technology We introduced a control group with a different learning experience: TARGET - Learning experience (quiz) with ILIAS / LT4eL; CONTROL - Learning experience (quiz) with Internet / Google.

To compare NLP technologies with plain search in the same repository: TARGET - PreTest, Quiz Part 1 (ILIAS), Quiz Part 2 (LT4eL), PostTest; CONTROL - PreTest, Internet/Google, PostTest.

Controlling for question differences (eg is the post-test easier than the pre-test?): TARGET - one group takes Test A as pre-test, Question Set A in Quiz Part 1 (ILIAS), Question Set B in Quiz Part 2 (LT4eL), and Test B as post-test; the other group takes Test B, then Question Set B (ILIAS), Question Set A (LT4eL), and Test A. CONTROL - one group takes Test A, Question Set A (WWW), Question Set B (WWW), and Test B; the other takes Test B, Question Set B (WWW), Question Set A (WWW), and Test A.

Pre & Post-Tests Five questions testing understanding of relationships between concepts Pre-test and Post-test questions were paired, to test the same area Pre-test and Post-test questions were alternated in different groups to ensure fairness

Quiz Seven questions, with learning materials on the same topics as the test questions, designed to encourage engagement. Each question is followed by supplementary questions to ascertain: whether the student already knew the answer; the source document (including language); which tool (if any) helped find the answer. The quiz is time-limited. Question groups alternate between student groups.

Results of Student Experiments

Main Student Scenario Searching Learning Multilinguality

Student Scenario: Searching Hypothesis 1.1: The addition of functionalities based on NLP increases the retrievability of learning objects in terms of the relevance of their content. Hypothesis 1.2: The addition of functionalities based on NLP increases the effectiveness of learners in locating relevant learning objects in the context of answering Quiz questions.

Searching - Test 1 Test that mean scores in Part 2 of the quiz (LT4eL) are greater than mean scores in Part 1 of the quiz (plain LMS) for the target group.
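Since each student in the target group sits both halves of the quiz, the comparison just described is a paired one, so a paired t-test is a natural way to check it. A minimal sketch; the scores, sample size and scoring scale below are hypothetical illustrations, not the project's actual data:

```python
import math

# Hypothetical per-student quiz scores (same students in both halves,
# so the comparison is paired).
part1 = [3, 4, 2, 5, 3, 4, 2, 3]   # Quiz Part 1: plain LMS search
part2 = [4, 5, 3, 5, 4, 5, 3, 4]   # Quiz Part 2: LT4eL functionality

diffs = [b - a for a, b in zip(part1, part2)]
n = len(diffs)
mean_d = sum(diffs) / n                                    # mean gain
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)    # sample variance
t_stat = mean_d / math.sqrt(var_d / n)                     # paired t statistic
print(f"mean gain = {mean_d:.2f}, t = {t_stat:.2f}, df = {n - 1}")
```

A one-sided test of this t statistic on n - 1 degrees of freedom then gives the significance of the Part 2 gain.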

Test 1 results: Students scored higher in the quiz when they had access to LT4eL technologies to help them.

Was it the technology that helped? Question differences? No - rotating banks of questions. Order effect? We might expect students to be learning, and thus to achieve better results in the second half of the quiz. Test 1a: test by looking for improvement in the control group.

Test 1a results: Gains were achieved in both groups; the control group scored better in both quizzes.

Searching - Conclusions so far Students scored higher in the quiz when they had access to LT4eL technologies to help them. This is partly explained by an order effect - students do better in the second half of the quiz anyway. Students using the resources of the World Wide Web score higher than those restricted to learning objects within an LMS, but the LMS score improvement appears slightly better than the WWW score improvement.

Searching - Test 2 Test that for any given question in Part 2 of the quiz, students who choose to use LT4eL functionality score more highly for that question than those who do not use LT4eL functionality. Note: In Part 2 of the quiz, students are free to use whichever search method they feel will be most effective. This test is to assess whether those who use LT4eL methods score more highly than those who do not.

Test 2 results: Only three questions were answered better using LT4eL functionality.

Test 2a results.

Opinions (target groups only).

What did they dislike about Semantic Search? (verbatim comments)
It didn't return relevant results.
because it doesn't find what I am searching for
its too vague
i didn't use it a lot as the results were chaotic
i find it not much to the point for the types of research i usually do
It is a bit too much to offer this much search methods
the name semantic is confusing
i liked this type best.. it was the easier to find the relevant information

What did they dislike about the Concept Browser? (verbatim comments)
It didn't return relevant results
It was too slow for my part and did not give any additional value
it's a roundabout way of searching
was not eay to use it. maybe this was because i did not fully understand how the concept browser worked
I am not sure what the concept browser is
don't know what it is
for content questions this might be a relevant search method. However, less relevant when studying a language
I like that method the most but it wouldn't be useful in my studies - English philology
it helps a lot to understand given topic/term


Searching - Test 3 Test that for any given question in Part 2 of the quiz, students who choose to use LT4eL functionality and answer correctly do so more quickly than those who answer the question correctly using Full Text Search. Notes: In Part 2 of the quiz, students are free to use whichever search method they feel will be most effective. This test is to assess whether those who use LT4eL methods are faster than those who do not. Only students who searched for an answer and answered correctly are considered.

Test 3 result: Students answered faster using LT4eL technologies than using full text search.


Searching - conclusions Using Semantic Search, Concept Browser and Definition Finder to find answers, students score better than using plain text search. Students appreciate all the functionality, especially the Definition Finder. Students' ratings of Semantic Search and Concept Browser, though favourable, showed that these techniques were less familiar to them. Students answered faster using LT4eL technologies than using full text search; speed of searching seems to improve with practice.

Learning - Test 1 Hypothesis 2.1: The developed tools and ontology help learners to grasp more effectively the terminological and conceptual space which shapes a certain domain of knowledge. Test: Assess learning by testing that the mean score in the post-test is higher than the mean score in the pre-test.

Test 1 result: Learning has taken place. This validates an important part of the methodology.

Learning - Test 2 Test that the mean improvement in test scores for the target sample (using ILIAS and LT4eL) is higher than the mean improvement in test scores for the control group (using the Internet to find content). Note: This test is intended to show whether learning is more effective in the target sample than in the control sample.
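The target and control groups contain different students, so comparing their mean pre-to-post improvements calls for an independent two-sample test. A sketch using Welch's t statistic (which does not assume equal variances); the gain figures below are hypothetical illustrations, not the experiment's data:

```python
import math

def welch_t(x, y):
    """Welch's t statistic for two independent samples (unequal variances)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Hypothetical pre-to-post score gains per student
target_gain  = [1, 2, 0, 1, 2, 1]   # ILIAS + LT4eL
control_gain = [2, 1, 1, 2, 2, 1]   # Internet / Google
print(f"t = {welch_t(target_gain, control_gain):.2f}")
```

A t near zero (relative to the Welch-Satterthwaite degrees of freedom) would match the reported result that the difference in gains is not statistically significant.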

Test 2 result: Learning appears slightly greater in the control group, but they started from a slightly lower base; the difference is not statistically significant.

Learning - Test 3 Test whether students who use the LT4eL functionality in the LMS learn more than those who do not.

Test 3 result.

Learning - Conclusions Learning took place in the target and control groups. Students in the target group who used LT4eL functionality extensively learned more than those who stuck to plain text search.

Multilinguality Hypothesis: The ability to easily retrieve content in more than one language supports the learning activities of students who are studying in a multi-lingual situation. Notes: We considered test persons who are capable of reading and understanding documents in languages other than their native one. In a normal situation we would also expect students who are studying in a language other than their native one (exchange students) to make use of the multilingual facilities and find documents in their own language. For our tests this was, however, not replicable, as we would not have been able to ensure a relevant sample size.

Tests For the quiz: Finding documents - allow students to select the most appropriate documents to answer questions, irrespective of language; students will retrieve a greater linguistic diversity of documents. Getting correct answers - students who find documents in other languages will score highly.

Multilinguality - Test 1: Searching Test: Allow students in a quiz to select the most appropriate document to answer the questions, irrespective of language. We expect those using NLP-based functionalities to select a higher proportion of documents in a non-course language.
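A difference in the proportion of non-course-language documents retrieved by the two groups can be checked with a two-proportion z-test. A sketch with hypothetical retrieval counts (not the project's figures):

```python
import math

def two_prop_z(k1, n1, k2, n2):
    """z statistic for comparing two proportions, using the pooled
    standard error under the null hypothesis of equal proportions."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: non-course-language documents retrieved
# out of all documents retrieved, target group vs control group.
z = two_prop_z(40, 200, 22, 200)
print(f"z = {z:.2f}")
```

A z value above roughly 1.64 would support the one-sided claim that the target group retrieves a higher proportion of non-course-language documents.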

Test 1 result

Conclusions so far The target group retrieved more non-course language documents when they had access to LT4eL functionality, showing that multilingual access supported the learning activity. No significant increase in non-course language retrieval for the control group, suggesting this is not an order effect.

Multilinguality - Test 2: Getting the right answer Test that, of students who are studying in a non-native language, those who retrieve second-language documents in Quiz Part 2 (either target or control) score more highly for those questions than those who do not. Note: This is intended to show that multilingual retrieval is helpful in the context of answering quiz questions.

Test 2 result.

Those retrieving documents in a non-course language score statistically the same for those questions as those who retrieved documents in their main course language. These results depend on the quantity and quality of information in different languages available within the repository being searched (or across the Internet for control group students). It is possible that students retrieving a second-language document are less fluent in that language, and hence have more difficulty assimilating the information within the document. In 328 cases (48%) the language of retrieval was English. It is also possible that students only searched in a second language when they had failed to find the answer in their course language.

Multilinguality - Test 3 Does our functionality aid multilingual retrieval?

Test 3 result.

Student Scenario: Qualitative evaluation of multilingual search "Multilingual search would be useful for my studies"? 71% (157 students) STRONGLY AGREED; 12% (25 students) DISAGREED. Results were lower for students in countries where large collections of materials in their native language are available (EN, DE), or in countries where materials (at least for the particular domain of the test) are usually in English (NL).

Multilinguality conclusions Students retrieved more non-course language documents when they had access to LT4eL functionality, but this did not result in higher quiz scores. LT4eL functionality helped retrieval of multilingual content - particularly Concept Browser, Definition Finder and Semantic Search.

Overall conclusions - Students Using Semantic Search, Concept Browser and Definition Finder to find answers, students score better than using plain text search. Students most appreciate the Definition Finder; they found it harder to see the benefit to them of Semantic Search and Concept Browser. Students answered faster using LT4eL technologies than using full text search. Learning took place in the target and control groups. Students in the target group who used LT4eL functionality extensively learned more than those who stuck to plain text search. Students retrieved more non-course language documents when they had access to LT4eL functionality, but this did not result in higher quiz scores. LT4eL functionality helped retrieval of multilingual content - particularly Concept Browser, Definition Finder and Semantic Search.

Personal Desktop experiments

Student tasks required them to: create classifications and organise their content within these sections; use LT4eL functionality to locate other resources in their own and course languages, and organise these into the structure; evaluate the usefulness of these features for their study.

RESULTS Students were able to classify documents into topic sections within the personal desktop. Using the concepts and keywords linked to each document helped them do the classification. Students were able to order documents within the sections.

Student Scenario: Multilingual support with the Personal Desktop "Finding, classifying and ordering of multilingual documents within the personal desktop is possible using the NLP-based functionality"? 62% (10% strongly) AGREED that they were able to classify and order documents in more than one language based on key concepts; 75% (24% strongly) AGREED that they found documents in more than one language using concepts and topics.

Students agreed that: the features would help students organise their study (multiple course modules, individual directed study, revision, research); and that, having classified and ordered documents, they have a clearer understanding of the relationships between the topics.

Tutor Hypotheses

Support for Teaching in a multi-lingual context - finding resources The addition of functionalities based on NLP increases the effectiveness of tutors in identifying supplementary resources to support students who are studying in a multi-lingual context. [Quant test] The addition of functionalities based on NLP increases the effectiveness of tutors in identifying documents that are relevant, reliable, and appropriate for the level. [Quant test]

Content Preparation Using the Keyword Extractor based on NLP increases the quality and consistency of keywords assigned to repository content and helps tutors who are expected to perform this task. [Quant test] Using the Keyword Extractor based on NLP enables tutors to produce a set of keywords faster than those tutors who perform the task unsupported. [Quant test]

Basis of Tutor Scenario The tutor scenario focussed on: multi-lingual search; the LT4eL functions of Semantic search and ontology browsing; the Keyword Generator for new content - both in English and in the native language.

Tutor scenario design

The tutor scenario was based on the need to find content to support students' learning and to add additional content to the repository.

Tutor tasks Locate content from Internet resources and LMS resources. Group 1 - ILIAS; Group 2 - LT4eL.

Search Topics
Session 1 - Set A topics:
Topic: data formats on the Web. Description: Students will learn which markup languages are appropriate for holding data.
Topic: links in markup languages. Description: Students will be able to identify ways to construct links to internal and external resources.
Topic: file transfer and internet protocols. Description: Students will be able to identify which protocols are required to transfer files over the internet.
Session 2 - Set B topics:
Topic: picture formats on the Web. Description: Students will learn some of the different picture formats and how to select an appropriate one for different purposes.
Topic: tables in markup language. Description: Students will be able to identify how to present data in tabular form on web pages.
Topic: e-mail and internet protocols. Description: Students will be able to identify which protocols are used for sending and receiving e-mails.

Add New Content Group 1 - ILIAS; Group 2 - LT4eL.

Tutor Scenario - Results Searching Multilinguality Keyword addition

Hypothesis: The addition of functionalities based on NLP increases the effectiveness of tutors in identifying documents that are relevant, reliable, and appropriate for the level. Test within the same repository. Use tutors' own assessments of relevance, reliability and appropriateness for the level.

Results.

Searching - conclusion LT4eL functionality makes no difference to the tutors' assessments of the Relevance, Reliability or Level of the documents found when searching within the repository.

Hypothesis: The addition of functionalities based on NLP increases the effectiveness of tutors in identifying supplementary resources to support students who are studying in a multi-lingual context, i.e. tutors can find content in two languages.

In more detail... Tutors are better able to find second-language documents using LT4eL than using Internet search / plain ILIAS search in the same repository. Tutors find that locating second-language documents using LT4eL is relatively faster than using Internet searches / standard ILIAS. Tutors find that locating second-language documents using LT4eL is relatively easier than using Internet searches / standard ILIAS in the same repository.

Test 1: Tutors are better able to find second-language documents using LT4eL.

Test 2: Tutors find that locating second-language documents using LT4eL is relatively faster than using Internet searches / standard ILIAS.

Test 3: Tutors find that locating second-language documents using LT4eL is relatively easier than using Internet searches / standard ILIAS in the same repository.

Multilingual conclusions Multilingual searches are more successful on the Internet than within the repository. Within the repository, LT4eL functionality does not improve the ability to find multilingual content. Within the repository, when two documents are found, tutors using LT4eL are more likely to find both documents together (same speed, same ease).

Keywords - Test 1 Using the Keyword Extractor based on NLP increases the quality and consistency of keywords assigned to repository content and helps tutors who are expected to perform this task. Test: Tutors using the keyword generator produce a more consistent set of keywords than those who do not.
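One simple way to quantify the "consistency" of keyword assignment is the average pairwise Jaccard overlap between the keyword sets that different tutors assign to the same document. A sketch with hypothetical keyword sets; the measure, not the data, is the point:

```python
from itertools import combinations

def jaccard(a, b):
    """Jaccard overlap between two keyword sets (1.0 = identical)."""
    return len(a & b) / len(a | b)

def mean_pairwise_jaccard(keyword_sets):
    """Average agreement across all pairs of tutors' keyword sets."""
    pairs = list(combinations(keyword_sets, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical keyword sets assigned to one document by three tutors
with_extractor = [
    {"http", "ftp", "protocol", "markup"},
    {"http", "ftp", "protocol", "html"},
    {"http", "ftp", "protocol", "markup"},
]
manual = [
    {"internet", "files"},
    {"http", "web", "transfer"},
    {"protocol", "ftp", "downloads", "networking"},
]
print(f"extractor consistency: {mean_pairwise_jaccard(with_extractor):.2f}")
print(f"manual consistency:    {mean_pairwise_jaccard(manual):.2f}")
```

Higher mean overlap for the extractor-assisted sets would support the hypothesis that the tool makes keyword assignment more consistent.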

Test 1 results.

Keywords - Test 2 Using the Keyword Extractor based on NLP enables tutors to produce a set of keywords faster than those tutors who perform the task unsupported. Tutors assessed their own time to complete the exercise.

Test 2 results.

Keywords - Test 3 Using the keyword extractor gives: a more consistent set of keywords; faster results. But are tutors using the extractor just assigning fewer keywords?

Test 3 results.

Opinions (base = 23; base = 27).

Keywords - Conclusions Using the keyword extractor yields a much more consistent set of keywords than manual assignment. Using the keyword extractor, tutors assign keywords 30% faster than without it. About 90% of tutors said the keyword generator helped them.

Overall conclusions - Tutors Within the repository, LT4eL appears to make finding multilingual documents together slightly more likely (same speed, same ease). LT4eL functionality makes no difference to the Relevance, Reliability or Level of the documents found. Using the keyword extractor yields a much more consistent set of keywords than manual assignment. Using the keyword extractor, tutors assign keywords 30% faster than without it.

Evaluation - Conclusions

WP5 Participation in Experiments

Participants in Student Scenario 8 languages and 226 students, in four groups, from Bulgaria, the Czech Republic, Germany, Malta, the Netherlands, Poland, Portugal and Romania.

Participants in Student Personal Desktop Activity 8 languages and 120 students, from Bulgaria, the Czech Republic, Germany, Malta, the Netherlands, Poland, Portugal and Romania.

Participants in Tutor Scenario 7 languages and 71 tutors: Bulgaria 6, Czech Republic 10, Germany 11, Netherlands 11, Poland 11, Portugal 10, Romania 12; total 71.

Not a trivial task! 8 partners. Pre-test, post-test, Quiz-A, Quiz-B, Desktop questionnaire, Tutor questionnaires - all translated into host languages (and verbatim answers translated back to English). 152 data files to download, clean, and load into the database. The analysis database contains 18 tables, 585 different fields and 2,483 records; the total number of non-blank data items is 66,725. Over 6,500 lines of code to manipulate, condition, load and analyse that data.

Overall Hypothesis The augmentation of e-learning systems (in this case ILIAS) with NLP and semantic web technologies increases the effectiveness of learning and teaching and, in particular, increases the effectiveness of teachers and learners in locating relevant learning objects in the context of learning-related tasks.